US Attorney General William Barr last week asked Apple to provide access to two iPhones used by Saudi Air Force officer Mohammed Saeed Alshamrani, who shot and killed three people at a naval base in Pensacola, Florida, in December. Barr accused Apple of providing no “substantive assistance” to investigators trying to break into the phones.
Apple says it has already handed over all the data in its possession to the FBI. But the FBI also wants data on the phones that might not be in the cloud. Two days after Barr’s demand, President Donald Trump tweeted his frustration with the company: “We are helping Apple all of the time on TRADE and so many other issues, and yet they refuse to unlock phones used by killers, drug dealers and other violent criminal elements. They will have to step up to the plate and help our great Country, NOW!”
This is the second time in four years that Apple is at the centre of a battle that pits personal privacy against public safety. In 2015, the FBI went to court to force Apple to help it unlock an iPhone 5c that belonged to Syed Rizwan Farook, who, along with his wife Tashfeen Malik, killed 14 people and injured 22 others in a terrorist attack in San Bernardino, California.
What does the FBI want this time?
The FBI wants Apple’s help to unlock an iPhone 5 and an iPhone 7 that belonged to Alshamrani. The phones appear to have been damaged, and although the FBI got them working again, it has been unable to unlock them because of Apple’s encryption and enhanced security tools. As in 2015, unlocking them would require a “backdoor” into the phones that bypasses the encryption.
Apple insists it does not create back doors for anyone, including law enforcement agencies and its own employees, because these can be exploited easily and would compromise the security of all iPhone users. At the recent CES 2020 show in Las Vegas, Apple’s Senior Director of Global Privacy Jane Horvath defended the company’s position on encryption, saying iPhones contain a lot of personal data, on health and financial matters among other things, that needs to be protected in case a device is stolen.
Can an iPhone be unlocked without Apple’s help?
As has been demonstrated earlier, it is not impossible to break the encryption. Alshamrani’s phones were relatively old models, which, it is argued, can be opened by specialised cyber-security firms that offer their services to law enforcement agencies. The FBI ultimately got into the San Bernardino terrorist’s iPhone 5c with help from a third-party vendor that supplied the software, reportedly for $900,000.
It has been suggested that on this occasion too, the FBI might not need Apple’s help, especially if Alshamrani’s phones run an older (and therefore less secure) version of iOS. But the FBI has said in a statement that it turned to Apple only after exhausting all other options. Since the 2015 showdown, Apple has fixed security vulnerabilities or ‘bugs’ in its software, making it harder to ‘hack’ into a device.
However, forensic tools from companies such as Cellebrite and Grayshift can break into iPhones. Grayshift, the company behind the GrayKey tool, focuses exclusively on iPhones, and its technology is believed to have been used by the FBI in the past.
How did Apple react on the earlier occasion (2015)?
After Apple refused the help the FBI wanted, the agency obtained an order from a judge directing the company to provide “reasonable technical assistance” to the investigators. Apple was required to load a specific iOS recovery file onto the device so that the FBI could recover the passcode.
Apple CEO Tim Cook wrote a letter to all customers during the crisis, which was posted on the company’s website. Cook underlined in the letter that the order would have an impact far beyond the case at hand, and pointed out that, in essence, engineers who had worked to secure the device were now being told to undo those same protections. This would open a Pandora’s box, Cook said, because it would put consumers’ data at risk from cyber criminals.
Ultimately, Apple did not have to comply, and it had seemed unlikely to do so anyway. The FBI managed to open the iPhone 5c, and the government moved to have the order vacated, arguing it was no longer needed.
But why do tech companies not want to allow access to phones?
The issue is not just about a terrorist’s or a criminal’s phone, as Trump suggested on Twitter. For the tech companies, this is a much wider issue of privacy. Smartphones are today the most important device for many people, who store personal information, including financial details, on them. For tech companies, be it Apple with its iOS devices or Google with its Android ecosystem, it is critical to secure devices against any kind of vulnerability. That is why companies push out monthly updates to the software, or sometimes a quick security update when they discover a serious flaw.
No “back door” can be restricted only to law enforcement; it can be exploited by anyone, including hackers. If a key to open a device without a passcode is created, it can be easily replicated in the digital world by someone for whom the key was not originally intended.
The knowledge that their devices could be accessed by someone else would result in a major loss of consumer trust, which is not something tech companies want. Across the world, there is deepening concern over unauthorised surveillance of devices by state agencies.
The privacy vs encryption debate is playing out in India too. In several cases, law enforcement authorities have demanded that tech companies hand over consumers’ data. However, in end-to-end encrypted apps such as WhatsApp, data such as messages stay with the user and are not backed up to the company’s servers, so companies say they simply do not have the data the agencies are seeking. And as the Apple case shows, this tussle is unlikely to be resolved anytime soon.