President Trump is ratcheting up the pressure on Apple (AAPL) to unlock two iPhones used by the suspect in a shooting at the Pensacola Naval Air Station in December.
On Wednesday, Trump tweeted, "We are helping Apple all of the time on TRADE and so many other issues, and yet they refuse to unlock phones used by killers, drug dealers and other violent criminal elements."
The issue, however, is far more complicated than Apple simply unlocking the suspected shooter's iPhones. That’s because creating a so-called "backdoor" for a single iPhone instantly opens every other iPhone on Earth to the risk of attack. And even though Apple relies on Trump's tariff exceptions, the company is unlikely to change its mind.
Unlocking one phone unlocks them all
While Apple has refused to unlock the phones used by accused Pensacola shooter Mohammed Alshamrani, the company has said it has given the government access to Alshamrani's iCloud account and other documentation.
Why not unlock the iPhones, then? Because every iPhone runs Apple's iOS software. To unlock the phones used by Alshamrani, the company would have to purposely break iOS, creating a way to access all the data stored on the devices.
But since the iPhones used by Alshamrani are more or less the same as those owned by you or me, any exploit Apple creates to unlock his phones would work just as well on ours.
"It's similar to, you know, why don't we just make it so that every single combination lock in the world that's made, the police have a combination they can input to get themselves into any lock," explained Justin Cappos, professor of computer science and engineering at the NYU Tandon School of Engineering.
"And why don't they have that? Because as soon as criminals figure out how to use that, then you're in trouble."
That's where the idea of some kind of "backdoor" falls flat. No method that lets law enforcement into every iPhone, or into any class of device for that matter, would stay secret for long.
"Many people in the security community have expressed serious concerns about introducing a backdoor, as it is very difficult to monitor its use and contain the effects of any leaks — e.g., someone leaking access to the backdoor," explained Petros Efstathopoulos, global head of research, NortonLifeLock Research Group.
In the world of cybersecurity, criminals, hackers, and other malicious actors are constantly probing every piece of software they can find for weaknesses they can exploit to access your data or take over your device.
Companies like Apple are, at the same time, continuously working to patch any flaws they find in their software and harden their defenses. The iPhone is especially secure because Apple controls its software far more tightly than makers of, say, Android, macOS, or Windows devices do.
But because flesh-and-blood humans write the operating systems we use, errors are bound to creep in. Hackers then build viruses, malware, and other tools that exploit those errors to break into your computer or other devices.
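These mistakes can be vanishingly small. Apple itself shipped one in 2014, when a single duplicated "goto fail;" line in its code silently skipped security-certificate checks on iPhones and Macs. As a purely hypothetical illustration of how little it takes, here is a passcode retry limit in Python undone by one wrong comparison:

    MAX_ATTEMPTS = 10

    def check_passcode(entered: str, correct: str, attempts_so_far: int) -> bool:
        # Hypothetical bug: '>' should be '>='. The off-by-one quietly allows
        # an eleventh guess, and the same slip in a subtler spot could allow
        # far more. Errors like this are exactly what attackers hunt for.
        if attempts_so_far > MAX_ATTEMPTS:
            raise PermissionError("device locked")
        return entered == correct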
And no matter how hard tech companies work to keep their devices and software safe, they'll always be one step behind the bad guys.
Purposely introducing a weakness into Apple's iOS, then, would make it all the easier for hackers to break open any iPhone they want.
"Apple is being very smart to say we don't want our phones to be hackable," Cappos said. "We don't want our phones to be weak. We want to protect someone who blogs about what's happening in Hong Kong. We want to protect anyone who uses our phone to have the right to privacy."
There's also the question of how foreign governments could abuse iPhone access, Cappos said.
"If you really, really trust your government in the United States and whoever is in power, you may be okay with law enforcement having a way to do this. But how do you handle situations when the Chinese government is asking for this? How do you handle situations when the Iranian government is asking to use this backdoor?"
The government has lost the keys before
Trusting the U.S. government's ability to keep the keys to every iPhone on Earth safe isn't a wise move either, if past precedent is any guide.
Take, for instance, EternalBlue, an exploit the National Security Agency developed to gain access to and take remote control of Windows PCs.
But EternalBlue became public in 2017, when a group calling itself the Shadow Brokers stole it from the agency and released it to the world. From there, it was used to build the WannaCry ransomware and a slew of other malware that continues to wreak havoc across the globe.
Importantly, EternalBlue was a tool the NSA developed by taking advantage of a then-unknown weakness in Windows. Now imagine what would happen if Apple purposely created a weakness in iOS, and you begin to understand why it could be such a dangerous move.
Every hacker would instantly begin poking at the operating system in an effort to access that weakness and attack as many iPhones as possible.
"As demonstrated by many cases in the past, if confidential information — such [as] instructions or code allowing access to the backdoor mechanism — is leaked, then everyone using that particular phone brand may be attacked," explained Efstathopoulos.
"This has obvious legal implications as well as practical implications on people’s lives, including physical safety."
What's more, the government has already proven it can break into iPhones without Apple's help. After the company refused a 2016 court order to give the Department of Justice a means of unlocking an iPhone used by one of the San Bernardino shooters, the FBI announced it had broken into the phone using a third-party tool.
Apple has made significant security improvements to iOS since then, which likely closed the avenue of attack that tool used. But if the government could enlist a third party before, it stands to reason it could do so again.
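Tools of that kind reportedly work by sidestepping iOS's retry limits and then guessing passcodes at hardware speed. Apple's security documentation has put the hardware cost of each guess at roughly 80 milliseconds; a back-of-the-envelope calculation in Python using that figure, which should be treated as an estimate, shows why the retry limits, and passcode length, matter so much:

    # Rough worst-case time to try every passcode once the retry limits are
    # bypassed. The ~80 ms per-attempt cost is the figure Apple has cited
    # for its hardware key derivation; treat the results as estimates.
    SECONDS_PER_ATTEMPT = 0.08

    for digits in (4, 6):
        worst_case_seconds = 10 ** digits * SECONDS_PER_ATTEMPT
        print(f"{digits}-digit passcode: about {worst_case_seconds / 3600:.1f} "
              f"hours to try every combination")

    # Roughly 0.2 hours for 4 digits, 22 hours for 6. With the escalating
    # delays and optional 10-attempt erase left intact, the same search is
    # effectively impossible, which is why unlocking tools target those
    # protections first.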