Why the US government is no longer asking Apple to build backdoors for law enforcement agencies


For years, the U.S. government pressed Apple executives to create backdoors for law enforcement agencies. Apple publicly refused, arguing that any access created for law enforcement would quickly become a backdoor for cybercriminals and terrorists. In other words, strong security protects everyone.

But recently, the federal government has stopped calling for a way to bypass Apple's security. Why? Because it can already break in. iOS and Android security are not as strong as Apple and Google claim.

A team of cryptography researchers at Johns Hopkins University recently published a very detailed report on both mobile operating systems. It concluded that while both offer excellent security, that security does not extend far enough: anyone with the right tools can break in.

For corporate CIOs and CISOs, this means that extremely sensitive conversations on an employee's phone can realistically be captured by industrial spies or data thieves.

Let's take a closer look at what exactly is wrong, starting with what the Johns Hopkins report says about Apple's iOS.

“Apple advertises its widespread use of encryption to protect user data stored on devices. However, we found that a surprising amount of sensitive data held by built-in applications is protected with the weak AFU (After First Unlock) protection class, which does not evict decryption keys from memory when the device is locked. As a result, most of the sensitive user data from Apple's built-in applications can be accessed from a phone that is captured locked but powered on, using a logical extraction. We also found circumstantial evidence, in U.S. Department of Homeland Security procurement and investigative documents, that law enforcement now routinely uses these resident decryption keys to capture large amounts of sensitive data from locked phones.”
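The distinction the report draws can be sketched as a toy model (this is illustrative Python, not Apple's code; the class names and behavior are simplified assumptions): files in the Complete Protection class lose their key when the device locks, while AFU files keep theirs in memory, which is what forensic tools on a locked, powered-on device can exploit.

```python
# Toy model of two iOS file-protection classes (illustrative only).
# "CP"  = Complete Protection: the decryption key is evicted from memory on lock.
# "AFU" = After First Unlock: the key stays resident until the device powers off.

class ProtectedFile:
    def __init__(self, name, protection):
        self.name = name
        self.protection = protection  # "CP" or "AFU"
        self.key_in_memory = False

    def first_unlock(self):
        # Entering the passcode once makes keys for all classes available.
        self.key_in_memory = True

    def lock(self):
        if self.protection == "CP":
            self.key_in_memory = False  # CP evicts its key on lock
        # AFU keys remain in memory after lock.

    def readable_by_forensic_tool(self):
        # A tool running on a locked but powered-on device can use any key
        # that is still resident in memory.
        return self.key_in_memory

photo = ProtectedFile("photo.jpg", "CP")
message_db = ProtectedFile("sms.db", "AFU")
for f in (photo, message_db):
    f.first_unlock()
    f.lock()

print(photo.readable_by_forensic_tool())       # False
print(message_db.readable_by_forensic_tool())  # True
```

The point of the sketch: the AFU file is readable even though the device is "locked", because locking never removed its key from memory.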

That, arguably, is a limitation of the smartphone itself. So what about Apple's iCloud service?

“We also examined the current state of data protection in iCloud and found, surprisingly, that enabling these features transmits significant amounts of user data to Apple's servers in a form that can be accessed remotely by criminals who gain unauthorized access to a user's cloud account, and of course by a legitimate law enforcement agency with a subpoena. What is even more surprising is that several counterintuitive iCloud features exacerbate these weaknesses. For example, Apple advertises that the 'Messages in iCloud' feature uses an end-to-end-encrypted container, which Apple cannot access, to synchronize messages between devices. However, when iCloud Backup is also enabled, the decryption key for this container is uploaded to Apple's servers in a format Apple can access, and therefore so can a potential attacker or law enforcement. Similarly, we examined Apple's iCloud Backup design and found that device-specific file encryption keys are sent to Apple. These are the same keys the device uses to encrypt data on disk, so this transmission poses a risk if the device is later physically compromised.”
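The key-escrow problem described above can be shown in a few lines. This is a hypothetical model, not Apple's actual protocol, and the toy XOR cipher stands in for real encryption: with true end-to-end encryption the server holds only ciphertext, but once the container's key is also uploaded, anyone with server-side access can decrypt.

```python
# Sketch of why uploading a container's decryption key defeats end-to-end
# encryption (hypothetical model; real systems use AES, not XOR).
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy symmetric cipher for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

server = {}                  # everything the cloud provider can see in this model
key = os.urandom(16)         # container key, initially held only on the device
message = b"meet at 9pm"

# End-to-end encryption: the server stores ciphertext it cannot read.
server["messages_container"] = xor_cipher(message, key)
assert server["messages_container"] != message

# Enabling backup also escrows the container key in server-accessible form:
server["backup_key"] = key

# Now server-side access alone (an account intruder, or a subpoena) suffices.
recovered = xor_cipher(server["messages_container"], server["backup_key"])
print(recovered)  # b'meet at 9pm'
```

The "end-to-end" property evaporates the moment the key lands next to the ciphertext, which is exactly the interaction the report flags between Messages in iCloud and iCloud Backup.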

What about Apple's well-known Secure Enclave Processor (SEP)?

“iOS devices strictly limit passcode-guessing attacks with the assistance of a dedicated processor called the SEP. We examined the public record for evidence of how well this protection holds up and found that, as of 2018, a tool called GrayKey could mount passcode-guessing attacks on iPhones with the SEP enabled. To the best of our knowledge, this indicates that a software bypass of the SEP's guess limiting was practically available at that time.”
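To see why bypassing the SEP's guess limiting matters, consider the search space. A 6-digit passcode has only 1,000,000 possibilities; without hardware-enforced delays, a key derived from the passcode can be brute-forced in seconds. The sketch below uses a plain SHA-256 as a stand-in for the real derivation (an assumption for illustration; the real scheme also mixes in a hardware-bound secret the SEP never releases, which is why off-device cracking fails).

```python
# Brute-forcing a 6-digit passcode once rate limiting is out of the way
# (illustrative sketch; SHA-256 stands in for the real key derivation).
import hashlib

def derive_key(passcode):
    return hashlib.sha256(passcode.encode()).digest()

target = derive_key("492871")   # the user's (unknown) passcode

def brute_force(target_key):
    # Only 10**6 candidates: trivial without per-guess hardware delays.
    for n in range(1_000_000):
        guess = f"{n:06d}"
        if derive_key(guess) == target_key:
            return guess
    return None

print(brute_force(target))  # 492871
```

This is why the SEP enforces escalating delays and wipe-after-ten-failures policies, and why a practical software bypass of those limits, as GrayKey reportedly achieved, is so damaging.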

Now let's look at Android's security. First of all, Android's encryption protections turned out to be far weaker than Apple's.

“Like Apple's iOS, Google's Android provides encryption for user data stored on disk. However, Android's encryption mechanisms offer a lower level of protection. In particular, Android provides no protection equivalent to Apple's CP (Complete Protection) encryption class, which evicts decryption keys from memory as soon as the device is locked. As a result, Android decryption keys remain in memory at all times after the first unlock, leaving user data vulnerable to forensic data capture.”

What this research means for CIOs and CISOs is that neither Android nor Apple devices should be fully trusted. Whether the adversary is a criminal or a law enforcement agency, as long as they can physically access a device, you must assume they can access its user data as well. If well-funded corporate spies or cybercriminals target specific corporate executives, that could be a very big problem.
