U.S. Gov't Orders Apple to Backdoor iPhone

A letter you write is a physical thing, handled by the postal system. It comes into contact with few people on its route, and the post offices and transport network have enough physical security to protect the mail. Sorting through lots of physical letters to find the interesting ones is time-consuming and inefficient.

Electronic communication goes everywhere. The routes between sender and receiver are manifold, and only partially known or controlled by reliable operators. Innumerable people can access the digital packets, and the signals can be copied, registered, analysed, modified and stored, all without the sender and receiver really knowing. Sorting out the interesting bits is trivial. The only thing actually protecting this communication is the mathematics used to encrypt the digital packets so that only the intended receiver can open them.

The same applies to digital storage as to communication. If a device is lost or stolen, the only way to protect the information on it is through the proper use of encryption.

Mathematics does not bend to what is convenient. Either encryption and computer security are implemented properly and both the bad guys and the good guys are denied access, or they are implemented wrongly, and sooner or later everyone will have access.

For this problem, there really is no middle ground. We as society must make a choice.

Security and privacy, or none of it.

And make no mistake, if we choose the latter, once the Internet of Things catches on for real, we will all be in deep doo-doo.

I don't think encryption is the only way - my smartphone, for example, has an app whereby I can delete all of the data on it from my computer if I happen to lose it or have it stolen. That said, you've done a good job of explaining why this is fundamentally different, so thank you.
 
That is true. As long as a lost phone has a connection, it is possible for you to connect to it and wipe all personal data from it. This requires that it does have a connection though, and that it has enough battery for you to have time to do so.

But what do you think stops other people from connecting to your phone and wiping it just to harm you, or as a prank? Again, that is encryption. Which can either be implemented correctly, meaning only you are able to wipe it, or incorrectly, leaving open the possibility that eventually anyone can do so. ;)

Any good adversary who actually wants to take your phone and steal the data on it (instead of just stealing the data while it is still in your possession) will know to disconnect it from the network (turning it off, putting it in a Faraday cage, etc.).

I'm happy that my explanation has helped you to see a bigger part of the picture. The essence is really that, unlike the post office, there are no trusted parties in the whole digital system, except for perhaps your device and the recipient. :)
 
We've never had that for those communications in the past - if I write an incriminating letter to either of those people, the police are perfectly at liberty to read it, or decode it if I encrypt it. We don't see that as a fundamental breach of privacy, I don't think. There absolutely is a middle ground.

That is not true in every country. In Germany, the privacy of letters is a constitutionally protected right. There are exceptions, but the police are not at liberty to open a letter.

That is Apple's problem: They might be able to find a compromise in the USA, but they want to sell their phones globally and their customers in other countries are not as willing to give up their liberty.
 
I'm not seeing the huge issue with this. The FBI asked Apple to do the following:

http://www.bbc.com/news/technology-35601035

From what I have seen so far it sounds like the FBI is doing a standard document request through the proper channels during a criminal investigation that is a bit wonky because they are entering unfamiliar territory due in part to Apple deciding to make phones encrypted by default. This isn't the government wanting a backdoor installed on all phones so they can listen in whenever the urge gets them - this is part of a criminal investigation.

Part of me wonders how many people would be supporting Apple if the situation was reversed. That is, if a criminal case was opened up against an FBI agent who killed people and Apple was refusing to comply with court instructions to open up the Agent's phone which was believed to have relevant information on it.

So you honestly think the government is only going to use this backdoor on this one phone? If so, I got some land on the moon you might be interested in buying.
 
I don't think encryption is the only way - my smartphone, for example, has an app whereby I can delete all of the data on it from my computer if I happen to lose it or have it stolen. That said, you've done a good job of explaining why this is fundamentally different, so thank you.

This actually uses encryption!

When you send the signal to the phone, it just deletes the encryption keys on the phone, so the data becomes unreadable.

If your phone isn't encrypted, the delete signal just has the phone issue a "consider this storage as free" command to all the storage, which is how deletion works on non-encrypted devices. To actually securely delete data on non-encrypted devices, it needs to be overwritten. (Depending on level of security, with random data, and possibly multiple times.) Overwriting all the storage on a phone would generally take some hours. (And is tricky for technical reasons, I'm not sure phone storage actually supports it well, so it could take many hours.)
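The difference between the two delete paths described above can be sketched in a few lines. This is a toy model, not a real cipher or a real phone: the "keystream" is just SHA-256 in counter mode, chosen only to keep the sketch self-contained. The point it illustrates is that on an encrypted device, a remote wipe only has to destroy a 32-byte key rather than overwrite gigabytes of storage.

```python
# Toy illustration of "crypto-erase": if data is stored encrypted,
# destroying the (small) key makes the (large) ciphertext unreadable.
# The cipher here is a toy keystream built from SHA-256 -- NOT a real
# cipher; real devices use AES. Only the key-wipe logic matters here.
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    # XOR stream cipher: the same operation encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

class Storage:
    def __init__(self):
        self.key = secrets.token_bytes(32)   # per-device key, 32 bytes
        self.blob = b""

    def write(self, data: bytes):
        self.blob = encrypt(self.key, data)  # only ciphertext hits "disk"

    def read(self) -> bytes:
        return encrypt(self.key, self.blob)

    def remote_wipe(self):
        self.key = None                      # erase 32 bytes, not gigabytes

dev = Storage()
dev.write(b"contacts, photos, messages ...")
assert dev.read() == b"contacts, photos, messages ..."
dev.remote_wipe()
# The ciphertext is still physically present, but unreadable without the key.
assert dev.blob != b"contacts, photos, messages ..."
```

On an unencrypted device there is no key to destroy, which is why a quick wipe can only mark the storage as free, and a secure wipe has to overwrite everything.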

If you erase an unencrypted device, anyone who has it in their possession can just disassemble it, pop out the storage chips, and recover all the data by reading the storage chips directly.
 
So you honestly think the government is only going to use this backdoor on this one phone? If so, I got some land on the moon you might be interested in buying.
I would think there is a world of difference between asking a company to perform a specific action in compliance with a legal court order during a criminal investigation - where the company maintains control over the software and only needs to put it on one very specific phone - and fear-mongering that the government is now going to be snooping on everyone's phone.
To me it seems like the equivalent of saying the police's ability to search your house with a warrant is some sort of massive breach of privacy because "it sets the precedent that they can do it".
I agree that Apple creating software that allows the FBI to try multiple passcodes without erasing the device could lead to some privacy issues down the road, which is why it is important that the process by which the government does this stays above board in the courts, and not skulking around in secret tribunals as with drone strikes.
 
If you erase an unencrypted device, anyone who has it in their possession can just disassemble it, pop out the storage chips, and recover all the data by reading the storage chips directly.

You could do the same with an encrypted device and then try to brute-force it without iOS interfering. That the FBI is not doing that means either there is a problem with that or they are using this particular device as a pretense to grant themselves more power.
 
You could do the same with an encrypted device and then try to brute-force it without iOS interfering. That the FBI is not doing that means either there is a problem with that or they are using this particular device as a pretense to grant themselves more power.

Brute-forcing a PIN or user password via iOS is possible, brute-forcing an AES-256 key is impossible.
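The gap between those two brute-force targets is worth putting in numbers. The guess rate below is an illustrative assumption (deliberately absurdly generous), not a measured figure for any real hardware:

```python
# Back-of-the-envelope comparison: guessing a 4-digit PIN vs. an
# AES-256 key. The guess rate is an illustrative assumption.
pin_space = 10 ** 4            # 4-digit PIN: 10,000 possibilities
key_space = 2 ** 256           # AES-256 keyspace: ~1.2e77 keys

guesses_per_second = 10 ** 12  # a trillion guesses per second

pin_seconds = pin_space / guesses_per_second
key_years = key_space / guesses_per_second / (3600 * 24 * 365)

print(f"PIN exhausted in {pin_seconds:.0e} seconds")
print(f"AES-256 exhausted in {key_years:.1e} years")
```

Even at a trillion guesses per second, exhausting the AES-256 keyspace takes on the order of 10^57 years, incomprehensibly longer than the age of the universe, while the PIN falls in a fraction of a second. That is why the attack has to go through the PIN, and why iOS throttling and erase-after-ten-attempts matter so much.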
 
I would think there is a world of difference between asking a company to perform a specific action in compliance with a legal court order during a criminal investigation - where the company maintains control over the software and only needs to put it on one very specific phone - and fear-mongering that the government is now going to be snooping on everyone's phone.
To me it seems like the equivalent of saying the police's ability to search your house with a warrant is some sort of massive breach of privacy because "it sets the precedent that they can do it".
I agree that Apple creating software that allows the FBI to try multiple passcodes without erasing the device could lead to some privacy issues down the road, which is why it is important that the process by which the government does this stays above board in the courts, and not skulking around in secret tribunals as with drone strikes.

Sure, they only put it on one phone, but guess who has that phone? The government. I would imagine it wouldn't take them too long to analyze what Apple did to that one phone and replicate it for use with all phones with absolutely no one being the wiser.

The analogy of searching my house with a warrant also doesn't really hold up, because the police can't really do that without my knowledge. If they figure out how to crack iPhones, though, they can monitor any phone they want without the owner even knowing they are being monitored. Now I don't doubt this would mostly be used for criminal investigations, but even criminals still have a right to privacy. I think the only phone-related things the government should be allowed to pull are records that the phone company itself keeps. Authorities should not be allowed to access the actual content of the phone without the approval and consent of the owner of the phone.
 
Sure, they only put it on one phone, but guess who has that phone? The government.

No, the court order specifically allows for the phone to remain with Apple.

If they figure out how to crack iPhones though, they can monitor any phone they want without the owner of the phone even knowing they are being monitored.

That has nothing to do with this case.

However, authorities should not be allowed to access the actual content of the phone without the approval and consent of the owner of the phone.

In this case, the owner of the phone gave their approval and consent for the FBI to access it.


I support Apple, but when you try to support them without checking the basic facts of the case, it does neither you nor them any service.
 
No, the court order specifically allows for the phone to remain with Apple.

Aren't they going to have to take possession of it eventually though to get the information they want off of it?
 
I would think there is a world of difference between asking a company to perform a specific action in compliance with a legal court order during a criminal investigation - where the company maintains control over the software and only needs to put it on one very specific phone - and fear-mongering that the government is now going to be snooping on everyone's phone.
To me it seems like the equivalent of saying the police's ability to search your house with a warrant is some sort of massive breach of privacy because "it sets the precedent that they can do it".
I agree that Apple creating software that allows the FBI to try multiple passcodes without erasing the device could lead to some privacy issues down the road, which is why it is important that the process by which the government does this stays above board in the courts, and not skulking around in secret tribunals as with drone strikes.

They'll need to create that software version first, though, and test it to verify it does what it is supposed to do, meaning the software will certainly have to be present on more devices than just the specific phone in question. So even if this really is a one-off (unlikely), and the company along with all the involved engineers immediately forget or delete everything necessary to perform the task (somewhere close to impossible, I'd suppose), the chances that this hacked version persists inside the company and eventually makes its way outside are relatively high.

However, given that if this court order stands there will almost certainly be more of its kind, the company will not actually be able to delete everything, since it will have to anticipate performing the same task again; indeed, it would almost be expected to unlock any phone more quickly than in the initial case. Thus Apple would have to keep a permanent backdoor version at the ready in some fashion going forward, which almost certainly means that code will find its way outside the company one way or another. So creating a backdoor for this specific phone amounts to requiring Apple to produce a backdoor for all phones of the same kind, and most likely for all phones the company produces. Saying it can all be done inside the company merely shifts the blame for the eventual criminal misuse of the backdoor onto the company, for failing to protect its own code, instead of onto the state for requiring the backdoor to be implemented in the first place.
 
Brute-forcing a PIN or user password via iOS is possible, brute-forcing an AES-256 key is impossible.

iOS has to somehow generate the AES key from the PIN. If you have access to everything stored on the device and know the algorithm, there cannot be more possibilities for the AES key than there are for the PIN.
 
iOS has to somehow generate the AES key from the PIN. If you have access to everything stored on the device and know the algorithm, there cannot be more possibilities for the AES key than there are for the PIN.

It generates the AES key from the PIN in combination with a UID fused into the hardware (not the storage hardware; the specific location depends on the iPhone version). It's impossible to read the UID directly from either hardware or software; you only ever see the result of the decryption operation after inputting the PIN. If you pull the storage, there's no way to use the UID.

You can see details in the iOS Security Guide. They're really quite clever.

TPM devices on PCs (and Windows Phones) do the same thing - if you pull the storage from a device with a TPM chip, you can't brute-force the password, you have to attack the AES (or other cipher) keyspace directly.
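The effect of entangling the key with a hardware UID can be sketched as follows. This is not Apple's actual key derivation (the iOS Security Guide describes the real scheme); the KDF choice, salt usage, and iteration count here are illustrative assumptions, with iterations set deliberately low so the sketch runs quickly:

```python
# Sketch of why a fused hardware UID blocks offline PIN guessing.
# NOT Apple's actual algorithm; PBKDF2, the salt usage, and the
# iteration count are illustrative assumptions.
import hashlib
import secrets

DEVICE_UID = secrets.token_bytes(32)   # fused into the chip, never readable

def derive_key(pin: str) -> bytes:
    # The key depends on BOTH the PIN and the UID, so the derivation
    # can only be run on the original hardware.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), DEVICE_UID, 1_000)

real_key = derive_key("4951")

# An attacker who pulled the storage chips knows the algorithm and can
# enumerate every PIN -- but without the UID they derive the wrong keys.
attacker_uid = secrets.token_bytes(32)  # a guess; the real UID is unknowable

def attacker_derive(pin: str) -> bytes:
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), attacker_uid, 1_000)

assert all(attacker_derive(f"{p:04d}") != real_key for p in range(10_000))
```

So the small PIN space only helps an attacker who can run guesses through the original device, which is exactly the path iOS rate-limits and the court order asks Apple to open up.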
 
So it looks like there was an easier way to get the data the FBI is looking for, but it is the FBI themselves that screwed it up:

There might have been an easier way.

According to senior Apple executives on Friday, the FBI might have been able to obtain data from an iPhone 5C belonging to Syed Farook, one of the San Bernardino terrorists, by connecting it to a familiar Wi-Fi network and having it create a new backup on Apple's iCloud service.

The idea was foiled, the executives say, because the password to the terrorist's iCloud account was reset shortly after the FBI took possession of the phone. That meant iCloud and the iPhone couldn't recognize each other, the executives said.

The password reset is the newest wrinkle in the standoff between the government and Apple, which received a court order this week compelling it to create a custom version of its iOS operating system that bypasses security features on the iPhone. Apple rejected the order, saying it will fight the government's request -- all the way to the Supreme Court, if necessary -- because it means creating a "master key" for all phones that will undermine privacy and security.

If this is true, then Apple shouldn't be forced to compromise their company principles to cover for incompetent blunders the FBI made because it doesn't understand how Apple's technology works. It seems to me the FBI had its chance to get the information it was looking for and blew it, and is now trying to bully Apple to cover its mistake.

Also, Apple is clearly saying that what the government is asking them to do will compromise the security of every other Apple phone out there, which flies directly in the face of some of the comments made in this thread. And I am much more apt to believe Apple over random posters here on CFC when it comes to discussing the ins and outs of Apple's own tech.

Source: http://www.cnet.com/news/apple-says-investigators-ruined-most-promising-way-to-access-terrorist-data/#ftag=YHF65cbda0
 
It's silly to try to come up with a narrative of the FBI trying to "cover up their mistakes" when the obvious narrative of them trying to establish legal precedent using a case with the best possible optics for them is so much more straightforward. The iCloud reset is irrelevant to the legal matter, which is very likely why it wasn't mentioned in the court order. The FBI understands well enough how Apple's technology works - they receive assistance as required from Apple, and there's credible speculation that Apple assisted with the technical details of the court order so that the technical points would be sound, and the legal case could focus instead on the legal issues.

Bruce Schneier: "The order is pretty specific technically. This implies to me that what the FBI is asking for is technically possible, and even that Apple assisted in the wording so that the case could be about the legal issues and not the technical ones."

The legal precedent will compromise the security of every other Apple phone. I don't know whose comments you're referring to, but mine have all been accurate, and I haven't disputed anything Apple themselves have said, only people's mistaken interpretations of it.
 
It's silly to try to come up with a narrative of the FBI trying to "cover up their mistakes"

Not "cover up mistakes", but "cover mistakes". There is a difference between the two. No one is saying the FBI is trying to hide the fact that they screwed up early in the investigation, just that they want Apple to do them a solid to make up for the fact that they are essentially bumbling buffoons when it comes to gathering information from technology.

EDIT: And I'm mostly referring to Ajidica's comments, since those comments seem to imply a belief that only this one phone will be compromised if Apple complies, when Apple is repeatedly saying this is not the case; which is why they are refusing to comply.

As an aside: I kind of like that we are entering a world where private businesses are starting to become too big for governments to contain. With Apple being as large as they are, and the fact that they have more popular support right now than the government, there is a good chance they are going to win this fight and the government is going to be told to sit down and shut up. We need more of that in this world.
 
As an aside: I kind of like that we are entering a world where private businesses are starting to become too big for governments to contain. With Apple being as large as they are, and the fact that they have more popular support right now than the government, there is a good chance they are going to win this fight and the government is going to be told to sit down and shut up. We need more of that in this world.

It kind of scares me that this is happening. Fair Use is worth nothing unless YouTube helps protect it, and that to me is scary, since why would a company help protect your rights? I mean, Apple is doing that here, but that is not the trend.
 