U.S. Gov't Orders Apple to Backdoor iPhone

I'm not sure owning an iPhone poses the same risk to others as owning a gun does.
 
That's how deep the conspiracy goes. It's an advertising plot of the highest order.

Seriously, I hated them for being all chrome and deciding to change adapters with every new model so you have to have a box full of dongles to hook up anything to anything. Now I have to like them. Thanks Obama.
 
I'm not sure owning an iPhone poses the same risk to others as owning a gun does.

I was nearly run off the road by a moron texting on an iPhone yesterday while trying to drive their car. I was not nearly shot at the range that I was driving to while evading that moron.

And yes, anecdote =/= datum and all that. :twitch:
 
I'm not sure owning an iPhone poses the same risk to others as owning a gun does.

Ah, but for mental health background checks to work for gun purchases, the database would need to be set up for anyone who could buy a gun, i.e., everyone. Meaning the data would have to be collected for everyone in anticipation of a potential gun purchase.

Of course the database would have to be accessible to not only the government, but also for gun dealers and other private parties so they can conduct the database checks prior to sale.

A potential mental health database for gun purchasers is a much broader and more invasive collection of data than breaking into a cell phone, and the assessment of whether or not either is permissible in relation to the other should reflect that.

Then there's your thesis itself. In 2014, there were 11,208 firearm homicides in the US. The same year, there were 32,675 vehicular deaths. Half of all vehicular deaths involved cell phone use. So there were roughly 46% more vehicular deaths related to cell use than firearm homicides.
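Taking those figures at face value, the comparison is quick to check:

```python
# Quick check of the 2014 comparison above, using the figures as stated.
firearm_homicides = 11208
vehicular_deaths = 32675

cell_related = vehicular_deaths / 2            # "half involved cell phone use"
excess = cell_related / firearm_homicides - 1  # fraction above homicides

print(f"cell-related vehicular deaths: {cell_related:.0f}")  # 16338
print(f"excess over firearm homicides: {excess:.0%}")        # 46%
```

So cell-related vehicular deaths exceed firearm homicides by a bit under half, on those numbers.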
 
Why do they need the mental health record of everyone? Why not just get the mental health records of the people who try to buy a gun?
 
A potential mental health database for gun purchasers is a much broader and invasive collection of data than breaking into a cell phone, and the assessment of whether or not either is permissible in relation to each other should reflect that.

:rolleyes:

The government can already access that data. The government cannot already access the data on every single phone and electronic device.

And make no mistake, this case is not about a single phone. It's about establishing the precedent. The fact that it belongs to a terrorist is just the bologna they're wrapping the bitter pill in.
 
Why do they need the mental health record of everyone? Why not just get the mental health records of the people who try to buy a gun?

How would that work? How would you arrange that system in a manner that is both effective and respectful of privacy?

I'm not sure I see how to do it.

For example, say that bipolar disorder renders one incapable of getting a gun (just to pick one ailment). A person has bipolar disorder and has been treated for bipolar disorder for years. When she goes to buy a gun, how can it be determined that she should not be given one if you don't already have a mental health record for her? I can't imagine a means by which one could effectively enforce that requirement without previously having a mental health records database.
 
I'm not seeing the huge issue with this. The FBI asked Apple to do the following:
BBC with text of the court order in the link said:
The FBI wants Apple to alter what is known as a SIF - System Information File. In this context, the FBI is basically referring to the software that runs on the device. The FBI wants Apple to create a new SIF to place on Farook's iPhone that will allow it to carry out several functions normal iPhones do not allow.

The FBI wants to be able to:
1. Prevent the phone from erasing itself. If certain security settings are enabled, after 10 failed attempts at entering a passcode, an iPhone can erase the personal data on the device. The FBI doesn't want this to happen on Farook's phone.
2. Automate the process for trying out passcode combinations. Farook used a four-digit passcode, for which there are 10,000 possible combinations. The FBI doesn't want to have to guess them all manually, and so it wants Apple to allow the passcode to be tried electronically. This means the FBI could simply instruct a computer to try every passcode, something that would take just minutes, possibly seconds...
3. …and without unnecessary delay. The iPhone prevents you from entering a passcode for longer and longer periods of time each time you get it wrong. The FBI wants this barrier removed.
4. Control the process, but not know how it's done. This is an interesting line, as it suggests the FBI is willing to allow Apple to work on the phone at its own HQ, and in a way that doesn't risk the encryption software being released into the world.

As this row goes through the courts, expect that final element to be a key point the FBI makes - it will argue that the SIF will only work on Farook's phone, and will be known only by Apple, who could choose to destroy it.
http://www.bbc.com/news/technology-35601035
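The arithmetic behind items 2 and 3 is easy to sketch. A back-of-the-envelope estimate, where the attempt rate is an assumption (roughly the ~80 ms per-attempt key-derivation cost often cited for iPhone hardware), not a measured figure:

```python
# Rough estimate of brute-forcing a 4-digit passcode once the escalating
# delays (item 3) are removed and entry is electronic (item 2).
# The attempt rate is an assumption for illustration only.

attempts = 10 ** 4          # every 4-digit passcode, 0000-9999
rate_per_sec = 12.5         # assumed ~80 ms per attempt

worst_case = attempts / rate_per_sec      # passcode happens to be tried last
average = worst_case / 2                  # expected: found about halfway

print(f"worst case: {worst_case / 60:.1f} minutes")   # 13.3 minutes
print(f"average:    {average / 60:.1f} minutes")      # 6.7 minutes
```

So "just minutes" is about right for a four-digit code at that rate; "seconds" would need a faster rate, and a six-digit code would push the worst case to roughly 22 hours.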

From what I have seen so far, it sounds like the FBI is doing a standard document request through the proper channels during a criminal investigation. It's a bit wonky because they're entering unfamiliar territory, due in part to Apple's decision to make phones encrypted by default. This isn't the government wanting a backdoor installed on all phones so they can listen in whenever the urge strikes them - this is part of a criminal investigation.

Part of me wonders how many people would be supporting Apple if the situation were reversed. That is, if a criminal case was opened up against an FBI agent who killed people and Apple was refusing to comply with court instructions to open up the agent's phone, which was believed to have relevant information on it.
 
Because a terrorist who killed a dozen people is a more sympathetic person to support? Come on.
 
I'm not seeing the huge issue with this. The FBI asked Apple to do the following:

4. Control the process, but not know how it's done. This is an interesting line, as it suggests the FBI is willing to allow Apple to work on the phone at its own HQ, and in a way that doesn't risk the encryption software being released into the world.

As this row goes through the courts, expect that final element to be a key point the FBI makes - it will argue that the SIF will only work on Farook's phone, and will be known only by Apple, who could choose to destroy it.

The part about not risking "the encryption software being released into the world" is where the argument fails: the mere existence of this Apple-created malware would risk its release into the world. Once the FBI has successfully established the precedent, they will continue to use it in future cases. And then agencies in other countries will demand the same thing. So there would be no point for Apple in destroying that software (and destroying software is hard, anyway), because they would have to develop it again two weeks later.
 
Zelig said:
I've been singing Cook's praises for years, but boy, Steve Jobs vs. the US Government would be fun to watch.

Cook's/Apple's stance is made much easier by who their customer is - they've got essentially nothing in advertising, and only weak government ties.

I'm curious to see if Brad Smith says anything further than the RGS blurb.

Indeed, Steve Jobs vs. U.S. would be fun to watch, though I'm not sure he cared as much about user privacy as Cook does. And their customer base does help.

I've also gradually grown to like Apple more because, unlike Google or Microsoft, advertising is not a major source of their income. Google and Microsoft monetize their mobile operating systems in no small part via monetizing your data (completely, in Google's case?), whereas Apple makes its money by selling the hardware.

Brad Smith = Microsoft general counsel? I'm not 100% sure if I have that right, but he's an interesting one too if that's the one, given the Microsoft vs. United States case over the Irish data.

Either way, it seems that there is room here for competition to step in and steal the customers who care enough about security and privacy, if Apple is forced to go through with this.

Blackberry?

They actually are perhaps the biggest non-U.S.-based smartphone OS maker remaining. And BB10 is still up to date, even if they are dipping a toe into the Android waters. They may struggle to get regular everyday people to sign up in droves, but if Apple lost this case it could help Blackberry in the corporate sector. Whether that would be enough to sustain them, who knows.

Five and a half years ago, Nokia would've been in a good position to potentially benefit from this as well. But they sold their phone division to Microsoft, and their OS, Symbian, has been in stasis for several years. Its latest incarnation is actually a pretty good smartphone OS even today (the lack of software updates being the main problem), and does have encryption support built in (though I don't know the nitty-gritty of how it compares with Apple's). But if anyone wanted to try to revive it, it would likely take a major injection of capital, and buying Blackberry would likely be a more appealing option. Or trying the Blackphone route and making a highly secure variant of open-source Android.

And you're probably thinking of Nokia's entry-level models as being low quality; they did indeed make very cheap dumbphones. Their higher-end smartphones were very good quality however, just not particularly common in North America.
 
How would that work? How would you arrange that system in a manner that is both effective and respectful of privacy?

I'm not sure I see how to do it.

For example, say that bipolar disorder renders one incapable of getting a gun (just to pick one ailment). A person has bipolar disorder and has been treated for bipolar disorder for years. When she goes to buy a gun, how can it be determined that she should not be given one if you don't already have a mental health record for her? I can't imagine a means by which one could effectively enforce that requirement without previously having a mental health records database.

How about this: when someone wants to buy a gun, they have to submit a medical record for review. Doctors hold the medical records, patients request them, and then hand a copy over to whomever needs to review them. No need for a global public database that the government or anyone else can access.
 
The part about not risking "the encryption software being released into the world" is where the argument fails: the mere existence of this Apple-created malware would risk its release into the world. Once the FBI has successfully established the precedent, they will continue to use it in future cases. And then agencies in other countries will demand the same thing. So there would be no point for Apple in destroying that software (and destroying software is hard, anyway), because they would have to develop it again two weeks later.
My line of thinking is: would it be a better precedent for a company to refuse to carry out a lawful court order based on what could possibly happen if Apple's own security team messes up? Given that the FBI is going to Apple for this, I'm willing to bet their in-house computer experts aren't capable of doing it on their own.
 
Apple backdooring you say ? :lol:
 
My line of thinking is: would it be a better precedent for a company to refuse to carry out a lawful court order based on what could possibly happen if Apple's own security team messes up? Given that the FBI is going to Apple for this, I'm willing to bet their in-house computer experts aren't capable of doing it on their own.

The entire crux of this argument is about whether this is a lawful court order.
 
I've also gradually grown to like Apple more because, unlike Google or Microsoft, advertising is not a major source of their income. Google and Microsoft monetize their mobile operating systems in no small part via monetizing your data (completely, in Google's case?), whereas Apple makes its money by selling the hardware.

Brad Smith = Microsoft general counsel? I'm not 100% sure if I have that right, but he's an interesting one too if that's the one, given the Microsoft vs. United States case over the Irish data.

Well, MS doesn't really monetize their mobile OS at all, and their mobile OS isn't really a thing anymore; it's just Windows, with a special UI for small screens. Their recent strategy has been to slash and burn expenditures in order to minimize the loss they were incurring on every unit sold.

Yeah, former General Counsel, now Chief Legal Officer and President.

Blackberry?

They actually are perhaps the biggest non-U.S.-based smartphone OS maker remaining. And BB10 is still up to date, even if they are dipping a toe into the Android waters. They may struggle to get regular everyday people to sign up in droves, but if Apple lost this case it could help Blackberry in the corporate sector. Whether that would be enough to sustain them, who knows.

BB10 is dead; just last week they laid off several hundred more staff, in large part from BB10 engineering. BlackBerry has pretty much always acquiesced to government requests anyway, even when they were a major player.

Five and a half years ago, Nokia would've been in a good position to potentially benefit from this as well. But they sold their phone division to Microsoft, and their OS, Symbian, has been in stasis for several years. Its latest incarnation is actually a pretty good smartphone OS even today (the lack of software updates being the main problem), and does have encryption support built in (though I don't know the nitty-gritty of how it compares with Apple's). But if anyone wanted to try to revive it, it would likely take a major injection of capital, and buying Blackberry would likely be a more appealing option. Or trying the Blackphone route and making a highly secure variant of open-source Android.

The core of Symbian wasn't really workable as a modern OS. Sailfish OS is really the successor, and it's gone pretty much nowhere in the market.

My line of thinking is: would it be a better precedent for a company to refuse to carry out a lawful court order based on what could possibly happen if Apple's own security team messes up? Given that the FBI is going to Apple for this, I'm willing to bet their in-house computer experts aren't capable of doing it on their own.

The expert consensus is that the FBI could almost certainly do this on their own; they simply picked this case because homegrown terrorism makes for good optics with which to establish legal precedent.

One of any number of security experts I could link, Bruce Schneier: Why you should side with Apple, not the FBI, in the San Bernardino iPhone case

"There’s nothing preventing the FBI from writing that hacked software itself, aside from budget and manpower issues. There’s every reason to believe, in fact, that such hacked software has been written by intelligence organizations around the world."
 