Encryption
Both Sides of Their Mouth
So let me get this straight… The FBI, through the Department of Justice, is in court right now arguing that it may sometimes need access to information, and that Apple has made a…
deliberate marketing decision to engineer its products so the government cannot search them, even with a warrant
while at the same time telling automakers:
The FBI and NHTSA are warning the general public and manufacturers – of vehicles, vehicle components, and aftermarket devices – to maintain awareness of potential issues and cybersecurity threats related to connected vehicle technologies in modern vehicles
So, again, the FBI’s argument is “make it secure, but leave opportunities for us to get in.”
You can’t have both, kids. Interesting-er and interesting-er.
Richard Clarke Talks Encryption
Richard Clarke, former National Security Council official and security advisor to Presidents Clinton, Bush, and Obama, in an [interview with NPR][1]:
If I were in the job now, I would have simply told the FBI to call Fort Meade, the headquarters of the National Security Agency, and NSA would have solved this problem for them. They’re not as interested in solving the problem as they are in getting a legal precedent.
and later…
Every expert I know believes that NSA could crack this phone. They want the precedent that the government can compel a computer device manufacturer to allow the government in.
According to the guy who would know, the NSA has the ability to unlock the San Bernardino phone. I’ve always had a suspicion that was the case, but now it’s confirmed.
NPR has the whole transcript on the page, but I encourage listening to it to get some of the nuance.
[1]: http://www.npr.org/2016/03/14/470347719/encryption-and-privacy-are-larger-issues-than-fighting-terrorism-clarke-says "interview with NPR"
Apple’s FAQ on the FBI Request
This page is riddled with corporate-speak and has a bit of hyperbole, but it does try to lay out Apple’s case in the San Bernardino phone unlocking case.
Some highlights that stand out to me:
Second, the order would set a legal precedent that would expand the powers of the government and we simply don’t know where that would lead us.
and
The digital world is very different from the physical world. In the physical world you can destroy something and it’s gone. But in the digital world, the technique, once created, could be used over and over again, on any number of devices.
Law enforcement agents around the country have already said they have hundreds of iPhones they want Apple to unlock if the FBI wins this case.
and most importantly, emphasis mine:
We feel the best way forward would be for the government to withdraw its demands under the All Writs Act and, as some in Congress have proposed, form a commission or other panel of experts on intelligence, technology, and civil liberties to discuss the implications for law enforcement, national security, privacy, and personal freedoms. Apple would gladly participate in such an effort.
Apple is standing up to the FBI’s effort to use a 227-year-old law to unlock a smartphone in 2016. Obviously, technology has advanced a bit in the last couple of centuries, so new laws need to be written. I, for one, am happy to see this fight, but it’s not without danger.
The big issues, as I see them:
- There’s a big part of the population that will say Apple is helping the terrorists. It’s their right to say that, but I believe that the terrorists win when Americans voluntarily give up a part of their liberty, freedom, or privacy.
- There’s also the possibility that Apple will lose and have to open the phone anyway, which would not only look terrible from a PR perspective, but also set a dangerous precedent for government requests for corporate-sponsored malware. (This is, admittedly, a bit of hyperbole on my part, but I see any software written to bypass security as malware, no matter who writes it.)
- The worst-case scenario is that Apple wins this battle but loses the war for all of us. If the government can’t get into this phone, there is a slim possibility that laws could be enacted to outlaw unbreakable encryption, forcing some sort of back door into everything.
This is probably the biggest tech story of the year, and it’ll be interesting to see how it plays out and what its implications are for decades to come.
Tim Cook: A Dangerous Precedent
In an open letter on the Apple Web site, Tim Cook lays out his case against helping the government unlock an iPhone:
The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.
“No reasonable person would find that acceptable” bears repeating. The thought that any government would have a powerful surveillance tool and never use it is ludicrous. I’m sure the FBI said “just this once, AT&T” when they did the first wiretap, too.
The fact is, Apple can’t decrypt the phone. They stopped storing encryption keys on their servers years ago, just for this reason. If they don’t have the key, they can’t unlock the door. The FBI is requesting Apple build an all-new version of iOS that removes the limit on incorrect passcodes so the government can brute-force the phone by trying millions of passcode combinations. Currently, if an incorrect passcode is entered ten times in a row, the data is erased.
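To make the mechanics concrete, here is a minimal sketch, in Python, of why that ten-attempt erase limit defeats a brute-force attack, and why the custom firmware the FBI is asking for would not. The delay schedule and the per-guess key-derivation time below are assumptions for illustration, not Apple’s actual implementation.

```python
# Toy model of an iPhone passcode lockout policy. The lockout delays and the
# ~80 ms per-guess key-derivation cost are assumptions for illustration only.

WIPE_AFTER = 10  # documented behavior: ten wrong passcodes in a row erases the data
ATTEMPT_DELAYS = [0, 0, 0, 0, 60, 300, 900, 3600, 3600, 3600]  # assumed lockout, seconds

def brute_force(secret: str, limit_enabled: bool = True):
    """Try every 4-digit passcode against a simulated lockout policy."""
    elapsed = 0.0
    for attempt, guess in enumerate(f"{n:04d}" for n in range(10_000)):
        if limit_enabled:
            if attempt >= WIPE_AFTER:
                return "device wiped, data erased", elapsed
            elapsed += ATTEMPT_DELAYS[attempt]
        else:
            elapsed += 0.08  # roughly the per-guess key-derivation time, assumed
        if guess == secret:
            return f"unlocked with {guess}", elapsed
    return "exhausted all passcodes", elapsed

# With the limit on: the device wipes after ten guesses, however patient you are.
print(brute_force("7294", limit_enabled=True))
# With the limit removed (what the FBI wants): every 4-digit code falls in minutes.
print(brute_force("7294", limit_enabled=False))
```

Under those same assumptions, even a 6-digit passcode falls in roughly a day of automated guessing once the limit is gone; the erase-after-ten rule is what makes the encryption hold up against this kind of attack.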
Apple is challenging the order, and here’s hoping they win.
We are challenging the FBI’s demands with the deepest respect for American democracy and a love of our country. We believe it would be in the best interest of everyone to step back and consider the implications.