I know y’all will have strong opinions on this! Via the NYT:
SAN FRANCISCO — Apple said on Wednesday that it would oppose and challenge a federal court order to help the F.B.I. unlock an iPhone used by one of the two attackers who killed 14 people in San Bernardino, Calif., in December.
On Tuesday, in a significant victory for the government, Magistrate Judge Sheri Pym of the Federal District Court for the District of Central California ordered Apple to bypass security functions on an iPhone 5c used by Syed Rizwan Farook, who was killed by the police along with his wife, Tashfeen Malik, after they attacked Mr. Farook’s co-workers at a holiday gathering.
Judge Pym ordered Apple to build special software that would essentially act as a skeleton key capable of unlocking the phone.
[snip] The F.B.I. said that its experts had been unable to access data on Mr. Farook’s iPhone, and that only Apple could bypass its security features. F.B.I. experts have said they risk losing the data permanently after 10 failed attempts to enter the password because of the phone’s security features.
[snip] [Apple CEO] Cook said the order would amount to creating a “back door” to bypass Apple’s strong encryption standards — “something we simply do not have, and something we consider too dangerous to create.”
Apple says creating software that would unlock Farook’s phone would compromise every iPhone’s security. Plus they make the point that other governments (e.g., China) could order them to bypass security as well, and then where would it end.
I don’t know enough about encryption, etc., to know whether or not that’s bullshit. But Apple’s objection seems reasonable to me. What say you?
kindness
I’m on Apple’s side here. No way the Feds take that software and only use it on 1 phone.
Corner Stone
If Apple complies their stock price is going to get bashed.
I'mNotSureWhoIWantToBeYet
I think Apple will lose on this.
It used to be that all encryption had to be inspected by the US government before an export license would be issued.
Everything else in our lives can be inspected if the police get a warrant. Why should our smart phones be any different?
Yeah, it sucks in many ways, but I don’t think Apple will win this.
My $0.02.
Cheers,
Scott.
singfoom
Apple is absolutely right in this case. Breaking the encryption on one phone = breaking the encryption on all phones or making a tool that can do so.
The WH spox suggested “it’s just one phone”, but that’s bullshit and just displays a level of ignorance around the technology.
fuckwit
The business of America is business. Money talks, bullshit walks. Whatever business wants, business gets.
Business wants strong encryption and tight security. They’ll get it. Fuck anyone else.
The mighty Market has spoken. The feds can pound sand.
What I’m more interested in is how, exactly, such an exploit (and it is an exploit!) would work. How can Apple bypass encryption if they don’t have the keys? What exactly would this backdoor be? Why not just jailbreak the phone? And, most importantly, I cannot believe that the NSA doesn’t already have this capability anyway.
Tom Q
Okay, I’m no doubt exposing myself to ridicule for technological ignorance, but…can’t Apple perform the act of unlocking the security and give it to the Feds without letting them see how they did it?
Assuming that’s a ludicrous proposition…this would be one of those cases that law professors love to throw at their students. Because, truly, each side has major compelling interest at stake here.
Brent
I am all for privacy, but this takes it a bit further than I think is necessary. That is, I don’t believe that in order to protect us from capricious or arbitrary government access to our files, we need to make it impossible to access any files, under any circumstances, ever. In cases where there really is a valid state interest in knowing what’s on a device, established by court order, I would think there really should be some technical method to fulfill that order. I don’t see why it needs to go any further than that. In other words, I don’t really buy the slippery slope argument Apple seems to be advancing here. But I freely admit that there may be technical aspects of this that elude me at the moment.
jeffreyw
They want Apple to make them a hammer “to just drive that one nail”. Methinks the Feds will look around later and see quite a few nails in need of hammering.
Felonius Monk
Apple will probably lose on this and there will be not one useful piece of information on that phone, but then the genie is out of the bottle.
stacy
Well Morning Joe went on an extended rant against Apple so it must be a reasonable objection.
Doc Sportello
If the FBI breaks Apple, then turrorists can use any of the hundreds of encryption options available overseas (and beyond the scope of American courts). FBI is being impressively short-sighted here.
singfoom
@I’mNotSureWhoIWantToBeYet: Sure, searches with warrants are great. I’m supportive in general of useful searches.
In this case, the government wants Apple to produce a version of iOS with a specific security feature disabled: the one that wipes the phone after too many failed unlock attempts. This is so they can brute force their way past the 4-digit passcode.
This is not analogous to a pinpointed warrant search, because that software could be used to unlock any other phone / expose it to brute forcing. I think Apple will win this case. There’s no way for Apple to decrypt this phone without the techniques / software they write for this specific phone also being applicable to the general case….
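To see why the four-digit keyspace singfoom describes falls over the moment the wipe protection is gone, here is a minimal sketch. The `try_passcode` callback is hypothetical, standing in for whatever unlock interface a modified firmware would expose; only 10,000 guesses exist in total.

```python
def brute_force_4_digit(try_passcode):
    """Try every 4-digit passcode in order; the whole space is 10,000 codes.

    try_passcode is a hypothetical callable returning True on a correct guess,
    standing in for whatever interface weakened firmware would expose.
    """
    for n in range(10_000):
        code = f"{n:04d}"          # "0000" through "9999"
        if try_passcode(code):
            return code
    return None


if __name__ == "__main__":
    # Toy demonstration against a fake phone whose code is "7294".
    secret = "7294"
    print(brute_force_4_digit(lambda c: c == secret))  # prints 7294
```

With no retry limit and no artificial delay, exhausting this loop is effectively instantaneous; the wipe-after-10 feature is the only thing that makes a short numeric code defensible.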
Gin & Tonic
@I’mNotSureWhoIWantToBeYet: “Used to be” is the operative term. Then source code was printed in an OCR-able font in a book, and ran smack up against the 1st Amendment, and it all fell apart.
I actually still have that t-shirt that’s pictured in your Wiki article, BTW.
Gin & Tonic
@stacy: Good way for the Republicans to get more of the youth vote, by attacking Apple.
I'mNotSureWhoIWantToBeYet
@Tom Q: IIRC, Apple argued that they couldn’t break the encryption, so there was no use in anyone asking for a key – it couldn’t be done.
I don’t think that’s the case, but I’m no expert on this stuff. I’m just relaying what I recall about Apple’s arguments before releasing this feature.
Cheers,
Scott.
singfoom
@fuckwit: See my comment at #12. The feds just want Apple to remove the “too many wrong attempts = wiped phone” security feature so that the feds can brute force (guess ALL the possible combinations of) the unlock code.
But you’re right, the markets have spoken and this won’t happen.
Mnemosyne
I honestly don’t have an answer because this is a serious ethical problem even beyond the legal one. Does someone have a right to destroy evidence of their criminal activities even after a proper and constitutional warrant is issued?
Since the perps are dead, I’m not sure I see the urgency of the FBI being able to access the phone. It seems unlikely at this point that they were anything other than poseurs who wanted a scary-sounding excuse for their otherwise textbook workplace murders.
Gin & Tonic
@singfoom:
It may have been longer. I had a long alphanumeric password on my iPhone 5, and have a longer one (and a TouchID) on my 6.
Arm The Homeless
It’s a cynical outlook, but isn’t this some mondo-primo advertisement for security features of iOS?
After the SF AT&T debacle during the Bush regime, I assumed the govt doesn’t even have to ask; these large tech companies are more than willing to play ball in hopes of keeping federal purchasing chugging along.
What it smells like, IMHO, is that Apple gets to play protector for PR purposes while backdoors were baked in from the beginning.
Maybe as a Millennial I just assume privacy and secrets are from a bygone era?
singfoom
@Gin & Tonic: Fair enough, I’m still using a 4S, so I was basing it on that. That said, the approach is still exactly the same, it’s just more potential guesses.
different-church-lady
Wasn’t it just two years ago we were told with ABSOLUTE CERTAINTY that the government already had all of our data?
Roger Moore
I’m not sure I understand it correctly because it hasn’t been reported very well, but I have a reasonable guess. What it boils down to is that the encryption on Apple’s phones may be great, but it has an inherent weakness: the passcodes are short, so it would be practical to go after them by brute force. To avoid that, Apple sets it up so the phone will only allow 10 tries before scrambling itself permanently; I assume this amounts to erasing critical key data so you simply can’t make any more attempts, counting on the encryption to protect the device from further attacks.
There might be something Apple could do to bypass the 10-try restriction, or to download the encrypted information before it’s irretrievably borked so that law enforcement could brute force it. I’m a little surprised that Apple hasn’t simply told the court that it is incapable of complying, rather than unwilling. It’s quite possible to design a device that would make it impossible for Apple to do what the court is asking even if they wanted to, and I would have expected them to design it that way specifically so they would have grounds to refuse when this kind of case cropped up. Going with “we don’t want to” rather than “we can’t, even though we want to help” makes me suspect that there really is something Apple could do.
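The retry-and-wipe mechanism Roger Moore is guessing at can be sketched in a few lines. This is emphatically not Apple’s implementation, just the general shape being described: count failures, impose escalating delays, and destroy the key material (not the data itself) after 10 misses. All names and delay values here are illustrative.

```python
class PasscodeGate:
    """Illustrative sketch of a wipe-after-N-failures passcode gate."""

    MAX_ATTEMPTS = 10

    def __init__(self, secret):
        self._secret = secret     # stands in for the real key-derivation check
        self._failures = 0
        self.wiped = False

    def delay_seconds(self):
        """Escalating lockout after repeated failures (made-up values)."""
        return 0 if self._failures < 5 else 60 * 2 ** (self._failures - 5)

    def try_unlock(self, guess):
        if self.wiped:
            raise RuntimeError("key material destroyed; data unrecoverable")
        if guess == self._secret:
            self._failures = 0
            return True
        self._failures += 1
        if self._failures >= self.MAX_ATTEMPTS:
            self._secret = None   # "scrambling": erase the key, not the data
            self.wiped = True
        return False
```

The point of the design is that brute force dies on the counter, not on the encryption. Removing the counter (which is what the order asks for) leaves only the short passcode standing between an attacker and the data.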
slag
Look, if the feds want to do this, they should be forced to get creative, independently of the phone’s developer. These kinds of constraints, if the FBI would embrace them, could do them some good. Make them a better organization.
MazeDancer
@fuckwit:
It does seem remarkable that the Feds can’t get data off a phone, or crack a password. Which makes about half the procedural dramas on TV devoid of facts. They’re always getting stuff off phones.
@I’mNotSureWhoIWantToBeYet:
There is a logic to that. But telling Apple to “build some special code” seems different than a perpetrator being forced to allow a search of the premises.
OTOH, if phones are exempt, and can’t be cracked, then doesn’t that equal a new protected place to hide illegal data?
Microsoft doesn’t get ordered to help hack people’s PC’s, do they?
daveNYC
Can’t the feds set up some number of virtual iPhones and then just hammer through the various password combinations on them without risking the actual physical phone?
I'mNotSureWhoIWantToBeYet
@singfoom: I think you’re arguing for a distinction that isn’t there.
Sure, if Apple really can unlock this particular phone then they’ll be able to unlock any other (similar) phone.
But they can only legally do it for the police if they are told to do so by a warrant. That’s the way it works for anyone served with a warrant to empty their paper file cabinets, too. What’s the difference between a phone and a file cabinet, other than maybe degree?
I don’t like the argument that a capability shouldn’t exist because the law can’t be trusted. If the law can’t be trusted, then we’re doomed in many, many other ways than the police being able to look at our phones…
FWIW.
Cheers,
Scott.
Mike J
Why would China wait for the US to ask for this capability? Can’t they make these demands on their own?
When it happens, Apple will have the choice of producing the software or being thrown out of the largest market on earth. I have no doubt what they will do.
different-church-lady
@singfoom: How do you get the custom iOS on to the phone without unlocking the phone first?
Billy K
Here’s the deal – once there’s a backdoor, that backdoor will be exploited by others who are not members of our trustworthy, good-guy government. Hackers will find it. Other nations will buy it. Terrorists will get it.
Aside from the privacy issue (which is really important, too!), this is the biggest point here, and I can’t believe our wise and gentle friends in the government can’t see it.
sharl
Meanwhile, in legal action involving the adorable Bundy clan – patriarch Cliven, to be specific – here are a couple of paragraphs from a filing by some AUSAs out West {via this tweet}:
[snip]
So take heart, city slickers: YOU TOO can ranch, the Cliven Bundy Way! It’s THAT EASY!!!!
Wvblueguy
This applies to everyone that makes any phone or OS for computers. Once law enforcement gets access to one type of device, they will have an open door to force any creator to give them access to a device. Apple, Google, and Microsoft have all created software that ensures your files are inviolate by hackers and governments. I’m sure everyone understands that any program created to give LEO access to an encrypted device will leak to the world literally overnight. Apple is absolutely right to fight this very important battle. I’m sure the tech world will totally support them on this.
Arm The Homeless
@fuckwit: If the NAND image (or whatever the equivalent is on iOS) is encrypted, jailbreaking won’t work, because the now-open system image still won’t be able to read that partition. That’s how it works on Android, at least.
catclub
@Doc Sportello:
I wonder what phone Richard Stallman uses.
Billy K
@Arm The Homeless: The kool-aid is good today, eh?
different-church-lady
@MazeDancer:
Gin & Tonic
@MazeDancer: I’m sure they do, and I’m sure they comply.
singfoom
@different-church-lady: That’s a good question. I don’t have a good answer. In general it does ask you for your unlock code before an update. Having jailbroken some of my older models, they might be able to flash the device with a custom iOS directly without having to unlock it.
pamelabrown53
@Tom Q:
I’m a technological dumb-ass. I, too, wonder why Apple doesn’t just hand over this person’s data since there’s a warrant. Is the federal government demanding their encryption codes for everyone?
Mnemosyne
@fuckwit:
After all of the serious security breaches from Home Depot to Target to Sony — all of which were made worse by corporate indifference — I’ll have to say that you have some facts not in evidence there. I haven’t seen many signs that businesses actually want tight security, just the illusion of it.
The Other Chuck
If Apple did this, there’s not a chance I will purchase another iPhone ever, and I’m far from alone on it. That’s why Apple won’t do it, and they’re right to refuse.
That said, I’m pretty amazed at the FBI for tipping its hand at how it lacks the capability. I guess budget cuts meant they can’t hire the best and brightest anymore.
coin operated
@Roger Moore: “What it boils down to is that the encryption on Apple’s phones may be great, but they have an inherent problem that the passwords are weak, so it would be practical to go after them by brute force. To avoid that, Apple sets it up so the phone will only allow 10 tries before scrambling the phone permanently”
hamletta
Y’all, read this article. It’s by a security analyst, writing for Macworld:
It really lays out the relevant background simply — so simply, even I could understand it!
Everything else I’ve read has been stupid and too focused on the evil shitbag from San Bernardino. It’s much bigger than that.
singfoom
@Arm The Homeless: Each company is different, but I think Apple has been against backdoors for a while. If the backdoors are already there in iOS, I don’t think the government would bring this lawsuit, but maybe this is all theater.
Mike J
@catclub: It runs elisp for a shell, and you dial with alt-meta-control 555-1212.
Easa Dara
I believe that the court has ordered Apple to remove the “erase all data after 10 failed attempts” feature so that the FBI can use the brute force method of breaking into the phone. That is, keep going through all possible options until the phone unlocks. Since there are 10 possible digits for each position, an n-digit code gives 10^n combinations: 10,000 for a four-digit code and 1,000,000 for a six-digit one. The larger point is that opening this up will make it difficult to ignore requests from other countries/regions (China, the EU) in the future. In a way, the system is analogous to a safe whose contents go up in flames (flash paper) upon unauthorized access, so the contents are lost.
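A quick back-of-the-envelope check of the numbers above: an n-digit numeric code gives 10^n possibilities, and the per-guess cost is dominated by the phone’s deliberately slow key derivation. The roughly 80 ms per attempt used here is a figure from contemporary reporting on iOS passcode hardware; treat it as an illustrative assumption, not a measurement.

```python
# Back-of-the-envelope: n-digit numeric passcode keyspace and worst-case
# time to exhaust it, assuming ~80 ms per attempt (illustrative figure).

ATTEMPT_SECONDS = 0.08  # assumed per-guess cost once retry limits are removed


def keyspace(n_digits):
    """Number of possible codes for an n-digit numeric passcode."""
    return 10 ** n_digits


def worst_case_hours(n_digits):
    """Hours to try every code at the assumed per-guess cost."""
    return keyspace(n_digits) * ATTEMPT_SECONDS / 3600


if __name__ == "__main__":
    for n in (4, 6, 8):
        print(f"{n}-digit code: {keyspace(n):>11,} combinations, "
              f"~{worst_case_hours(n):,.1f} h worst case")
```

Under that assumption, a four-digit code falls in under 15 minutes, a six-digit code in roughly a day, and an eight-digit code in about three months, which is why the length of the code matters far more than the strength of the cipher once the retry limit is gone.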
The Other Chuck
@pamelabrown53: Apple can’t access the data. They can hand the FBI an encrypted blob off iCloud, but that’s all they have access to unless they backdoor the endpoint. That’s what the FBI is demanding, and what Apple won’t do.
The Other Chuck
@Billy K:
Ted Cruz will get it.
Mike J
@MazeDancer:
The NSA is not building a case to take into a court and get a conviction. The FBI has to be concerned about HOW they get info. The NSA doesn’t.
singfoom
@I’mNotSureWhoIWantToBeYet: We can agree to disagree. There’s no way this software ONLY is used for the most noble of reasons by the government.
You can’t control software very well, once it gets out, it gets out everywhere and that’s the problem here. Once the US govt can hack your phone via the iOS back door, what about China?
What about ransomware based on the software? The possible downsides, security-wise, for Apple are endless.
Cheers
NonyNony
@Mnemosyne:
They want to know everything that they can about everything. Law enforcement pushes for maximal knowledge on this stuff because it makes it easier for them to do their jobs. In this case, the phone might hold voice mails, texts, contacts, and other information about people they were working with or people who were encouraging them to do it. If those folks exist, the Feds want to know about it.
@Tom Q:
No. You can’t unlock security on a phone that way. The Feds want to be able to try a massive password crack of the phone by brute force, but Apple’s OS locks them out and wipes the phone after 10 tries, which makes brute force cracking impossible except through the dumbest of dumb luck. Apple has no ability to crack passwords, but can[*] make a version of their OS that doesn’t lock them out after 10 tries. So the Feds want them to do that.
[*] The caveat here being that I don’t see how they can force the update onto the phone even if they write it. I would assume that they couldn’t because that by itself would be a security breach.
WJS
This is a business decision on the part of Apple, and it’s a smart one. The FBI will, through the courts, eventually get what they want and Apple will unlock the phone. However, Apple has to appear to be fighting back against this because a significant population that uses their phones believes that encryption actually keeps their bland little lives free of surveillance.
Eventually, they’re going to get what they want off that phone. They’re trying to get this done quickly and Apple has publicly said no. My question is, why aren’t they doing all of this behind the scenes, and quietly? Every major corporation in America negotiates with the government and hands things over, especially the big carriers who allow Apple to use their networks. Why the security theater here? Is this just a marketing ploy for Apple devices?
Roger Moore
@Brent:
It’s a technical problem rather than a legal one. The problem is that if Apple creates a backdoor (or master key, or whatever analogy you choose) that lets the government get at data when it needs to, it can’t guarantee that it will only be used that way. The weakness that lets them unlock the data will be there for anyone who can discover it- or bribe an Apple employee into disclosing it to them.
Bubblegum Tate
Question I raised in the previous thread that may or may not have an answer: Even if Apple’s engineers can and do write this version of iOS that the FBI wants, how, exactly are they supposed to load it onto a phone to which they don’t have access?
Arm The Homeless
@Billy K: I’m sorry to hear about your lost binky.
Now kindly self copulate, laterally, utilizing an oxidized arborist implement.
Gin & Tonic
@I’mNotSureWhoIWantToBeYet: If they create the exploit code, it will get out. It always does. Recently, an unauthorized back door was made public in firewall software from Juniper, a company whose entire business is network security. Without getting too technical, it exploited a weakness in certain random-number functions introduced by the NSA.
The Other Chuck
@WJS: My guess is the FBI asked quietly several times and Apple leaked it to drum up support. I’d say their tactic is working.
The Other Chuck
@Gin & Tonic: Was that the DRBG algorithm from RSA? Those guys are still feeling the monetary fallout of getting in bed with the NSA.
singfoom
@Bubblegum Tate: One would assume Apple and the FBI would meet in a secure location and load the iOS onto the device in a manner similar to jailbreaking the phones… i.e. a USB key direct connection or something along those lines…
Betty Cracker
@Gin & Tonic: Yeah, my 5S has a six-digit code plus touch ID.
Mnemosyne
@different-church-lady:
I admit, that’s what I was thinking, too.
catclub
@Bubblegum Tate: That certainly sounds like a problem. I guess it depends on what a phone does when it is turned on but not yet logged into by the owner. What services are running? Can it receive calls? Or do the cell tower handshake? Does that handshake allow for other things?
Gin & Tonic
@pamelabrown53: Apple doesn’t have that guy’s data, it’s on the phone. The phone is password-protected. Enter an incorrect password 10 times and the phone wipes itself in a way that nobody can recover the data. The FBI wants Apple to write custom software that will work around that 10-try limit.
MomSense
@different-church-lady:
Yup.
Couldn’t the FBI just give the phone to a bunch of teenage nerds with all the pizza and mountain dew their hearts desire?
Loviatar
It’s interesting to see the BJ commentariat fall into two camps on this issue.
Wannabe authoritarians who’ll bend themselves into pretzels in order to justify an assault on the 4th amendment.
Those who know technology and or rightly fear the government’s habit of overreaching.
catclub
@The Other Chuck: DRBG Digital random bunny generator?
A bunny is an easy shot in basketball.
WarMunchkin
@I’mNotSureWhoIWantToBeYet:
I’m not a lawyer. But:
In this case, it doesn’t appear that search and seizure is an appropriate analogy. The government can confiscate your stuff for the purposes of search and probably ask you to open it. The issue is whether a business manufacturing stuff that belongs to you must also in turn provide a way for it to be opened posthumously if their business is built on the idea that nobody can open it besides you.
This falls under the commerce clause certainly (in my understanding of it), but unless Congress passes legislation making encrypted products illegal, I’m not sure there’s anything for either Apple or the FBI to do. Not sure why there’s a fight.
Punchy
The Apple lawsuits are bananas. I mean, orange wondering why they get sued so often? Blackberry sued them, too, I think. And none of these ever bear any fruit….always reversed upon a peel. It fig’ures that now the Feds would choose this date and time to take them to court. I feel bad for the ghost of Jobs….Mangos and starts a great company, sweetens his wallet, only to pay off the ambulance chasers.
max
Plus they make the point that other governments (e.g., China) could order them to bypass security as well, and then where would it end.
Well, that’s rather the point, isn’t it? Who are the dissidents and who are the ebbil terrorists? Surely it does not depend on what country you live in? (He asked, in a jibe at rhetorical ridiculousness.)
At any rate, since the Paris attacks, the usual suspects have been on a tear about the evils of encryption (not ones to miss a political opportunity, that lot), even though the evidence suggests encryption had basically nothing to do with those attacks. Regardless of repeated assertions otherwise by the usual suspects.
In this case, the suspects destroyed their own phones before the attack, and the FBI is trying to break the work phone assigned to the male. Given that this thing has been mostly resolved, the obvious reason to do so is to use it as a political wedge to force Apple to hack its own phones.
The trick here is whether a) Apple left a backdoor, b) the phone has vulnerabilities that Apple hasn’t fixed, and c) the FBI can use the related crime as an excuse to demand the crack (if Apple has one) and use it to crack ALL iPhones. If Apple can build a crack, then obviously their security wasn’t good enough, was it? On the other hand, if the crack exists, Apple could use it on just the one phone. A court order demanding the crack itself (rather than an order asking Apple to crack the one phone without giving up the crack) sounds like a violation of the 4th here. (The 4th Amendment bars general warrants, period. Specific warrants are OK.)
Separately, yes, big bidness wants all your base, as do all the other power players, very much including the government.
max
[‘And that last part is the difficult part to solve when you’re merely a peon. That is, how do we unwind this insanity?’]
ETA: if Apple did leave in a back door, that’s on them and their unrivaled security, isn’t it?
The Other Chuck
@catclub: Deterministic Random Bit Generator. It’s an algorithm the NSA helped design, which RSA Security made the default in its BSAFE library, and it’s backdoored of course. Every reputable crypto library disables it by default.
jl
” But Apple’s objection seems reasonable to me. What say you? ”
I don’t know enough about it to say whether Apple’s arguments are reasonable; they seem plausible, but some of them seem weak to me. The feds’ argument that there should be a means to allow searches of equipment involved in crimes seems reasonable to me too.
I don’t get the argument that “hackers will find it.” How do we know hackers cannot crack the privacy software now? That sounds like the overconfident assertion that only the US had the know-how and means to make atomic and hydrogen bombs. That idea was shown to be false very quickly.
I assume anything can be hacked, and Apple seems very overconfident that it knows its system is secure now. I think this is a hard case where both Apple and the feds have some plausible arguments.
Maybe someone can explain it to me.
Some of Apple’s arguments I’ve heard seem very weak. If a similar hard case comes up in China, they will pressure Apple to do what they want regardless of what happens with this case in the US, for example.
Shortribs
@daveNYC: That was my thinking too. I find it hard to believe the FBI can’t mirror the storage drive and then blow through their brute force options on the mirrored drives.
NonyNony
@WJS:
There are chain of evidence issues that the FBI has to deal with – if they get the data off the phone in a way that seems shady a defense lawyer will be all over that. And by shady I don’t mean ‘didn’t have a warrant’ – I mean can they prove to a jury that the data they got off the phone was the data that was on the phone and not something that could have been tampered with. I’d imagine that someone clever with access to the phone could take it apart, get at the data elements, and figure out a way to at least make a copy. But try to explain that to a jury if they find evidence on there that they need to use.
But more importantly, the FBI doesn’t just want the data; they want the legal precedent that they can compel the companies that make these devices to make sure they can get the data. That’s probably more important in this particular case because it’s a high-profile case, but the perps are obvious and dead. There is, as I said above, a need to do follow-up to see if there’s anyone else out there they were working with, but it’s one of the best opportunities for them to make the case that they need this kind of access in general to do their jobs, and so they’re going to use it.
Gin & Tonic
@The Other Chuck: Yes.
randy khan
@pamelabrown53:
@The Other Chuck:
If the data even is on iCloud at all. (Mine isn’t – it’s on my phone and my laptop.)
I'mNotSureWhoIWantToBeYet
@Gin & Tonic: Software has bugs. Software interacts with other equipment and other software that has bugs. There was just a story recently about a glibc bug that has existed since 2008.
There are lots of ways for Apple to give people security from bad guys and still comply with legal warrants. They’re clever, they can find ways to do it.
If one of those ways is discovered outside of Apple, they can change the code. They patch bugs all the time. And even the hardware can be changed if they use EEPROMs or similar in the right places.
This isn’t a “perfect security vs backdoor” argument, it seems to me. There is no perfect security, and the presence of a way to get in doesn’t mean that bad guys will find it. And even if a back door is present (via bug or malicious intent), it might take, oh, 7 years for anyone to notice…
My $0.02.
Cheers,
Scott.
Mnemosyne
@Arm The Homeless:
I think you may not realize how expensive and time-consuming court cases are if you really think this is just a marketing ploy.
jl
@max: If the gov wants Apple to rewrite the code and apply to all phones just for this case, I think Apple has a point. But what I heard on the news this morning is that the government wants Apple to help them with this one phone. That point makes a big difference, seems to me.
MomSense
@sharl:
Free range cattle?
Betty Cracker
@Mike J: The China angle is addressed in the linked article toward the end.
The Other Chuck
@randy khan: Yeah, sounds like they’re looking for a local unlock. Won’t help retrieve encryption keys, since they are themselves encrypted by a password, but the FBI wants to brute-force the code. From the end-user UI. Which goes to utter FBI incompetence, because you should be able to crack open the phone and JTAG the damn thing.
That or more likely, they are competent and this particular phone is a big fat red herring. They just want remote unlock for any phone anywhere anytime.
Peale
@singfoom: Yep. It would have to be a much bigger crime than San Bernardino to convince me that unlocking a phone is worth it. The couple who shot everyone is dead. There is no emergency. They don’t need to try or convict anyone.
Maybe if we had fewer murders and mass shootings each year, I’d care more about whether this particular shooter was in cahoots with someone.
Gin & Tonic
@jl: And once Apple “helps” with this one, what’s the over/under on when the next “just this one” comes up?
different-church-lady
@Mnemosyne: Do you KNOW how much cash Apple keeps on hand?
Poopyman
@Doc Sportello: As Ars Technica notes.
I’m curious as to how the government can force a company to produce a product without compensation. Or will this cryptohammer be produced under a forthcoming govt contract? (Apologies if that’s noted between Doc S’s comment and here.)
NonyNony
@Bubblegum Tate:
That’s pretty much my question as well – if they can do it, that’s a security breach. If they can’t do it, then this whole thing is pointless.
I also do not know Apple’s security model on their phones, but I would think that unless they’re doing something really clever if you have access to the phone you should be able to pull the data elements out physically, mirror the encrypted data, and then throw your supercomputers at that. If they’re tamper-proofed to the point that cracking the thing open and trying to remove the data elements scrambles them, then I’m seriously impressed with Apple’s engineering decisions.
That’s why I say up above that I think this is more about the FBI choosing a good case for them to try to set a precedent and not really about the mechanics of getting the data off the phone at all.
singfoom
@jl: Yeah, that’s not how software works. If Apple writes this code, it would work on any iPhone. And then how can Apple be sure that software doesn’t go beyond that phone?
They can’t, and therefore they should not be compelled to write software to break into their own devices. This won’t stay in just the FBI’s hands.
Mnemosyne
@Loviatar:
So your argument is that searching a cell phone is so inherently unreasonable that the government shouldn’t even be allowed to get a warrant to do it? Or is it that warrants themselves are somehow unconstitutional despite, you know, being specified in the actual amendment?
Mike J
@Shortribs:
The FBI doesn’t have the money or the tech that the NSA does. The government is not a monolithic entity.
jl
@srv: I heard that some spaces just opened up there at the refuge for survivalist tech ranger protesters. Why not head up there?
Note: This is a one time comment exploit for srv, and I am not bound by this case (in memory of Scalia).
Loviatar
@srv:
I agree, it’s kind of sad, isn’t it? To paraphrase Benjamin Franklin: “Those Who Sacrifice Liberty For Security Deserve Neither.”
Ted
Is it just me that thinks it’s possible that the government already has what it wants and this is all an elaborate kabuki play to make the public think differently?
catclub
@Peale:
True, but the FBI wants to know if there are any others still out there – with all their plans laid out on this phone.
It could be!
The old Cheney 1% problem.
Arm The Homeless
@Mnemosyne:
If I assume a net cost of $25 million to fight the case publicly, how long will it take to earn that back when they have revenues over $200 billion?
I just don’t believe Apple really wants a spat with the Feds, especially when their inversions are catching the ire of the Dems.
I am not convinced either way, but nothing would surprise me at this point.
OzarkHillbilly
It also seems reasonable for the FBI to want into the phone. A lot of gray here, not much black and white.
jl
OK, a lot of people here are very confident that Apple knows everything for sure, and that one’s safety and liberty are perfectly secure with a very large corporation that could never do anything wrong.
I’ll listen to arguments on both sides, but I’m not willing to trust Apple’s word on this any more than I am the feds.
Mike J
@Peale:
One of their friends is being held for trial for conspiracy for other incidents. There may be info about him. More importantly, there may be other contacts on the phone that they don’t yet know about.
WaterGirl
I stand with Apple on this, and not just because I’m an Apple girl.
“You have to program a new iOS that can be installed on an existing phone that will let us get around all the existing security features. We’ll only use it this one time, pinky swear! No take-backs.”
catclub
@singfoom:
Just put in this code:
if (phone_serial_number == SB_killers) {
    break_in();
} else {
    be_good_now();
}
That will keep all the other phones safe.
Arm The Homeless
@Gin & Tonic: but it’s only the tip of this shaper-probe. I love you, baby…
Napoleon
Maybe I missed someone raising this up-thread or in other forums, but the most amazing thing about this is not getting mentioned by anyone. A few weeks ago, after getting beat up on this issue, the Obama Administration announced it would not seek to have operating systems include a back door. Now it turns around and tries to do it on the down low, via a court order that it is trying to sell as applying to only one phone.
Loviatar
@Mnemosyne:
No, my argument is that forcing a company to create a backdoor to their copyrighted and valuable software is akin to unlawful seizure, which is an assault on the 4th Amendment. You know, that old unreasonable-search-and-seizure amendment.
Mike J
@Ted:
I have no doubt “the government” has it. That does not mean the FBI does, and the FBI has to generate evidence that will be accepted in court.
Gravenstone
@sharl: Whodathunk that Cliven Bundy would engage in holistic, organic ranching?
/snark
azlib
It is quite possible the encryption cannot be broken. The FBI should give the phone to the NSA and let them have a crack at it.
singfoom
@jl:
One does not necessarily follow the other. One does not have to believe that Apple will NEVER EVER do anything wrong in order to side with them here. That’s a strawman.
The reality is that, due to technical considerations, in this case there is no good way for Apple to break the encryption (they’re not really breaking it, just disabling the # of attempts lockout security feature) for JUST THIS PHONE without providing the means to disable it for ALL the phones.
It’s not just Apple’s word, it’s mostly down to how security in software works.
MattF
For the relevant technical and legal details, daringfireball.net is currently offering lotsa links.
alhutch
Long story short, the version of iPhone in question (a 5c) is the only reason Apple can even help the Feds if they choose to (for technical reasons I won’t get into here, read this if you want more info):
http://wp.me/pfrKF-p1B
FWIW, it is a government owned phone (County of San Bernardino), not a privately owned one.
If the actual owner of the phone (i.e. the county) wants it hacked, Apple should comply.
WarMunchkin
@Loviatar: I *still* don’t think this is quite a 4th Amendment issue. To me, this looks like the FBI is requiring an ostrich to fly.
jl
@Loviatar: You seem very confused. Either the feds have a case, or they don’t, to search that one phone. Whether there is a legal means to execute that search is another matter.
dedc79
Doesn’t the FBI know they just need to click on the little Pi symbol at the bottom corner of the screen to get in through the back door?
eldorado
free kevin!
Mnemosyne
@NonyNony:
That’s why I think there’s an underlying ethical question even beyond the legal question. Some people seem to think that they have a right to conceal or destroy information even after a legal warrant is issued. Where do people here draw the line? To me, this case doesn’t seem urgent since the perps are dead, but if there was, say, a school bus full of kids that was buried somewhere in the desert, should the government be allowed to get a warrant and bust into someone’s cell phone, or is digital data inherently different than, say, getting a search warrant for someone’s house?
randy khan
@jl:
China actually is what everyone should be worrying about here. If Apple is forced to create a new tool for the U.S. government to get in, then China will be able to tell Apple it has to be given the tool, too. And, honestly, once it’s in China’s hands, it’s going to be used by the Chinese for non-law enforcement purposes (that is, espionage) and likely will get into the wild. This isn’t really even speculation – everyone knows that’s exactly what would happen.
By the way, the number of examples of situations in which manufacturers are required to create tools that let the government break into things is very small. If a lock company comes up with a lock that can’t be picked, that’s too bad for the government. If I make encryption software, I’m not required to provide keys to the government or create a backdoor the government can use, at least in the U.S. (A company that holds keys for its customers probably can be forced to disclose them in certain circumstances, but that’s different.)
This is important because Apple is being asked to make something new to solve the government’s problem. This isn’t a situation in which Apple has a key piece of information the government wants; what it has is expertise that the government wants to commandeer. That’s a big deal.
The other thing I’d mention here, even though I mentioned it on the other thread, is that the FBI likely is just fishing here. Everything on social media, in his email accounts, in texts sent via his wireless company, his Internet history, all of his phone records – all of that already is in the government’s hands through the underlying service providers. There probably isn’t that much left that’s on the phone and wasn’t somewhere else.
Mike J
@Loviatar:
The amendment that specifically allows you to go to a court and get a warrant?
jl
@alhutch: Thanks for the info. Got it bookmarked for later.
pamelabrown53
@The Other Chuck:
Thanks for your reply. It makes a world of difference.
Emma
Well, I can see another thing the government is doing. The next time a terrorist attack happens and the terrorists are using Apple phones because they are “unbreakable”, the government can set off a panic attack that will put most people on the side of more government intrusion. Imagine all the lovely laws that will be written then.
Another question that occurs to me is this: what makes cell phones so different that they can be a protected technology? Upon receipt of the appropriate warrants we are supposed to hand over paper files and electronic files. If the files are locked or passworded, we are supposed to unlock them or reveal the passwords. What’s the legal difference between those and information stored in a cell phone?
(added) And in passing, to the snark about “we were told the government had all the information”: I will remind you that the question then was about metadata, not information. There’s a difference.
different-church-lady
@azlib:
Why bother when the NSA already has the data? Snowden proved that.
WarMunchkin
@alhutch:
Still doesn’t work like that. If you forget your password to a site, password recovery is a customer service the site offers. In the case of encryption, that doesn’t make sense. Offering that service as part of the hardware invalidates the hardware. Not possible in their business model.
You’re either selling something encrypted or you’re selling something unencrypted. There is no such thing as encrypted + backdoor.
singfoom
@Emma: The person who knows the password is DEAD. Apple doesn’t know the password. They’re being asked to write custom software to allow the US Govt to brute-force (guess all the possible combinations of) the password to get into the phone.
That’s not the same as handing over a password or key you actually have.
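As a toy sketch of what that brute-forcing amounts to once the attempt lockout is out of the way (the `check_passcode` function below is a made-up stand-in for the phone’s passcode check, not any real iOS interface):

```python
# Toy illustration only: brute-forcing a 4-digit passcode once the
# attempt-limit lockout is removed. check_passcode() is a hypothetical
# stand-in for the phone's verification step, not a real iOS API.
def check_passcode(guess, secret="7392"):
    return guess == secret

def brute_force_pin():
    # Only 10,000 possible 4-digit codes; trivial with no lockout.
    for n in range(10_000):
        guess = f"{n:04d}"
        if check_passcode(guess):
            return guess
    return None

print(brute_force_pin())  # prints 7392
```

With the ten-failures-and-wipe limit in place, this search is hopeless, which is exactly why the FBI wants the limit removed.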
Mnemosyne
@Loviatar:
So we should just accept that criminals will be allowed to defy the courts and conceal or destroy evidence with no legal recourse? I guess that if we can’t actually get to those kiddie porn files, the harm done to those kids magically disappears.
singfoom
If I were chief counsel for Apple, I’d respond with a 4×5 index card of the updated iOS code to remove the attempt # lockout feature, à la Lavabit:
Lavabit on Wikipeda
The CEO of Lavabit is an American hero.
randy khan
@The Other Chuck:
Ding! Ding! Ding! Ding! We have a winner!
Corner Stone
@Loviatar:
Congrats, Loviatar! You just got Cap’t Nemo’d!
But wait! There’s more! Don’t think for a second it will end here amigo. The brain jumbles, misrepresentations and outright lies will continue!
jl
@randy khan: China learns they have the tool, so China demands they use it. And Apple says ‘No’. China wants Apple to create the tool, and Apple says ‘No’. Either way, China does whatever China can to get Apple to do what China wants. I don’t see how the court case here changes it one way or the other.
Since when does China follow US law, or limit itself to US Constitutional and legal procedure?
I haven’t made up my mind on the case. Seems to me a lot of people here credulously swallow whatever arguments the tech industry coughs up for these privacy and security issues, and some of them seem weak to me.
singfoom
@Mnemosyne: I thought this was about the San Bernardino terrorists. Are they pedophiles now too?
WarMunchkin
@Emma: Imagine that I were to manufacture a safe for you. The specifications of this safe are as follows:
Only you can open it (with your unforgeable, living DNA).
If someone attempts to open it who is not you, the contents of that safe are destroyed.
The safe is unbreakable by any physical or supernatural means. It can only be opened by you, provided your heart is beating and your brain neural scan shows you, living, not mind-controlled.
Imagine that this is a legal thing to market and that everyone has one of these for their prized possessions, and you buy it. Then, after owning it for a few years and storing some of your important items in it, you die.
According to the FBI, I should be able to open this safe. This does not make sense. My business model and contract with you requires that safe to be unopenable.
Gin & Tonic
@Mnemosyne: I store a document about my nefarious plans in a safe that is specifically designed to burn the contents when somebody tries the wrong combination too many times. Only I know the combination. The cops kill me. The FBI then asks the company that manufactured the safe to design and engineer a way to let them try different combinations as many times as they want without the contents catching fire. They say “it can’t be done.” What then?
Gin & Tonic
Wow, GMTA.
Betty Cracker
@jl: Check out the linked NYT article. Toward the end, it addresses the China issue.
different-church-lady
@Gin & Tonic:
Call Bruce Willis?
Mnemosyne
@randy khan:
To be clear, I tend to agree with Apple here. The FBI’s problem is that they’re trying to unlock the phone of a dead person, which isn’t really Apple’s problem. I’m not sure that it’s worth an entire technical workaround just to solve this one small problem.
Emma
@singfoom: In a criminal case, if the owner of the files is dead, the family of the deceased or the company he works for is supposed to hand over all the evidence if presented with a legal warrant. Including keys if they have them or passwords if they have them. And if they don’t, the FBI can go to the safe maker (for paper) or the computer company and request assistance, again using legal warrants.
If we want cell phones to be a protected class of information storage (and I do) then we have to show a reason why they are different from all other information storage equipment.
pamelabrown53
@Gin & Tonic:
Well, that makes no sense. I would think that the government, with a warrant, should only ask for and receive this one person’s password.
Gelfling545
I think Apple wins this one. I don’t think the courts will order a company to develop software it doesn’t currently have. From what I’ve read, they don’t seem strapped for evidence in the San Bernardino case, and it’s not as if Apple is refusing them access to something it has. While unlocking the phone might make their job easier (though I’m not convinced of that), I don’t think there is much to be gained from it in this case, and I suspect they’re hoping to use the precedent somewhat more broadly than they are suggesting.
different-church-lady
@WarMunchkin: WHAT IF THERE IS A LIVING BABY IN THE SAFE?! AND THAT BABY WILL DIE IF WE DON’T OPEN IT?!?
Betty Cracker
@randy khan:
That seems like a big deal to me too.
randy khan
@Emma:
A fair question. The law is that you can be compelled to turn over your phone with a warrant, just like your papers. (There’s a very interesting Supreme Court case on this point that’s pretty recent, in which the police argued that they didn’t need a warrant to look at your phone and were shot down.) And if you’re alive, you can be compelled to provide the password, just as you could be required to provide the combination to a safe.
But here the user is dead and can’t provide the password. As a result, this is more like the government going to the company that makes the safe and asking it to create a new way to open the safe because the safe has systems to destroy what’s inside if a safecracker fails. There’s really no precedent for that.
Emma
@WarMunchkin: You’re dead. Your rights ended. Especially if you’re a criminal who might have left information in that safe.
Gin & Tonic
@pamelabrown53: That person is dead. Nobody else has the password. Apple doesn’t have it and can’t get it.
Emma
@randy khan: Are you sure? Because although I’m not a lawyer, I am willing to bet that there are plenty of times when the government has cracked open the safes of dead criminals.
? Martin
@jl:
Totally different. Encryption is defined by mathematical proof. Some of these encryption algorithms are small enough to fit in a tweet, and they are unbelievably well tested because they are public. If there was a way through, we’d know about it. Vulnerabilities in encryption algorithms do not stay secret.
The atomic bombs were a matter of found knowledge, and we were the first to find it. Encryption algorithms are about knowledge which by mathematical proof cannot be found (other than by random guessing, and we can calculate how long that would take).
Anything which is found knowledge can be hacked. A secret key stored in a vault in the bottom of Cupertino can be found. It might be hard as hell, but it can be found. Encryption is special because it doesn’t fall in that category. Now your personal password for that encryption falls back into the found knowledge category, but through multi-factor authentication that can prove to be insufficient as well.
I don’t think this applies to the 5C, the phone in question, but starting with the 5S, even a 4-digit password can’t be brute-forced, because (as I suspected) the encryption is managed by a special, independent execution unit inside the CPU that the OS cannot reach. That execution unit has instructions in hardware (which cannot be changed) that add a geometrically escalating delay between failed attempts. On average, it would take several thousand years to hack a 4-digit password with the most powerful computers we believe the govt owns. That is, it’s absolutely impossible for Apple to comply with that order for any device newer than the 5C. The FBI could get really lucky, but that’s what they’d be relying on. I know the NSA can read some of the information hard-coded into the CPU using scanning electron microscopes and the like, but this would require them to effectively rewire the CPU, not just read from it.
Not kidding here, some of the stuff Apple has done to make their devices secure is completely unprecedented in the consumer space, largely because they have complete design control of the device. They aren’t dependent on buying off-the-shelf chips from other companies. That doesn’t protect them from bugs and other failings (which they have), but from a design standpoint, they are designing to a very high degree of security.
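The escalating-delay point above can be made concrete with a back-of-the-envelope sketch. The delay schedule here (starting delay, doubling factor, one-hour cap) is an assumption for illustration, not Apple’s actual hardware timing, so the resulting number is only indicative:

```python
# Rough math: time to try all 10,000 four-digit codes when each failed
# attempt incurs a geometrically escalating delay. The parameters are
# assumptions for illustration, not Apple's real schedule.
def total_brute_force_seconds(base_delay=0.08, ratio=2.0, cap=3600.0, attempts=10_000):
    total, delay = 0.0, base_delay
    for _ in range(attempts):
        total += delay
        delay = min(delay * ratio, cap)  # escalate, capped at one hour
    return total

days = total_brute_force_seconds() / 86_400
print(f"~{days:.0f} days to exhaust the 4-digit space")  # ~416 days
```

Even under these modest assumed parameters, exhausting the space takes over a year; remove the delay (which is part of what the court order asks Apple to make possible) and it drops to minutes.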
randy khan
@jl:
I thought this was obvious, but Apple has a much easier time resisting China if it hasn’t (a) made the tool in the first place; and/or (b) given it to the U.S. government.
Gin & Tonic
@different-church-lady: Schrodinger’s baby?
Mnemosyne
@singfoom:
I actually agree with Apple about the dead terrorists part, but I’m curious about the hypothetical of a living criminal who refuses to comply with a court order. If the evidence is physical, you can prosecute someone for destroying evidence, but do the digital fans (not you) think that’s kosher?
It sometimes seems that people draw a line between physical evidence and digital evidence, and I’m trying to figure out why.
burnspbesq
@Mnemosyne:
They can think that all they want … while they are sitting in prison after being convicted of obstruction of justice.
Loviatar
@WarMunchkin: @jl: @Mike J:
It always frustrates me when those I consider reasonably intelligent use semantics and/or sophistry to make their case (that’s why, in my opinion, most people hate lawyers). However, I’ll take all three of your arguments at face value and respond in kind.
The government is using its warrant powers to force a private entity to give the government, at no cost, a highly profitable and valuable piece of property. Now, I’m not a lawyer, but how is this not unlawful seizure? You make the case that they have a warrant, but having a warrant does not preclude it from being unlawful; it just means the unlawfulness is official.
Those of you willing to accept this, I guess, also accept the government, under color of a warrant, seizing pre-conviction personal property without recompense: civil forfeiture.
randy khan
@The Other Chuck:
And you shouldn’t buy any other phone, either. Apple is by far the company that is most protective of its customers’ privacy. It’s practically an obsession.
WarMunchkin
@Emma: I think we might have reversed language here. In my story, you’re the owner of the safe and have passed away. I’m the seller of the safe. The FBI wants the seller of an unbreakable safe to open the safe of the buyer after it has been sold.
pamelabrown53
@Emma:
Is there a reason that cell phone info should be a protected class? Sounds to me akin to Swiss bankers hiding ill-gotten spoils. I must be missing something, because I’m feeling more dense than usual.
singfoom
@Mnemosyne: A living criminal could be compelled by court order and incarceration. They could technically serve out their time.
I think if you could prove that someone deleted files that were proof of a crime you could prosecute them for destroying digital evidence just as with physical evidence.
jl
@randy khan: I understood perfectly well. Your point was obvious. You think it is obvious that making things easier for Apple to do or not do something is a compelling interest that outweighs any other consideration. I disagree.
I think my point should be obvious too.
randy khan
@Emma:
The government breaking open a safe on its own is the equivalent of the feds brute-forcing the password here – the manufacturer isn’t involved. That’s different from requiring the manufacturer to create a way for you to bypass the lock after the safe has been manufactured.
Gin & Tonic
@Mnemosyne:
Because in the physical space, some people can’t get their head around the concept of an un-openable safe. In the digital space, that’s what we now have. Law and public perception haven’t caught up.
Van Buren
@Mnemosyne: Be interesting to hear what Tom Brady ‘s take is on that issue…
WaterGirl
@sharl: I actually feel kind of sick to my stomach after reading that. I have zero sympathy for the disgusting creature known as Cliven Bundy. Truly reprehensible. Ugh.
randy khan
@jl:
If you think that was my point, then you missed it completely. I think it’s not in our national interest to create a situation in which the Chinese are likely to get access to technology to break into otherwise-secure phones (particularly when, as a natural consequence of the Chinese getting access to that technology, others with even fewer scruples about how they use it will get access as well).
Aaron
Encryption exists in 2 states: secure and insecure. Anything with a backdoor, or a tool, or anything else, is in the second category. Encryption is freedom of speech. Lack of encryption is the death of the 4th Amendment.
Mnemosyne
@Loviatar:
Okay, but this is a completely different argument than one about warrants and seizures under the 4th Amendment. If Apple is fairly compensated for their time and effort, is it still unreasonable to issue a court order to make them do it?
Steve in the ATL
@Loviatar:
I spend large amounts of my work and personal time with lawyers, and I find that they rarely use sophistry and semantics outside of making legal arguments in legal disputes. Most people hate lawyers because (1) they have had a bad experience with one, such as in a divorce or accident, and (2) many lawyers are just assholes.
WarMunchkin
@Loviatar: I think we’re also talking past each other. I’m saying you don’t need to even invoke the 4th Amendment because Apple is protected under existing commerce laws that take effect well before the 4th Amendment is needed to be invoked. As in – the FBI is punching at thin air.
? Martin
@Emma:
Two things:
1) Phones aren’t protected so much as they are massively harder to get into than, say, a safe. People *can* crack safes. People *cannot* crack 512 bit encryption. And the way that Apple has set up its devices, Apple doesn’t have the password and cannot get it. The reason for this is that Apple can combine software (which it could influence after the fact) and hardware (which it cannot) to secure devices, because it designed both the hardware and the software. Most companies can’t do that. An example: on a modern iPhone, everything on the phone is encrypted by default. The key for that encryption isn’t your password. The key is generated from 3 pieces of information:
+ Your password, which Apple doesn’t have
+ A unique device key which is set in hardware in your device when it is manufactured (cannot be changed later), which could be determined through forensics
+ A unique key which is randomly generated by a secure computing element within the CPU, which the OS cannot see.
Those 3 things are combined to lock your phone. If you brute forced the user password, desoldered the flash memory and put it in a new device, determined the unique device key by using a scanning microscope and put that in an emulator, you would still fail because you cannot reproduce the unique randomly generated key. And more, Apple can’t get 2 of the 3 pieces of information either. These locks are not like other locks.
2) The government can compel people to provide information, but it cannot compel people to act. That is, if you have the combination to a safe, they could subpoena that information, but if you uniquely had the skill to break into that safe, they could not force you to do it. There’s a distinction between information and action, and Apple is in part objecting that the FBI isn’t asking it to provide information but to act: to actually build something. That’s not unique to phones, but the strong security noted above leaves law enforcement with few other options.
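The three-ingredient key description above can be sketched in miniature. Everything below (the PBKDF2 combination, the salt, the key sizes) is an illustrative assumption about the shape of such a scheme, not Apple’s actual key derivation:

```python
# Illustrative only: deriving one encryption key from a user passcode,
# a per-device hardware key, and a random key from a secure element.
# This mimics the shape of the scheme described above; it is NOT
# Apple's actual algorithm.
import hashlib
import os

def derive_key(passcode, device_uid, enclave_key):
    # Entangle all three inputs; without any one of them, the key
    # (and therefore the data) is unrecoverable.
    material = passcode + device_uid + enclave_key
    # Many PBKDF2 iterations slow down offline guessing of the passcode.
    return hashlib.pbkdf2_hmac("sha256", material, b"illustrative-salt", 100_000)

device_uid = os.urandom(32)   # burned in at manufacture in the real design
enclave_key = os.urandom(32)  # never visible to the OS in the real design

key = derive_key(b"1234", device_uid, enclave_key)
other = derive_key(b"1234", os.urandom(32), enclave_key)
assert key != other  # same passcode, different device: different key
```

The point of the construction is the last line: guessing the passcode alone buys nothing without the hardware keys, which is why any attack has to run on the phone itself.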
Steve in the ATL
@Loviatar: And I have never seen anyone on Balloon Juice defend civil forfeiture. It is universally despised. Well, not sure about srv.
Emma
@WarMunchkin: Not seller. BUILDER. Different thing. But you’re right, it was a bit “talking past the other.”
@pamelabrown53: I don’t know. Should it? “Privacy” in the electronic world is a very different thing. For example, the metadata for probably most cell phone calls is captured and stored (per Snowden) to be searched if/when your name comes up in a government case. Supposedly (I am not a great believer in the fairness/liberality of governments) it doesn’t get looked at, since even the government has limited time and resources. Information, however, is a different thing, but it is clear that in criminal cases the privacy wall has been, and continues to be, breached in the modern world with regularity. If we are to argue that there are limits that apply to cell phones, we need to distinguish that from the existing privacy wall.
And the not-quite-logicality of that paragraph should show that I’m not sure of anything at this point.
Calouste
@randy khan:
That doesn’t really matter, as the government can always get in using a blowtorch. Although I think that law enforcement hasn’t really gotten its head around the fact that in the digital world things really can be inaccessible if you don’t have the key, whereas in the physical world it just takes longer, and the safe/suitcase/whatever is no longer suitable for its original purpose after you’re done with it.
randy khan
@Mnemosyne:
As a theoretical matter, if the government wants to pay what it would cost, its argument gets much stronger. However, I’m confident that it would be really expensive and that Apple might even argue that it has to be compensated for lost profits as a result of people thinking its phones are less secure. Given Apple’s profits, that would be very, very big number. (That would be a fascinating argument.)
ruemara
@different-church-lady: sssshhhh. Toss in the manpower it would take to accomplish what we were told was happening, and it was even more ridiculous.
CONGRATULATIONS!
It is not. Backdoors get hacked. Always. It doesn’t matter how well built or implemented they are (they’re usually neither); they get hacked.
Emma
@? Martin: Then what you’re saying is that criminals are protected in their criminality by the technology they use. Good luck selling that to the average human being.
I am a lot more convinced by Loviatar’s argument about civil forfeiture. (added) Or the ones about national security if the “key” were to get into the wrong hands.
jl
@randy khan:
” Apple has a much easier time resisting China if it hasn’t (a) made the tool in the first place; and/or (b) given it to the U.S. government. ”
You did make a separate argument about national security. As I understand it, if Apple were forced to create this software, it would be more likely to escape into hands that would use it to damage US national security, or at least escape sooner. That argument involves many issues beyond which requests by foreign governments this case would make it easier or harder for Apple to comply with.
Edit: if you have arguments specifically about that, or links, I will gladly read them.
randy khan
@Calouste:
The safe hypothetical requires one of those safes that seem to be key plot elements in caper flicks – with some kind of internal mechanism that destroys the contents if you try to open it and fail. I have no idea if those exist, but if they do, they’re very rare. In real life, of course, the government just gets out the blow torch or a fancy drill.
Mnemosyne
@Gin & Tonic:
I guess part of my question is, what can the law do in order to catch up with the technology and still remain faithful to the Constitution? IIRC, there was a Supreme Court case over wiretapping in the early days of the telephone, but we now have the rules and regulations worked out and very few people think that wiretapping with a warrant should never have been allowed.
randy khan
@jl:
You do understand that if Apple never creates the software, it can’t be forced to give it to anyone else, don’t you?
Mike J
@? Martin:
512 bit RSA is trivial. 4 hours and $75 worth of compute time from Amazon for a complete novice.
a hip hop artist from Idaho (fka Bella Q)
@MomSense: Yup. Cattle free to range on someone else’s land while they starve. Nasty motherfucker, Cliven Bundy.
Loviatar
@Mnemosyne:
Yes. Apple’s OS is their property and by forcing them to create a backdoor the government will make their property less valuable. In this case the government will be seizing two items from Apple; the backdoor and financial worth.
Also, please show me where the government indicates that they will fairly compensate Apple for their time and effort.
ET
This sort of feels like some chickens coming home to roost for the law/DHS crowd. The US government law-enforcement/intelligence apparatus has spent years hoovering up anything and everything with what seems to be little discretion, all while hiding behind a veil of secrecy and doing so with a huge amount of arrogance. Now, when it really matters, they are being told no, because their past actions and arrogance put these companies in a spot they shouldn’t have been in.
? Martin
@Mnemosyne:
I think the distinction comes down to the government’s ability to prove that evidence was destroyed. With physical evidence that’s usually straightforward. The fact that the phone remains locked doesn’t prove that it contains any evidence, and you cannot be compelled to incriminate yourself by unlocking it. Let’s call it Schrödinger’s evidence: it may or may not exist, the burden is on the government to prove that it does, which it can’t do, and there’s no outside party that can unlock the phone to prove it on the government’s behalf. It’s a new class of problem that relies on the protection against self-incrimination.
WaterGirl
@catclub: I surely hope that was a joke!
Loviatar
@WarMunchkin:
Sorry about that. I lumped you in with those not understanding what this case could do to the 4th Amendment.
a hip hop artist from Idaho (fka Bella Q)
@Steve in the ATL:
QFT. Semantics are the least of it – and many people enjoy semantic jousting. People generally dislike assholes, and many of the best known lawyers fit that description. Many of the least known too.
? Martin
@Emma:
That’s always been true. Simply memorize everything and you’ve achieved the same result. Memory is just another lock that cannot be picked. Citizens are not legally obligated to leave behind an evidentiary trail.
Loviatar
@Steve in the ATL:
yeah, that’s why it’s kind of surprising to see so many here not realizing that this is akin to civil forfeiture and a slippery-slope attack on the 4th amendment.
? Martin
@Loviatar: I’ll point out that the iOS business has a market value of nearly half a trillion dollars. That’s a lot of money to unlock one phone.
Cheap Jim, formerly Cheap Jim
Does anyone seriously believe Apple when they say they don’t have a way to unlock the products they sell? This is a company that is manic about keeping control.
Steve in the ATL
@Loviatar:
To be fair, slippery slope arguments are also despised here
Mnemosyne
@? Martin:
IANAL, but I think that if someone gets a court order to provide their password and refuses, they can be jailed for contempt until they agree. The destruction of evidence part is trickier, but I think most courts agree that it’s constitutional to prosecute someone for destroying evidence if you can make a case that, say, the person threw a bunch of papers on a bonfire and refused to say what they were. What I’m not sure of is if, say, reformatting a hard drive or wiping a cell phone would count (I’m kind of assuming that physically destroying those would count, but again I’m not sure).
Hoodie
@Mnemosyne: No, it’s eminent domain. The FBI is taking Apple’s intellectual property for a public purpose. The FBI is saying that Apple already knows how to do what the FBI wants (which appears to be the case for the 5c), so the property already exists. The FBI is offering different ways for Apple to hand over the property. The government can take all kinds of stuff under eminent domain, just ask Donald Trump.
Emma
@? Martin: Let’s see what happens if there’s a major (or even minor) terrorist event. United States citizens are notoriously easy to stampede into the arms of authority.
randy khan
@Hoodie:
Technically, the government is not employing its eminent domain powers, for at least two reasons: (1) that’s not the legal principle it’s using; and (2) everyone agrees that Apple does not have the software to do this right now. (The entire argument is about whether Apple can be forced to create the software.)
But even if it were eminent domain, the government would be required to pay Apple reasonable compensation, which so far it hasn’t offered as far as I can tell.
Mnemosyne
@Loviatar:
We’re starting to get into territory where we might actually need a lawyer to speak up, because I honestly have no idea if a court has already decided a case where someone had to put time/money/effort towards providing evidence to the government and what was decided about whether or not compensation was due, much less if it was a 4th Amendment violation to require that. If there really is a safe that needs to be cracked by law enforcement, does the safecracker get paid? And would that change if the company that built the safe was refusing to do the job voluntarily? I have to go do some actual work now, but I’ll be curious to see if an answer exists.
randy khan
@Cheap Jim, formerly Cheap Jim:
Apple doesn’t say it couldn’t create the software for this particular phone, just that it hasn’t. I believe that, largely because Apple has no reason to have created the software. Unlike Google, Apple is not in the business of selling information about its customers to third parties, and is quite public about that.
And in this particular case, the FBI is asking Apple to assist it in breaking a customer’s password. Even if Apple were collecting customer data, that’s a different kind of issue entirely. (And if Apple were collecting the data, then the FBI could just subpoena the data directly from Apple, rather than going through this elaborate process.)
Anoniminous
So let me see if I got this right.
1. The FBI has the phone in their hands.
2. Therefore they have the evidence they need.
Lemma: I assume the courts do not issue and police do not serve search warrants for evidence already in police custody but IANAL so WTFDIK
3. They can’t read the evidence.
4. So they want to compel a third party to read the evidence for them.
Antonius
Or this is an orchestrated public spat to convince everyone that Apple tech is perfectly private, to attract users, when the reverse is true.
JPL
@Cheap Jim, formerly Cheap Jim: I agree with you.
CONGRATULATIONS!
@Mnemosyne: You are correct. There are limits on how long a person can be held for contempt, but it’s a lot longer (I think a year) than most people would be able to tolerate.
It does. It’s called “spoliation” and is a sanction. You’d be amazed how judges don’t take it seriously with regards to digital evidence.
I just finished a six-year trial where the outcome hinged on spoliation. Nobody walked away happy, shoulda been a slam dunk but there’s a lot of judges who are not willing to deal with tech of the 20th century, never mind the 21st. (not a lawyer, but a digital forensics guy).
J R in WV
@sharl:
Off topic, but so what!
What an asshole this Bundy guy is, and no wonder his kids are so totally F’ed up. Cow abuse in my book, if he was doing that to dogs they would drag him off to jail… oh wait, he IS in jail!
Wonderful. I think they should start a whole new case of animal cruelty, with a thousand counts, against this total ASS. And his kids who know about the abuse and probably participate in it.
There are right ways and wrong ways to do most everything, and the Bundys have hit WRONG WAY on just about everything they do or think.
Hoodie
@randy khan: No, they’re not specifically invoking eminent domain, but they could say that Apple knows how to do this and take that information, even though Apple hasn’t actually created the software. Sure, they have to pay for that knowledge, and the issue is what the fair market value is. If it’s software that Apple retains exclusive control of and that can be constrained to one phone, then the value is probably pretty small. If it can’t be constrained to one phone, that’s another matter.
LongHairedWeirdo
The way I understand it, encryption can’t be “broken” – now, I don’t know if this means that you simply can’t take the raw data and determine what it represents, or determine the decryption key, or if it means “we could, but the fastest computers would take thousands of years.”
Now, Apple could maybe read the raw data from the iPhone and give them the data in a format that lets them try dictionary attacks against the password, without a “10th try wipes the data” restriction. But a good password will, again, take a lot of computing power to guess. That can be sped up if you have a password hash (or maybe a password hash table – with many examples).
What the government wants is to have Apple have *two* decryption keys – their own, personal, private one to be used (Only when Absolutely Essential and Vital To National Security of course!) and another – the owner’s password. Due to the trickiness of allowing decryption by two passwords, it weakens encryption significantly. Also, if 100 ASIC-based machines would have to labor for 10 years to crack a password, it’s not worth it to crack one password for one phone. But if that’s what it takes to crack the skeleton-key password of every iPhone out there, the NSA would have 10-100,000 of those machines operating as fast as they can be built.
jl
@randy khan: Yes I understand that.
I think what makes things easier for Apple to do or not do is a separate issue from the desirability of them being required to create it.
Andrey
@Mnemosyne:
This is currently unsettled, and has been ruled on in both directions by various district courts. There has not been a definitive Supreme Court decision. In re Boucher (2009) ruled that giving up the password would be self-incrimination, but was reversed in part because Boucher had already shown some of the unencrypted data to a border agent. In US v. Fricosu (2012), a judge ruled that the defendant must decrypt a hard drive, but in US v. Doe (2012), an appeals court ruled the opposite: that requiring an unencrypted hard drive violates the Fifth Amendment.
Given that this is not only unsettled at the SC level, but has specific cases ruling in opposite directions, it’s unlikely that anyone but perhaps a constitutional law scholar could accurately predict what the ultimate resolution will be.
A Ghost To Most
@Cheap Jim, formerly Cheap Jim: not an Apple fan, but yea, I believe it. Their business is built on it.
RSA
@alhutch:
Hey, cool, thanks for the link. I was wondering how this could be possible (from the article):
So that all makes sense. It also means that (I think) this really could be a one-off exploit that couldn’t be used in other cases without Apple’s direct involvement.
singfoom
Here’s a good technical breakdown that I found:
Trail of Bits Blog
For all the people asking how you could load the custom software, a snip from the article:
Even given what was said in the article, I still think Apple is right to fight this as much as they can.
J R in WV
@different-church-lady:
“Do you know how much cash Apple keeps on hand?”
Billions and billyouns and billions of dollars, like stars in the sky!!
About that much, amirite?
A Ghost To Most
Wierd thought (IANAL); can the government force you to build the gallows you are hung with?
Andrey
@LongHairedWeirdo: This is not entirely correct. Here’s the key element:
This is actually exactly what they want from Apple. They want a way around the “10th try wipes data” restriction so they can brute-force the password. They are not asking for a separate “master password” in this case. Other cases may have requested something like that, but that’s not at issue here.
different-church-lady
@J R in WV: Pft. That’s less than what Tim Cook could find in the sofa cushions.
Cheap Jim, formerly Cheap Jim
@A Ghost To Most: I thought their business was built on stylish design, cheap East Asian labor and tax avoidance.
But my point, to the slight extent that I had any, was that this is a company quite willing to brick one’s phone for taking it to a third party for service*. Leaving themselves no way in seems inconsistent.
*From what I’ve heard. My phone folds in half and costs me about 7 bucks/mo.
Anoniminous
@different-church-lady:
Roughly $216 billion in cash and cash equivalents.
Mike J
@LongHairedWeirdo:
6 digits, no alpha allowed, is not a good password. That’s only a million combinations. People will not enter strong passwords every time they pick up their phone, unless it’s done with something like a fingerprint scanner. Of course, if you already had a copy of the person’s fingerprints, you could feed the correct signal to the chip, and it would be easier than guessing a password.
True story: in between reading and writing comments on this post, I got into a laptop for a FoaF whose brother had died. Of course no special protection had been taken beyond a login password or it might have taken me longer than 5 minutes.
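To put the million-combinations point in numbers, here’s a rough back-of-envelope sketch in Python. The ~80 ms per on-device guess is an assumed figure (in the ballpark commonly reported for iOS’s passcode key derivation), not a spec:

```python
GUESS_TIME_S = 0.08  # assumed: ~80 ms per on-device passcode try

def worst_case_hours(keyspace: int) -> float:
    """Hours to exhaust a keyspace at the assumed guess rate."""
    return keyspace * GUESS_TIME_S / 3600

print(f"4-digit PIN:   {10**4:>13,} keys, {worst_case_hours(10**4):10.2f} h")
print(f"6-digit PIN:   {10**6:>13,} keys, {worst_case_hours(10**6):10.2f} h")
print(f"6-char a-z0-9: {36**6:>13,} keys, {worst_case_hours(36**6):10.2f} h")
```

Even a 6-digit PIN falls in about a day at that rate once the retry limit is gone, which is exactly why the limit matters.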
sylvainsylvain
Something else at play here…
Apple really wants people to start using their phones as a payment mechanism/platform (ApplePay?, can’t remember the name ATM). If Apple gives the FBI the way to get into this particular phone, the technique will make its way into the wild. Eventually.
As it stands now, Apple’s made it as secure as possible to use your phone to pay for stuff (say, like a credit card). They’re catching flack for making the newer phones (the ones w TouchID) brick themselves if the fingerprint reader is repaired by anyone other than themselves; if you fvck up yr phone, & get it repaired by the shop down the block, when the next iOS update comes down, yr phone is dead. Deaddeaddead. A paperweight dead.
**Let’s not get bogged down w the wisdom of that particular aspect, let’s just acknowledge they’re being consistent w their desire to keep the system secure**
So, if they let the FBI backdoor into their system, they’re gonna lose whatever security they have to make an iPhone a payment platform. A hacker will be able to break in, get yr payment info, steal all yr $, and piss u off. Who can u sue? The FBI? Good luck w that. You’ll sue Apple.
So Apple’s looking at this like their future relevance (at least as a payment platform, which they see as the future) is diminished, which is the fear of all big tech companies. They’ve spent $, time, and talent trying to make a go of this, to the best of their ability. And now the FBI wants them to throw all that away.
TL;DR? Apple’s not doing this because they care about your privacy, at least not to the extent they care about YOU, or ethics, or privacy as a concept. They care about the money to be made by making the iPhone a more integral part of your life. They don’t want that jeopardized.
different-church-lady
@Anoniminous:
The Green Stamps alone fill three warehouse buildings.
A Ghost To Most
@Cheap Jim, formerly Cheap Jim:
Agreed about Apple and their business model; I have never owned a single Apple product. But their marketing (and as I am learning here, hardware) is based on the security of the device and they will fight to protect that.
Steve in the ATL
@A Ghost To Most:
Hanged, not hung.
Sorry; one of the reasons that many lawyers are assholes is that many of us are pedants.
LongHairedWeirdo
@Andrey: Thanks, Andrey, I didn’t know that.
Bill Arnold
@hamletta:
Thanks for the macworld link. Now I’m only wondering how the password guess rate limiter can be disabled. The article says that the rate limiter is in hardware; not sure what that means, or how tamper-resistant the hardware is.
People are underestimating the sophistication of hardware-level attacks given resources at national-lab level. Think of it as partial vivisection at the microscopic hardware level.
The FBI is not tossing the problem to the NSA et al, but demanding a vendor-provided facilitation of a brute force attack, and the motive is probably to set a precedent (and develop some public sympathy to their broader anti-encryption demands) as suggested here. Edit – and as suggested, the integrity of the phone as a payment platform is at risk. (Did not think of that, oops.)
(iPhone passcodes can be long and alphanumeric/special characters if you turn off “simple passcode”.)
Sad_Dem
@fuckwit:
I suspect that the terrorist attack offers good cover for the government’s letting people know it can crack their handheld tracking and surveillance devices, er, phones.
TallPete
Glad I use an iPhone!
D58826
@Loviatar: I’m not sure it is an assault on the 4th amendment. Courts issue search warrants all the time as part of a criminal investigation. I’m not sure why the data on an iPhone is so much more sacred than the data in my desk drawer.
As a long-time card-carrying member of the ACLU, I understand why Apple doesn’t want to just turn the keys to the phone over to the FBI. I’m fully aware of the abuses that have happened in the past. On the other hand, there are bad people out there who use this technology to plot terrorist attacks that kill lots of innocent people. For some unfathomable reason the FBI did not pursue a FISA warrant in 2001 after they arrested Moussaoui. After 9/11, when they did access his laptop, there was information on it that might have helped prevent the attack.
If there was a major attack with a heavy loss of life and it turned out that the FBI could not access information on an Apple phone that might have stopped the plot, I’m not sure I would want to be the Apple executive who has to explain to a Congressional committee why they did not cooperate.
As far as this technology falling into the hands of the Chinese or the Russians, I suspect they have enough smart people and enough time that they will find a way to access the phones with or without Apple’s help.
Since all the FBI is asking for is a way around the code that will erase the phone after ten tries, isn’t there some kind of compromise where the FBI and Apple go into a secure facility, Apple techies do what they have to do to bypass the block, and then give the phone back to the FBI without giving the FBI the actual software? I just find it hard to believe that there isn’t some way to get around this. Or are we doomed to a world in which the bad guys get to encrypt their plans and the good guys are simply left to count the dead? Yeah, I know that may be a bit of overkill, but I just don’t like the idea of saying, well, it’s in the hands of fate.
Andrey
@Bill Arnold: The rate limiter operates differently in different versions of the iPhone. In this particular version, the rate limiter is actually implemented in software, which is why there’s even an option of using software to bypass it.
@srv: It’s technically true that the update could be written to only work on one hardware signature. However, 99% of the update would be reusable for any other hardware signature within that class of iPhones. In other words, the next time the FBI or someone else wants to get Apple to do this, they would just have to say “use your existing solution and target this other hardware signature”. Or they could try to force Apple to remove the hardware-signature validation after the fact. I don’t know how these tactics would fare from a legal perspective, but I can certainly understand Apple’s reluctance to let them even become a possibility.
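The “99% of the update would be reusable” point can be illustrated with a toy sketch. Everything here (names, IDs, the format) is invented for illustration; it just shows that binding an image to one device is a single constant, so retargeting means changing that constant and re-signing:

```python
def build_image(payload: bytes, device_id: str) -> bytes:
    """Toy 'firmware' that embeds the one device ID it will run on."""
    return device_id.encode() + b"|" + payload

def will_run(image: bytes, actual_device_id: str) -> bool:
    """The device refuses any image bound to a different ID."""
    bound_id, _, _ = image.partition(b"|")
    return bound_id.decode() == actual_device_id

payload = b"bypass-retry-limit"           # the hard part: built once
img_a = build_image(payload, "DEVICE-A")  # bound to the target phone
print(will_run(img_a, "DEVICE-A"))        # accepted by the target
print(will_run(img_a, "DEVICE-B"))        # refused elsewhere
# Retargeting reuses the entire payload; only the constant changes.
img_b = build_image(payload, "DEVICE-B")
print(will_run(img_b, "DEVICE-B"))
```

The hardware-signature check protects the ecosystem only as long as nobody can compel a rebuild with a different constant, which is exactly Apple’s worry.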
randy khan
@jl:
I understand that you think that, but it’s not true. Consequences of actions aren’t really that separate from those actions.
PghMike4
I think Apple’s most concerned about setting a bad precedent: that it can be ordered to work to remove the security protections it built into iPhones. Right now, the request is that they generate firmware that sends keys to a piece of hardware without extra rate limiting. In 5S and later phones, there’s a secure enclave chip that makes the passcode testing much harder, since the enclave itself performs rate limiting. So, theoretically, this attack will only work on 5Cs and earlier.
But in the future, Apple MIGHT find itself ordered to make use of any bugs it later discovers in the enclave, or even to physically modify an enclave, to allow the same types of attacks on a more secure phone. And they might be required to turn over the resulting firmware to the FBI to use on other phones (along with keys to allow the FBI to sign it for other phones). Apple no doubt wants to avoid even starting down this slippery slope.
Also, note that although the keys used to encrypt phone data are highly random AES keys, those keys are protected only by the passcode, and for most people, those passcodes are way too small (4-6 digits). So, while someone who’s really careful could make it very hard for the FBI to break into a phone, even with Apple’s help, a novice (99.99999% of Apple users) would use a passcode from a much smaller space, conceivably small enough that with Apple’s help, and a few bugs in the enclave here or there, the FBI could break into the phone.
Most significantly, this level of detail is way too technical for nearly everyone following this story. To them, the question is going to be a very simple “Are iPhones secure from the government, or are they not?” Apple wants the answer to be “Yes, they’re secure.” So, I bet they’ll fight this all the way to the Supreme Court. Where, of course, it’ll tie :-)
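The weakest-link argument above can be made concrete with a small sketch (assuming a 256-bit AES data key, which matches what’s commonly reported for iOS, and uniformly chosen passcodes):

```python
import math

AES_KEY_BITS = 256  # the data key itself is a random AES key

def entropy_bits(keyspace: int) -> float:
    """Entropy of a secret chosen uniformly from `keyspace`."""
    return math.log2(keyspace)

# An attacker guesses the passcode, not the AES key, so the
# effective strength is the minimum of the two.
for label, space in [("4-digit passcode", 10**4),
                     ("6-digit passcode", 10**6),
                     ("10-char alphanumeric", 62**10)]:
    effective = min(entropy_bits(space), AES_KEY_BITS)
    print(f"{label:22s} ~{effective:5.1f} effective bits "
          f"(key itself: {AES_KEY_BITS})")
```

A 4-digit passcode leaves only ~13 bits guarding a 256-bit key, so the passcode, not the cipher, is what actually gets attacked.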
Grumpy Code Monkey
@pamelabrown53:
Apple does not have access to “the data” any more than the FBI does; there’s nothing for them to hand over.
What the FBI is demanding is that Apple disable a feature in the iPhone that prevents anyone from “brute-force” guessing your passcode; basically, you get a few tries to enter the passcode correctly, and if you fail so many times in a row the phone is bricked. The FBI wants Apple to remove that limit so that they have an unlimited number of tries to get the passcode right. They are compelling the manufacturer to defeat a security device on their own product.
Apple is claiming that they do not currently have the ability to disable this feature (which is probably true, assuming they did it right). The government is demanding Apple develop that ability, and Apple is balking.
Before going any further, let us note: the suspects in this particular case are dead; the FBI already has access to their home computers, they have records of their online activities, etc. There’s no ticking time bomb here, and it’s very likely that there isn’t anything on the phone that the FBI doesn’t already have. Also note that Apple has complied with search warrants in the past, retrieving data from phones and other devices as requested.
Apple is calling this a case of government overreach, and I don’t disagree; again, the suspects are already dead and the FBI already has access to records of their online activities. If the government can’t make a case with the evidence it already has, then whatever’s on the phone probably won’t change that.
If the government is successful in compelling Apple to develop new technology to cripple their existing security, then every iPhone is vulnerable. If they patch that one phone and hand it back over to the FBI (which they must), then the FBI will have access to that patched OS, and can use it to brute-force their way into any iPhone without Apple’s assistance.
It’s a nasty, nasty precedent to set.
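The retry-limit behavior described above can be modeled with a toy class. This is a hypothetical sketch, not Apple’s implementation; the delay schedule is loosely modeled on iOS’s escalating lockouts:

```python
class PasscodeGuard:
    """Toy model: escalating delays, keys destroyed after 10 failures."""

    MAX_FAILURES = 10
    # seconds a caller must wait after the Nth consecutive failure
    DELAYS = [0, 0, 0, 0, 60, 60, 300, 900, 900, 3600]

    def __init__(self, passcode: str):
        self._passcode = passcode
        self.failures = 0
        self.wiped = False

    def delay_before_next_try(self) -> int:
        return self.DELAYS[min(self.failures, len(self.DELAYS) - 1)]

    def try_unlock(self, guess: str) -> bool:
        if self.wiped:
            raise RuntimeError("device wiped")
        if guess == self._passcode:
            self.failures = 0
            return True
        self.failures += 1
        if self.failures >= self.MAX_FAILURES:
            self._passcode = None  # keys destroyed: data unrecoverable
            self.wiped = True
        return False

guard = PasscodeGuard("4951")
print(guard.try_unlock("0000"), guard.delay_before_next_try())
print(guard.try_unlock("4951"))
```

What the FBI is asking for amounts to a build in which `MAX_FAILURES` and the delays are effectively removed, turning the lock back into a pure guessing problem.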
randy khan
@Hoodie:
The government quite specifically is *not* asking Apple to give it the information on how to create a custom version of iOS that bypasses the security set up in the phone. (That would be much worse from a security standpoint than what the government actually says it wants – it would give the government the ability to break the security on any phone.) It’s asking Apple to make the custom version of iOS and use it on the phone.
And, as others have said, the likelihood that it really could be constrained to one phone is pretty close to zero. Not to mention that the cost may not be limited to what it costs to make the software, but might include the impact on the market for Apple products, which is a very big number.
Andrey
@D58826:
I’m not aware of any precedent for the desk manufacturer being forced to open the desk for the FBI. That’s one of the major problems in this situation – no one is questioning whether the FBI has the authority to attempt to access the data; one of the points of contention, however, is whether they can force a third party, which is not a suspect in the actual crime, to help them out.
Betty Cracker
@sylvainsylvain: Excellent point.
PghMike4
@LongHairedWeirdo: I wondered about this as well. I think the issue is that while the encryption is nearly impossible to break, the encryption keys on an iPhone 5C are only protected by the passcode, and the passcode was probably just a 4 digit code. So, with 10,000 tests, they’ll get the keys.
If the previous owners used a long alphanumeric string, the FBI will be screwed anyway, but I doubt they did.
And Apple knows that most people aren’t following this level of detail. The real question to them is simply this: can the FBI break into my supposedly secure phone, or not?
J R in WV
@sharl:
Thanks for posting this link, it has a lot of details that shed a lot of light on these guys’ history.
I’ve read the whole thing – Cliven Bundy won’t ever see the sky without bars across it. I expect his sons, all of his family, will be in jail for a very long time.
And most of those civilian defenders who came to his side, they can be picked up by Federal LEOs at any time. They’ve been looking deep into these folks’ crazy, and taking names for a list, a long list.
I’m grateful for the Federal LEOs slow approach to these guys. The crazy runs very deep, mostly I believe out of greed, which is obviously running deep in Cliven’s mind. Don’t want to pay grazing fees, don’t want to pay taxes, just want to take and take and keep. Greedy SOBs, the whole lot of them.
Davebo
If only the guy had upgraded his phone! They could just dig him up and cut off his thumbs….
Betty Cracker
@Steve in the ATL: Aha — perhaps that explains the unusually high pedantry levels around here; you can’t swing a cat without hitting a barrister, and you can’t misconjugate a verb without triggering a correction.
Mingobat f/k/a Karen in GA
@WJS:
Or because people get work-related emails containing confidential info on their phones, and mega-corporation employers will require their employees use different phones if the mega-corps aren’t happy with the security on those phones. If I couldn’t do work-related stuff on my iPhone, I’d switch to a different phone in a heartbeat.
Grumpy Code Monkey
Dan Goodin at Ars Technica says it better than I did:
Emphasis added.
J R in WV
@Gravenstone:
That’s good, that’s very good!! Thanks for the excellent snark!
B-J, where only the very best snark is served!
D58826
@Andrey: I take your point, but let’s change the facts a bit. The FBI gets a warrant to search your bank account. The data is on the bank’s computers. The FBI is asking a third party to provide them with your information.
I’m not arguing that the FBI should be given free rein over all Apple phones, and I understand the precedents that may be set. I’m just not comfortable with the idea that the bad guys can lock up their attack plans on a phone and the best the FBI can do is say, well, we will know what the plan is when it happens. I believe it was a SCOTUS justice who said the Constitution is not a suicide pact. I would lean more in the direction of Apple’s position if we were talking about your average run-of-the-mill bank robber or drug kingpin, but terrorists are in a different class of nasty.
Steve in the ATL
@Grumpy Code Monkey: I thought it was well established that the FBI was not asking for, much less requiring, the source code from Apple. They just want the phone unlocked.
Steve in the ATL
@Betty Cracker: Warning: it’s even more concentrated over at Lawyers, Guns, and Money!
Betty Cracker
@Steve in the ATL: Jaysus, thanks for the heads-up — I’ll continue to stay out of the comments at that excellent blog!
randy khan
@Grumpy Code Monkey:
This is a really important point. The FBI is asking Apple to create brand new software for one phone (something that easily could cost 7 figures or more, by the way) to enable a fishing expedition that probably will turn up nothing of use the FBI doesn’t already have.
randy khan
@D58826:
But on those facts, the bank has access to the data already because it needs the information to run its business. Apple is much more analogous to the desk manufacturer or the company that makes the bank’s computers – it doesn’t have the data in the first place or any way to get the data right now, and is being asked by the government to create a way to get to the data.
Billy K
@Arm The Homeless: Wow. Irritable AND inscrutable. You must’ve had the acid today, not the kool-aid.
Hoodie
@randy khan:
Which means that the FBI is being more reasonable than you seem to think. I was saying that they could take the information as to how to do it under eminent domain, not that they actually are asking for that directly. Seems like the FBI is trying to give Apple the opportunity to do this in a way that is less likely to compromise their proprietary information.
D58826
@randy khan: True, there is no ticking bomb, but it’s an assumption that there is nothing on the phone that the FBI can use. Maybe the fact that there is no ticking bomb is a good thing in this case: it gives everyone a chance to argue the pros and cons, lets the FBI and Apple maybe come up with a compromise solution, and lets us figure out how we can maintain secure online communications without giving the terrorists an unbeatable advantage (and yes, the terrorists can always go back to paper and pencil, but it is slower and more vulnerable to exposure). All without the imminent threat of an attack. This issue will come up again and again, and we might not have the time to argue about it then. Better to sort out the legalities now.
WarMunchkin
@Andrey: @singfoom:
Having looked at that, I take back my safe analogy. It’s completely wrong. However, it looks like the iPhone 5c is straight-up hackable. The FBI could just obtain the firmware signature used by Apple via warrant. Why not do that instead of doing this third party force-to-act nonsense? (Besides the fact that it would tank Apple’s software security completely).
A Ghost To Most
@Steve in the ATL:
You could have at least answered the question, pedant.
Lawyers.
Bill Arnold
@Andrey:
Now I’ve gotten to wondering about the physical security for the Secure Enclave (not that it matters in this particular case). I haven’t found much hardware-level detail, just this old (2014) backgrounder Why can’t Apple decrypt your iPhone? and related (also 2014) What is Apple’s new Secure Enclave and why is it important?
Do you know, or know of a better description?
D58826
@randy khan: The FBI isn’t asking for the data. They just want a way around the 10-tries-and-the-phone-is-toast block. Yes, the bank needs your data to do its business, but Apple needs to sell the security of its phones if it wants to move into the Apple Pay business in a big way.
All of this is part of the brave new world of secure digital commerce and the state does have a legitimate interest in being able to access the information of the bad guys. We just have to figure out where the balance point is between privacy and the protection of the public.
randy khan
@Hoodie:
I’m not sure what you think the FBI could demand that Apple provide – the knowledge in the brains of its engineers? There’s really no precedent for that. But it’s really a moot point, since this is under the All Writs Act and not some eminent domain statute.
Loviatar
This is a bad precedent, people, a very, very bad precedent.
You’re saying you’re willing to give a Ted Cruz (or pick your worst-case scenario) the precedent, based on some vague, non-critical security threat, to force a private entity to create, at no cost, something which will devalue their property. This item, once created, has the potential not only to impact that private entity’s financial worth; it may also severely impact the privacy and security of every user of that entity’s products.
This precedent may not be limited to federal-level crimes; you might also see it used to force compliance in state-level criminal and/or civil cases, or anything else a creative lawyer can think up. I remember when E-ZPass logs were never going to be accessed, then only for criminal cases; now they’re being used in divorce cases.
This is bad.
randy khan
@D58826:
You’re right that I’m making at least a bit of an assumption about what’s on the phone, but so is the FBI. And the FBI’s argument really is that there might be something there that it hasn’t found elsewhere that’s relevant to its investigation. Given how extensively the FBI already has looked at the rest of the perpetrators’ lives, the likelihood is that this is in the nature of scouring the bottom of the pot to see if there’s one more piece of rice there.
As an aside, keep in mind that the main tools for dealing with terrorists – surveillance, for instance – are unaffected by this entire brouhaha. The FBI only gets people’s phones after they’re arrested or dead.
Andrey
@D58826:
That’s qualitatively different. “Give me information” and “Build a tool” are not the same thing.
It is generally considered to be a fundamental tenet of the US justice system that the severity of a crime does not alter the requirements of due process, or alter the limitations to the government’s ability to compel private entities to action. Obviously, that is not a universal view, and there have always been actions taken that violate that principle, some of them even enshrined in law and Supreme Court decisions.
Many people and many organizations hold those violations to be inherently problematic. The simple answer to your concern is “Yes, some criminals ‘getting away with it’ is a totally reasonable and acceptable price to pay.” Even if those criminals are terrorists, and ‘it’ is terrorist plans or acts.
D58826
@randy khan: Where would you draw the line in a ticking-bomb scenario, which I realize only happens on 24? But for the sake of the argument, suppose that in 2001 it had been a secured phone instead of the laptop. The FBI and the CIA are on to a plot based on other investigative leads, but they have this phone that may or may not provide important clues to stopping the plot. What would you recommend?
Andrey
@Bill Arnold: Unfortunately, I’m not as familiar with that information. Hardware hacking is outside my area of direct knowledge.
Baud
Based on what I know, I support Apple.
D58826
@Andrey: Eliminate the TSA security at the airports, then? Given the overreaction to even minor terrorist-related events in which no one was hurt, I suspect that in the event of a major attack that might have been prevented by unlocking a phone, the national security state will explode in its scope. It just seems better to try to come up with a way to have our penny and cake both. If possible.
Andrey
@D58826: The recommendation is to look for other leads. The problem you’re proposing is no different from any other reason to disregard due process or any other constitutional protection. For every single right or protection afforded by any law, constitution or treaty, it’s possible to construct a “ticking bomb scenario” in which the choice is to throw away that protection for a single individual – or let hundreds or thousands die.
The answer I consider to be most correct is to stand by the principles that led to those protections, and assert that letting hundreds or thousands die is a reasonable and acceptable price to pay in those circumstances. Because as you clearly acknowledge, such time bomb scenarios are extraordinarily rare; and standing by our principles every single time prevents a great deal of harm, saving – in the long run – much more than thousands of lives.
Feathers
This is a national security issue. Is the United States stronger by being the technology leader of the world and having Silicon Valley be the driver and innovator of cyberspace throughout the world, or are we going to fuck up and throw that out the window because the FBI and NSA can’t stand not having every piece of possible evidence available to them? If the world stops trusting US computer products, that is far more dangerous than anything the San Bernardino shooters did. The largest danger from terrorism is overreacting.
Also, Apple sells more iPhones in China than in the US. We are not the market for iPhone encryption. My own reply would have included something about getting back to me when you’ve passed meaningful gun control reforms.
alhutch
@WarMunchkin:
In this case, it actually is. If you read the link I posted, the 5c (and probably earlier iPhones) has a unique weakness that would allow Apple (and them ALONE) to get into the phone and make the firmware changes the FBI wants so they can hack into the phone.
To take it a step further, Apple could open this 5c, dump the contents and just hand those contents to the FBI. No “special sauce” software has to leave Apple HQ. Dead guy didn’t even own the phone and this has no ramification on any iPhone from the 6 onward (which Apple couldn’t hack, even if they wanted to because of Secure Enclave), so the hand wringing in this thread doesn’t make a ton of sense to me.
I'mNotSureWhoIWantToBeYet
@D58826: I think there’s a reasonable solution to the ticking time bomb problem. If there is information that cannot be obtained in time except by illegal means, then the people there are certainly free to use their judgement and break the law.
And when the episode is over, they should surrender and face the legal consequences for breaking the law.
What we shouldn’t do is say the rules get thrown out in a ticking bomb situation.
In your example, of the information on the phone, I don’t think the ticking bomb situation applies. Even if they got the information, there’s no reason to assume that it would have been appropriately acted upon in time. If they broke into the phone illegally, then the evidence wouldn’t have been admissible in court anyway.
My $0.02.
Cheers,
Scott.
Andrey
@I’mNotSureWhoIWantToBeYet: This is also an acceptable solution that I neglected to mention in my own reply to the time bomb scenario. “Another human is willing to suffer the full consequences” is a reasonable safeguard.
alhutch
@RSA:
We have a bingo!
WarMunchkin
@alhutch: Yeah, I agree with you now. Could the FBI get Apple’s firmware signature via warrant? Would be a bit like stealing the Krusty Krab’s secret recipe, though.
D58826
@Andrey: Logically I generally agree with you, but I don’t think the political system of constitutional checks and balances will survive another major attack, especially if it turns out that there was a lead that could not be followed because Apple said no. Look how much ground we have lost to the national security state since 9/11. We lost 4 people in the attack on Benghazi, and 4 years later it is still a major issue. We can’t even agree to try KSM in an Article 3 court in New York, in spite of sending dozens of people to jail on terrorism charges.
Andrey
@D58826: It seems like your proposal to avoid increase of the national security state is for Apple to capitulate to the national security state, under the theory that if they don’t give an inch, the state will take a mile. If that is how it works, then there’s simply nothing we can do, because we have to either give in bits and pieces or give all at once, and we will never stop the process.
The opposing school of thought is that the way to stop the security-state expansion is precisely by opposing it, and I don’t see a reason to believe that this is doomed to fail.
D58826
@I’mNotSureWhoIWantToBeYet: I’m not suggesting that people act illegally. What I’m saying is that we are confronted with a new technology, and we have to figure out how to fold it into our existing legal system in a way that does the least amount of damage. I suspect there was all manner of argument during other periods of technological change.
Andrey
@alhutch: The concern is not that this could be applied to other iPhones. The concern is that this could be applied to other situations and other companies.
trnc
@alhutch:
Agree completely. Apple gets the data off the phone and hands it over to the feds, and should expect to do that any time there’s a warrant. Feds get the data and Apple hasn’t given up any trade secrets.
alhutch
@sylvainsylvain:
This is a non-issue. Any Apple Pay enabled phone (5s or later) would not be subject to the “exploit” (for lack of a better term) that Apple could use on the 5c in question. Any iPhone with Touch ID (fingerprint scanner) can NOT be opened by Apple under any circumstances. The 5c in question does not fall into this category.
One of the reasons they made the encryption changes to the iPhone (again, starting with the 5s), is so they can’t be compelled to pop them open by anybody. They really can’t do it, even if they wanted to.
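The design alhutch is describing can be sketched in a few lines. This is a simplified illustration, not Apple’s actual implementation: the function names are hypothetical and PBKDF2 stands in for whatever key-derivation function the Secure Enclave really uses. The point it shows is that when the encryption key is derived from both the passcode and a device-unique hardware secret, the same passcode yields different keys on different devices, so brute-forcing has to happen on the device itself:

```python
import hashlib

def derive_key(passcode: str, device_uid: bytes, iterations: int = 100_000) -> bytes:
    # Mix the user passcode with a device-unique secret (the hardware "UID" key),
    # so the resulting key can only be computed on hardware that holds that secret.
    # PBKDF2-HMAC-SHA256 here is a stand-in for the real key-derivation function.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), device_uid, iterations)

key_a = derive_key("1234", b"uid-of-device-A")
key_b = derive_key("1234", b"uid-of-device-B")
assert key_a != key_b   # same passcode, different hardware: different keys
assert len(key_a) == 32  # SHA-256 output length
```

An attacker who copies the encrypted flash storage off the phone gains nothing, because the key material needed to try passcodes never leaves the chip.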
Andrey
@D58826: The least amount of damage is to not force companies to break encryption. The damage from breaking encryption is far greater than the damage done by all encryption-using criminals combined, terrorists included.
alhutch
@WarMunchkin:
I hope the FBI doesn’t get anywhere near the keys needed to alter the firmware. If they did though, it would be a backward looking solution (only good for older phones).
D58826
@Andrey: I would like to think that a strong judiciary would keep those ‘inches’ few and far between, under carefully controlled rules. Unfortunately, since 9/11 the courts have been way too accommodating to the national security state when they didn’t have to be.
I'mNotSureWhoIWantToBeYet
@D58826: I agree that the law needs to catch up here. I mentioned early in this thread that I do not see much of a difference between a warrant for information on a phone and a warrant for information in a filing cabinet. IIRC, when MS was subpoenaed during the anti-trust case, they were required to provide information to investigators in a usable format. It seems to me that that was a reasonable request. Apple should do the same thing here, as they are the only people capable of doing so.
I’m not sympathetic to the argument that Apple’s marketing should trump a legal search warrant.
I’m also not sympathetic to the argument that if some technology exists then Big Brother or Big Hacker will necessarily use it against us. Big Brother is constrained by the law. Big Hacker is constrained by IT companies being clever. Neither are arguments that law enforcement cannot have access to information on a device simply because Apple says it’s too hard.
If information can be subpoenaed, and warrants for information can be issued, then companies that supply IT infrastructure must recognize that they will be served with warrants to supply that information. Trying to be clever by saying “it’s all encrypted, sorry” isn’t going to cut it for long. The law will change, or people will revolt.
“Why can the cops turn my house and my business upside down after serving a warrant, but Tim Cook can sit in Cupertino with his billions and not have to provide information because he says it’s encrypted??”
I see the arguments on the other side.
We’ll see what happens…
Cheers,
Scott.
alhutch
@Andrey:
Well, I’m guessing the Android universe doesn’t have the level of security that Apple implemented in the 5S, so your concern might very well be justified.
Andrey
@I’mNotSureWhoIWantToBeYet: The analogy has been made before, but to restate and perhaps summarize: the difference is that filing cabinets are flawed, and it is possible (albeit, yes, difficult) to build flawless encryption.
Suppose that a company is capable of building physical safes so utterly secure that not the FBI, not the CIA, not even the company itself can break into them. Would it be reasonable for the FBI to demand that the company stop selling such safes, and/or build a weakness into all future safes? That’s a major component of the current encryption concerns.
This case isn’t quite the same thing, but the reason it’s receiving so much attention is because it occurs in the context of many other requests and orders that have actually been exactly that same thing – not just requiring the company to exploit an existing flaw, but actually ordering the company to build flaws into its future products.
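Andrey’s “flawless safe” has a real cryptographic counterpart. The one-time pad is provably unbreakable: with a truly random key as long as the message, the ciphertext reveals nothing about the plaintext, so there is no flaw for anyone — FBI, manufacturer, or black hat — to exploit. A minimal sketch (illustrative only; one-time pads are impractical for phones because the key must be as long as the data and never reused):

```python
import secrets

def otp_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Generate a random key the same length as the message and XOR them.
    # Without the key, every plaintext of this length is equally likely.
    key = secrets.token_bytes(len(plaintext))
    ciphertext = bytes(p ^ k for p, k in zip(plaintext, key))
    return ciphertext, key

def otp_decrypt(ciphertext: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so decryption is the same operation.
    return bytes(c ^ k for c, k in zip(ciphertext, key))

ct, key = otp_encrypt(b"attack at dawn")
assert otp_decrypt(ct, key) == b"attack at dawn"
```

Practical systems use ciphers like AES instead, which trade that perfect secrecy for short keys — but when implemented correctly, they are still far beyond any feasible brute-force attack, which is why the orders target the software around the encryption rather than the encryption itself.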
D58826
@I’mNotSureWhoIWantToBeYet: This is why I think this could be a good test case. Work out the arguments now, when we have time and passions are relatively under control. No constitutional right is absolute (well, except for the 2nd Amendment), so it’s better to figure out the exceptions now, and make them as narrow as possible. And the burden of proving those exceptions must be on the government, not Apple.
Maybe it is just an emotional response on my part, but I just don’t think the argument that hundreds died in a terrorist attack to protect Apple’s business model is going to fly. And in that environment, the exceptions that will be written into the law will be about as wide as the Grand Canyon. They will make the Patriot Act look like a model of restraint.
Andrey
@D58826: In this case, there’s not even a suggestion of a ticking time bomb. No one is worried that this will be relevant to any ongoing terrorist plots.
I'mNotSureWhoIWantToBeYet
@Andrey: But this – arguments about encryption – is nothing new. As mentioned earlier encryption used to have to be approved for export. There were people in the government who would go over the code and decide whether it was exportable or not.
All software has bugs. Even TeX. Even if Apple’s encryption routines are perfect, there are bugs in the OS, bugs in the CPU, etc., etc. It’s impossible to prove that a computer has not been infected with a virus, also too (see Fred Cohen’s dissertation). So if Apple is arguing that their business will be destroyed if they help the FBI gain access to the information on this phone, because their phone is so perfect, well….
All of this means, IMHO, that the law is eventually going to impose its will on Apple (and other IT companies) over encryption and similar things. There will have to be a way for governments to legally gain access to information. If iPhones are walled off as sacred, unbreakable information storage devices, then warrants and legal discovery and so forth will become meaningless. The law depends on getting access to the facts and the truth, and that means access to information.
People who are worried about government over-reach should be worried about what happens to the legal system when courts cannot get access to the facts… :-(
Thanks for your input on this thread. It’s been enlightening.
My $0.02.
Cheers,
Scott.
D58826
@Andrey: That’s why I think it is a good time to hash out the issue.
Doug R
White guy problems.
Gavin
The best on this subject is, as for all things related to security, Schneier’s blog.
FBI is very much against the national interest on this one, and the judge needs to be slapped down.
This is also not a small precedent: Apple is being forced via court order to be a government contractor for an indefinite amount of time.
Steve in the ATL
@A Ghost To Most: Answer the question without my retainer check clearing first?
Ok, just this once. With the caveats that (1) I don’t practice criminal law, and (2) I haven’t been paid for this: forcing a prisoner to build the gallows from which he will be hanged would likely be considered cruel and unusual punishment and would therefore violate the Constitution. Were Scalia still alive, he would uphold that punishment as perfectly reasonable, even if the prisoner were proven innocent. But he’s now rotting in Catholic church approved Hell, so it’s cool.
Gavin
As well, there are two problems with the judge’s order.
Don’t forget that a side issue is that now the cops wouldn’t even have to be physically present to hack your phone. As well, all banking apps must be immediately eliminated once this order is approved, because the data is no longer secure.
A] The method of breaking won’t remain solely in police custody – the code will escape, inevitably, and you can’t shove that genie back in the bottle.
B] Now, I will download a tool that will:
Keep a password-protected copy of my device data on another device of my choosing.
Detect via checksum if the firmware has been updated [e.g., if the FBI tried to hack it] and, if there’s no match, wipe. Of course I still want to perform firmware upgrades that I approve. But if not, no dice, FBI.
Side funny: Now, all criminals know what type of password to use to avoid getting caught in this lifetime…
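The checksum check in B] above is easy to sketch. This is a toy illustration, not a real tool (the function names and the example firmware blob are made up): fingerprint the approved firmware image with a cryptographic hash, then compare the installed image against that baseline before unlocking:

```python
import hashlib

def firmware_digest(image: bytes) -> str:
    # Fingerprint a firmware image with SHA-256; any change to the
    # image, however small, produces a completely different digest.
    return hashlib.sha256(image).hexdigest()

def firmware_ok(image: bytes, approved_digest: str) -> bool:
    # True only if the installed firmware matches the user-approved
    # baseline; a mismatch would trigger the wipe described above.
    return firmware_digest(image) == approved_digest

original = b"example firmware blob"           # hypothetical approved image
baseline = firmware_digest(original)
assert firmware_ok(original, baseline)
assert not firmware_ok(original + b" patched", baseline)
```

The hard part in practice is where this check runs: if it lives in the firmware itself, whoever replaces the firmware also replaces the check, which is why real designs anchor verification in hardware (boot ROM and signature checks) rather than in software the attacker can swap out.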
sylvainsylvain
@alhutch:
Ok, thx.
So Apple’s headed this issue off at the pass, going forward.
EthylEster
I think Kevin Drum has a reasonable take on this.
But I had to ponder it a while so…never mind.
randy khan
@D58826:
I won’t play the ticking time bomb game. Sorry.
I would note that because of what the FBI wants, it’s going to take a while to create the new version of iOS (and test it – you don’t want to brick the phone accidentally), so the ticking time bomb scenario wouldn’t really apply in any case.
dantanna
@Punchy: Jesus, apparently nobody on this thread gave your masterful punfest even so much as a glance.
I guess I did the opposite, I read yours and skimmed and/or skipped over the rest. I think I’m the better for it too.
WJS
@Mingobat f/k/a Karen in GA: It really doesn’t matter–Apple routinely unlocks phones for the Feds. This time, they’re putting up a fuss because it makes good marketing sense. Your confidential material is not protected because they just don’t care.
randy khan
@I’mNotSureWhoIWantToBeYet:
That was information in Microsoft’s possession; in fact, information generated by Microsoft about Microsoft (not to mention that it is a standard condition in discovery that information has to be produced in a usable form). Here, Apple is not being asked to provide information at all (since it doesn’t have the information in the first place), but to break into a phone it doesn’t even own. Put differently, it’s one thing to be asked to get files out of your own file cabinet; it’s something else entirely to be asked to open up someone else’s file cabinet (or safe; pick your metaphor).
randy khan
@WJS:
Examples? Or just an assertion?
randy khan
@WJS:
So I did my own research on this. The answer is that Apple can unlock phones with iOS 7 or earlier, but cannot unlock phones with iOS 8 or later. I would guess that any example you would find would involve a phone or iPad with an earlier version.
Peter
I find it very telling that every single person arguing the government’s side in this thread has not the first idea about how the technology in question works – either through their own admission or by plain observation.
AxelFoley
@Tom Q:
That’s what I was thinking. Do it, but don’t let the Feds see how you unlocked it. This shit has national security implications. Fuck your purity, Apple.
And I say this as an Apple supporter.
randy khan
@AxelFoley:
Once the software exists, most of the issues people have raised also exist. In particular, maybe the U.S. government would be delicate about it and let Apple keep the software, but I doubt the Chinese would go along with that, since they would want to use the software to go after people who Apple wouldn’t think were criminals.
D58826
I suspect that the Chinese, if they want to crack the code, will commit the resources to do so with or without Apple’s help.
Forget the ticking time bomb: the FBI, by plain old pencil-and-paper methods, uncovers a possible plot to do something at some time in the future. Even in 2001, without smart phones, they had enough information by May to start warning POTUS of an impending attack sometime in the future. They follow all of the leads and arrest someone. All of the information they have from all the traditional sources leads them to the smart phone as the key to connecting the dots. Are we supposed to just stand there, do nothing, and hope for the best?
I guess we could use an old-fashioned, non-digital approach and torture the guy, but that doesn’t seem like a good approach either. It is that feeling of powerlessness that I find so frustrating about this.
Paul in KY
@Gin & Tonic: Cut thru the back of the safe.
Grumpy Code Monkey
@AxelFoley:
Here’s the meat of Judge Pym’s order:
So, problem #1: Apple will have to patch the operating system to allow brute-forcing of passcodes; this isn’t a feature that can be selectively disabled, it requires a one-off version of the iOS code.
Problem #2: The order directs Apple to make this change to allow the FBI to perform the brute-force attack, meaning that the FBI will retain custody of the phone with the patched OS on it. Apple will most likely not be able to take custody of the phone after patching the OS.
Problem #3: The FBI now has a copy of that patched OS which they can apply to other phones (with a warrant, without, who knows).
Problem #4: This sets a precedent that the government can compel anyone who writes any kind of encryption or security software to provide a backdoor or other defeat mechanism, which will render that security useless in the long term. If the backdoor exists at all, it can be exploited by any black hat in the world.
If the government has its way, everyone’s private data, not just the terrorists’, is available for them to look at. It’s a 4th Amendment nightmare waiting to happen.
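The arithmetic behind problem #1 is worth spelling out. Back-of-the-envelope sketch only — the ~80 ms-per-attempt figure is a commonly reported approximation of the hardware-enforced cost of each passcode try, not an official number. Once the 10-try wipe and the escalating delays are patched out, the only thing protecting a short numeric passcode is that per-attempt cost, and the keyspace is tiny:

```python
def keyspace(digits: int) -> int:
    # Number of possible numeric passcodes of the given length.
    return 10 ** digits

# Assumed hardware-enforced cost per passcode attempt (approximate).
SECONDS_PER_ATTEMPT = 0.08

def worst_case_minutes(digits: int) -> float:
    # Time to exhaust the whole keyspace at that rate.
    return keyspace(digits) * SECONDS_PER_ATTEMPT / 60

assert keyspace(4) == 10_000
assert worst_case_minutes(4) < 15   # a 4-digit passcode falls in minutes
assert worst_case_minutes(6) < 24 * 60  # even 6 digits falls within a day
```

That is why the retry limit and delays matter so much, and why a long alphanumeric passphrase (a vastly larger keyspace) is the only passcode that survives this kind of attack — the “side funny” upthread, in other words.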