Can the government force you to write code?

Started by February 19, 2016 10:15 PM
105 comments, last by frob 8 years, 5 months ago


In that case, you have to congratulate them because they did a really, really good job.

Yes, they did.

If you take the order in isolation, ignoring the other legal questions, the technical side of the order is straightforward and minimal.

In that respect it is well crafted.

But that overlooks the matters of law. The order may be technically feasible, but it appears to violate the First and Fourth Amendments, and it also establishes a completely novel theory about compelling someone to assist in inventing something new. As a matter of law, the order is probably unlawful. Technically possible to carry out, but unlawful.
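For context, the technical ask in the order was narrow: custom firmware that removes the retry delays and the auto-erase-after-ten-failures behaviour, so a short PIN can simply be brute-forced. Here's a minimal sketch of that logic; all constants and names are illustrative, not Apple's actual parameters.

```python
# Toy model of escalating-delay / auto-erase passcode protection.
# The thresholds below are made-up illustration values.
MAX_ATTEMPTS = 10                                    # auto-erase threshold
DELAYS = {5: 60, 6: 300, 7: 900, 8: 3600, 9: 3600}   # seconds before retry

class PasscodeGuard:
    def __init__(self, passcode, erase_on_limit=True):
        self._passcode = passcode
        self._failures = 0
        self.erased = False
        # The order effectively asked for firmware with this flag off
        # and the delays removed.
        self.erase_on_limit = erase_on_limit

    def try_unlock(self, guess):
        """Returns (unlocked, delay_in_seconds_before_next_attempt)."""
        if self.erased:
            raise RuntimeError("device wiped: key material destroyed")
        if guess == self._passcode:
            self._failures = 0
            return True, 0
        self._failures += 1
        if self.erase_on_limit and self._failures >= MAX_ATTEMPTS:
            self.erased = True
        return False, DELAYS.get(self._failures, 0)

# With protections disabled, all 10,000 four-digit PINs can be iterated.
guard = PasscodeGuard("7391", erase_on_limit=False)
unlocked = False
for pin in range(10000):
    ok, _ = guard.try_unlock(f"{pin:04d}")
    if ok:
        unlocked = True
        break
```

With `erase_on_limit=True`, the same loop would raise after ten failures, which is why removing that behaviour is the whole ballgame.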

It seems a 3rd party has helped the FBI unlock the San Bernardino iPhone. So the FBI may be dropping the case against Apple to force a backdoor.

It seems Snowden was right after all.

Beginner in Game Development?  Read here. And read here.

 


It seems a 3rd party has helped the FBI unlock the San Bernardino iPhone. So the FBI may be dropping the case against Apple to force a backdoor.

It seems Snowden was right after all.

I saw that. I wonder, though, if it will end up being viable. Aren't they still testing the method?

No one expects the Spanish Inquisition!

It could be propaganda. Or what do you call it, a double bluff? And unfortunately, we have no way of knowing.

You know, the whole thing is pretty pointless for the FBI; any tracks they might follow are cold by now. There seems to be some strong resistance to the whole idea, too. So... altogether there's not much to win but a lot to lose.

They might as well give up. Except, if they do, they will lose face.

Except... if they say "Never mind, we got a little help from our friends whom you hear breathing on your phone, and we figured it out anyway. It was no biggie". That way, they not only don't lose face, but they may also very successfully spread FUD. Saying "the government can relatively trivially break state-of-the-art encryption" provides a lot of fear, uncertainty, and doubt.

Why is this important? Fear and uncertainty play a big, big role. It's pretty much the same principle as with a polygraph test. A polygraph cannot detect lies, but it's enough if you have some fear and uncertainty about it. It's enough for the interviewer to shout at you afterwards, telling you that you failed the test. The machine very clearly showed that you are guilty, and you are going to serve 25+ years. Unless you confess, in which case maybe 10 years can be arranged.

Now imagine that for whatever reason you are suspected (or maybe even guilty) of a crime. The police or the FBI or whoever seizes your phone, and it's encrypted. What do they tell you? They'll say: "We can decrypt it, no problem. You know we can; we did it before. It's just that we would like to avoid the expense, which is about half a million dollars, so we would prefer that you cooperate. Of course, if you refuse to cooperate and we find the slightest thing, you will be liable for the cost. You know that we will decrypt it anyway, you know you can't hide anything, you don't have any way out."

The strong resistance to this backdoor that we see now might cease to exist in the future, too. If the FBI can decrypt the data anyway (FUD), why make such a big fuss about it? Come on, don't be an asshole, just give them their backdoor, save taxpayers' money.

Can't tell which one it is, since nobody but the FBI knows what is on that phone, and they aren't telling you what they find anyway, so nobody can verify. Might be they really broke the encryption, might be they found another way, or maybe the PIN was 1234. Or maybe... just FUD.

What's alarming is that, even keeping the possibility of FUD in mind, this fosters the impression that nothing really exists that is even reasonably safe. Perfectly safe doesn't exist, everybody knows that... but one would hope that "reasonably safe" exists. It doesn't seem so.

Makes me think of this story of stolen passports in Greece a week ago. The guy whom the TV team interviewed (one has to wonder why he and his friends aren't in prison; seeing how a reporter found them easily enough, they shouldn't be too hard for the police to find...) said something like "Yeah, we only make like 80 of these per day. It is a bit tedious because we also need to reprogram the biometric RFID chips. You know, the photo on the chip and everything has to be the same as the photo in the passport, or the border official will notice".

Excuse me? Reprogram the what? It's bad enough that there is no apparent security preventing anyone from reading the data. It's only super sensitive personal data after all, heh.

But... they are reprogrammable? Just like this? With a laptop in a backyard shop? You have got to be kidding me. I mean, if forgery is the one thing you want to prevent, what's the one obvious thing you shouldn't do? Place the information on a medium that supports being overwritten. Duh.
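To be fair, a writable medium need not be fatal if the data on it is cryptographically signed by the issuing authority, which is roughly what ICAO's passive authentication scheme does for e-passport chips: a forger can rewrite the data but cannot produce a valid signature over it. A toy sketch of the idea, using an HMAC from the standard library as a stand-in for the real asymmetric signature (the key, field names, and values are all made up):

```python
import hashlib
import hmac

ISSUER_KEY = b"issuing-authority-secret"  # stand-in for the issuer's signing key

def sign_chip(data: bytes) -> bytes:
    # Issuer computes a MAC over the chip contents at personalization time.
    return hmac.new(ISSUER_KEY, data, hashlib.sha256).digest()

def verify_chip(data: bytes, sig: bytes) -> bool:
    # Border-control reader recomputes and compares in constant time.
    return hmac.compare_digest(sign_chip(data), sig)

# Issuer writes the chip and its signature.
chip_data = b"name=Jane Doe;photo_hash=abc123"
chip_sig = sign_chip(chip_data)

# A forger rewrites the (writable) chip, but the old signature no
# longer matches and a new one cannot be forged without the key.
forged = b"name=John Smith;photo_hash=def456"
genuine_ok = verify_chip(chip_data, chip_sig)   # True
forged_ok = verify_chip(forged, chip_sig)       # False
```

So the rewritability itself isn't the scandal; the scandal is readers or officials that don't actually verify the signature.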


Why is this important? Fear and uncertainty play a big, big role. It's pretty much the same principle as with a polygraph test. A polygraph cannot detect lies, but it's enough if you have some fear and uncertainty about it. It's enough for the interviewer to shout at you afterwards, telling you that you failed the test. The machine very clearly showed that you are guilty, and you are going to serve 25+ years. Unless you confess, in which case maybe 10 years can be arranged.

Incidentally, this is the problem with metadata. If somebody really insists on getting your name dirty, they'll just look up whether you could have talked to somebody considered dangerous and claim you're involved too, regardless of what you even talked about (for all they know, you were just insulting them! :v). But if framed properly, a judge who sees the metadata without seeing the data could consider it valid proof and get you in trouble.

Partial proof is much worse than full proof because it can be easily manipulated to mean something else.
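To make the metadata point concrete, here's a toy sketch of guilt-by-association from call records alone; the names and the watchlist are invented for illustration:

```python
from collections import defaultdict

# Hypothetical call-metadata records: (caller, callee) pairs only.
# No content is recorded -- this is all metadata alone can show.
calls = [("alice", "bob"), ("bob", "mallory"), ("carol", "dave")]
watchlist = {"mallory"}

# Build an undirected contact graph from the pairs.
contacts = defaultdict(set)
for a, b in calls:
    contacts[a].add(b)
    contacts[b].add(a)

# Flag anyone within one hop of a watchlisted number, regardless of
# what was actually said during the call.
flagged = {person for person, peers in contacts.items() if peers & watchlist}
# bob gets flagged merely for having talked to mallory; extend the same
# logic one more hop and alice gets swept in too.
```

The point is that the flag carries no information about intent; the graph edge alone becomes the "evidence".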

Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

Well, the NSA does have direct access to Apple's servers... so they could easily have just gotten their hands on the OS source code and signing keys, given it to their own devs, made the update, and then laundered it out to the FBI.

I don't even need to put on my tinfoil hat for that to be plausible.



I don't even need to put on my tinfoil hat for that to be plausible.

I feel like that requires quite a bit of tinfoil-hatting, myself.

I'd find it a little surprising that the NSA itself stores data in AWS, if they already knew of a backdoor into Apple's data stored there.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

I feel like that requires quite a bit of tinfoil hat'ing, myself.

Thanks to Snowden, we know:
- That "the number one source of raw intelligence used for NSA analytic reports" is PRISM, which collects data "directly from the servers of these U.S. Service Providers: Microsoft, Yahoo, Google, Facebook, PalTalk, AOL, Skype, YouTube, Apple."
- That these companies are complicit in this NSA program: Yahoo was threatened with secret, gagged fines of $250k per day if they didn't secretly cooperate with PRISM, and eventually they caved in to this pressure.
- That the NSA actively creates backdoors and hacking tools for every situation they could ever need.
- That the NSA launders illegally-collected material to other agencies such as the FBI, making the inadmissible admissible.
- And that the NSA steals embedded-device encryption keys directly from the manufacturers, defeating physical-device-based cryptography.

The surveillance state is not a secret any more, it's a fact.

Putting all that together, it's not insane to imagine they could access Apple's source code / user data, give it to their internal hack-makers, and launder the results to the FBI -- especially as this is supposedly a "national security" matter.

i.e. means, motive, and opportunity exist.

It could well be a fantasy, but it's also entirely plausible given the status quo.

And when Apple makes a big show of standing up to the FBI in this particular case, it's just a farce, because we know they've already bent over to the NSA.


And when Apple makes a big show of standing up to the FBI in this particular case, it's just a farce, because we know they've already bent over to the NSA.

You're making a lot of jumps in there, and (I assume intentionally) conflating the NSA's technical capabilities with Apple's compliance with their legal demands. PRISM isn't some backdoor exploit; it's tech companies intentionally (more or less) sending customer data across to government servers in compliance with (secret) legal pressure.

I don't disagree that the NSA could force Apple to disclose these things, in much the same way the FBI attempted to force them to hack the iPhone. But the very fact that they did force Apple to be complicit in PRISM suggests that they don't have the necessary tools to extract that data without Apple's cooperation.

I also seriously doubt that the production signing keys for iOS are just lying around in an S3 bucket. If Apple's security engineers are up to snuff, those keys are likely to be stored in an air-gapped HSM, locked inside a physically secure facility...
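The point of an HSM is exactly that narrow interface: callers can request signatures but can never read the key out. A toy model of the design in Python (name mangling is obviously not real tamper resistance, and an HMAC stands in for the asymmetric signing a real HSM would perform; the key and message are made up):

```python
import hashlib
import hmac

class ToyHSM:
    """Toy model of a hardware security module: the key never leaves the
    object, and the only operation exposed is signing. A real HSM
    enforces this boundary in tamper-resistant hardware, not in Python."""

    def __init__(self, key: bytes):
        self.__key = key  # no getter is ever exposed

    def sign(self, message: bytes) -> bytes:
        # Callers get a signature over their message, nothing more.
        return hmac.new(self.__key, message, hashlib.sha256).digest()

hsm = ToyHSM(b"production-signing-key")
firmware = b"ios-update-blob"
sig = hsm.sign(firmware)
# An attacker with access to the signing API could obtain signatures on
# chosen messages (itself a serious risk), but the interface gives them
# no path to exfiltrate the key bytes.
```

Which is why, even granting the NSA everything else, "they just grabbed the signing keys off a server" is the least plausible link in the chain.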

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

You're making a lot of jumps in there, and (I assume intentionally) conflating the NSA's technical capabilities with Apple's compliance with their legal demands.

Yep. Broad strokes illustrating the current environment :tongue:

But the very fact that they did force Apple to be complicit in PRISM, suggests that they don't have the necessary tools to extract that data without Apple's cooperation.

Not necessarily.
The FBI, which is on the legal/domestic side of the coin, wants the ability to defeat strong encryption. One of the important points in this case is that they're attempting to hammer in the thin edge of a wedge that would massively weaken people's ability to use encryption against domestic law enforcement in general. Yes, this suggests that the FBI (probably) can't crack the phone, and a side effect of succeeding here would be the ability to defeat a lot of domestic encryption in general. It's still possible that they can crack it, but their real objective is the thin edge of the wedge :wink:

The NSA is able to do things that would be illegal for the FBI to do. Perhaps they already can decrypt the phone, while the FBI can't. They've got access to all of the unencrypted communication that the phone has ever had with Apple HQ. If they've previously been able to record the unencrypted data flow between the device and Apple, that puts them in a very good position to attack it. Plus, they're in a position where they could demand the OS source from Apple without anyone knowing... and do have the staff capable of constructing an attack against it. It's also possible that once their attack is constructed, they could leak it to a 3rd party, so that the FBI can use it without being tainted by the NSA-connection.

I'm not saying that this is what has happened. Just that it's plausible that the NSA could do this, and do it secretly. You've got to put on tinfoil blinders to pretend they're not capable :laugh:

This topic is closed to new replies.
