
Can the government force you to write code?

Started by February 19, 2016 10:15 PM
105 comments, last by frob 8 years, 5 months ago


If it wiped the key, that's a different story, but I wonder how high the trial-and-error limit is. If it were really low, we'd have complaints from people whose data was lost.

This happens all the time - if you hook your iPhone up to a corporate Exchange server, it'll generally be set to wipe after 10 incorrect PIN codes.

One of my coworkers lost the contents of their phone multiple times in the space of a week when their young kid got ahold of the phone and entered random PIN codes. I've also had drunk friends at the bar wipe various phones this way.
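The wipe-after-N-failures policy described above can be sketched roughly like this toy Python model (`ATTEMPT_LIMIT`, `SecurePhone`, and the rest are illustrative names, not anything from Apple's or Microsoft's actual implementations):

```python
ATTEMPT_LIMIT = 10  # typical Exchange policy: wipe after 10 bad PINs

class SecurePhone:
    """Toy model of a device that wipes itself after too many bad PINs."""

    def __init__(self, pin):
        self._pin = pin
        self._failures = 0
        self._wiped = False

    def unlock(self, attempt):
        if self._wiped:
            raise RuntimeError("device wiped: no data left to unlock")
        if attempt == self._pin:
            self._failures = 0  # a correct PIN resets the counter
            return True
        self._failures += 1
        if self._failures >= ATTEMPT_LIMIT:
            self._wiped = True  # in reality: erase the encryption key
        return False
```

A kid mashing random PINs hits the limit in seconds, which is exactly the anecdote above.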

I wonder what it looks like when you don't use a code (like me): is the storage just encrypted with a null key of all zeros, does the key size change, or is it not encrypted at all? I don't care about my phone's security at all, but it makes me curious.


or is it not encrypted at all?

Not encrypted at all. Look under the security settings, and it should say something like "data protection disabled"

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]



or is it not encrypted at all?

Not encrypted at all. Look under the security settings, and it should say something like "data protection disabled"

Are you sure about that? I'm inclined to doubt it. That would require writing (and testing, and maintaining) two separate code paths, one that encrypts all data and one that does not, and a system where core components behave differently for different users.

Much more likely, in my opinion, they take the same approach as SSD manufacturers: always use encryption, but if no code is set, fall back to a default encryption key and show "disabled" in the preferences. SSDs encrypt all data, always, not primarily for security but to keep the bits distributed more evenly (which is arguably better for the memory cells, or so they say). The fact that you can market the drive as "uses AES encryption", and sell the exact same drive as an "enterprise" version with OPAL-whatever-it's-called-super-security-layer-X for twice the price, comes as a plus.

Simply put, instead of using a default key or a random key stored in a publicly accessible memory cell, the "enterprise" version lets users enter one at the boot prompt, or pulls it off a smartcard or TPM or whatever. Securitas ex machina.

If Apple did do anything different, I would be very much surprised. They'd have to be crazy. More work, more maintenance, more risk of something going wrong, and no benefits.
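The "always encrypt, use a default key when no passcode is set" design speculated about above could look something like this toy Python sketch. PBKDF2 is real and in the standard library; the XOR "cipher" is a stand-in for AES so the example stays self-contained, so never use it for real data, and all the names here are hypothetical:

```python
import hashlib
import itertools

DEFAULT_KEY = b"\x00" * 32    # publicly known key: shows as "disabled"
SALT = b"device-unique-salt"  # hypothetical per-device salt

def derive_key(passcode=None):
    """No passcode -> default key (looks 'disabled'); else PBKDF2-derived."""
    if passcode is None:
        return DEFAULT_KEY
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), SALT, 100_000)

def toy_encrypt(key, data):
    """Keystream XOR (stand-in for AES); the same call decrypts."""
    stream = itertools.cycle(hashlib.sha256(key).digest())
    return bytes(b ^ k for b, k in zip(data, stream))

secret = b"contacts, photos, mail"
locked = toy_encrypt(derive_key("1234"), secret)
assert toy_encrypt(derive_key("1234"), locked) == secret  # round-trips
assert toy_encrypt(derive_key(None), locked) != secret    # default key fails
```

The point is that both paths run the exact same encryption code; only the key source differs, which is what keeps the maintenance burden at one code path.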

Looking at the Apple docs, you are correct that they have two separate levels of encryption: one with random keys that allows remote wipe on unsecured devices, and one with passcode-derived keys to actually secure data.
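Those two levels could be illustrated with a toy model (hypothetical names, not Apple's implementation): erasing the random device key is what makes a remote wipe instant, while the passcode-derived key is what actually gates access.

```python
import os
import hashlib

class Volume:
    """Toy model: random device key for wipe, optional passcode gate."""

    def __init__(self, passcode=None):
        # Level 1: random key -- "wiping" means forgetting this key.
        self.device_key = os.urandom(32)
        # Level 2: optionally gate access behind a passcode-derived key.
        self.passcode_hash = (
            hashlib.pbkdf2_hmac("sha256", passcode.encode(), b"salt", 100_000)
            if passcode else None
        )

    def remote_wipe(self):
        self.device_key = None  # key gone => ciphertext is random noise

    def can_read(self, passcode=None):
        if self.device_key is None:
            return False  # wiped: nothing recoverable
        if self.passcode_hash is None:
            return True   # unsecured device: random key alone protects it
        guess = (
            hashlib.pbkdf2_hmac("sha256", passcode.encode(), b"salt", 100_000)
            if passcode else None
        )
        return guess == self.passcode_hash
```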


If Apple did do anything different, I would be very much surprised. They'd have to be crazy. More work, more maintenance, more risk of something going wrong, and no benefits.

You'd be surprised by pre-Lollipop Android then, which doesn't encrypt *at all* without a passcode (and doesn't encrypt by default even with a passcode). And yes, the fact that it worked that way added a lot of work, and a lot of bugs :)

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

I guess it's too much work to ask Apple to give them the phone data.

Everything is better with Metal.

Isn't the point here that Apple can't in the first place? (hence the demand for a tool to allow brute-forcing)

Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

Well, they want the tool more than the phone content.

Everything is better with Metal.

Obzen, perhaps you missed key parts of the discussion.

The FBI had a way in. If they had followed proper forensics techniques they could have used the existing iCloud connection. As far as forensics go, they messed up badly. Like "my investigator accidentally spilled 10 other people's blood all over the crime scene then burnt it down" badly. They started modifying account information (you don't modify anything under investigation, ever!) and it broke the sync.

What they are demanding is a way to enable brute-force attacks, akin to "we want a rock that will help us break glass, but we'll only use it on one specific piece of glass, promise!" The promise has already been shown to be false, as other police groups can demand it when it is created. New York City alone has stated they have about 200 phones they will use it on if the software gets created, which can be subpoenaed the moment it is created.
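To see why removing the retry limit matters so much, here's a toy Python illustration: a 4-digit passcode has only 10,000 candidates, so once the wipe-after-10 policy is gone it falls to brute force almost instantly (the hash check here is hypothetical, not how iOS actually stores passcodes):

```python
import hashlib

def pin_hash(pin):
    """Hypothetical stand-in for the device's passcode check."""
    return hashlib.sha256(pin.encode()).hexdigest()

target = pin_hash("7391")  # pretend this is what's stored on the device

def brute_force(target_hash):
    """Try every 4-digit PIN; trivial once the retry limit is removed."""
    for n in range(10_000):
        guess = f"{n:04d}"
        if pin_hash(guess) == target_hash:
            return guess
    return None

print(brute_force(target))  # prints 7391
```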

I wonder what would happen if they brought the iPhone in house, created the software, applied it to the one phone, and then destroyed said software.

It is not, in and of itself, evidence, so I'm not aware of any way that destroying it after the fact could be considered illegal, and it puts the world right back in the 'no tool exists' state it is currently in.

Old Username: Talroth
If your signature on a web forum takes up more space than your average post, then you are doing things wrong.

Read the text of the court order. That is not an option.

This topic is closed to new replies.
