Can the government force you to write code?

Started by February 19, 2016 10:15 PM
105 comments, last by frob 8 years, 5 months ago

Possibly relevant:

https://www.techdirt.com/articles/20160223/07015733683/list-12-other-cases-where-doj-has-demanded-apple-help-it-hack-into-iphones.shtm

Whether or not they could benefit from the same hack is not known, but it does show the potential for it ultimately being used in bulk.

Also, just to make sure, it seems:

http://www.nytimes.com/2016/02/25/technology/apple-is-said-to-be-working-on-an-iphone-even-it-cant-hack.html?smid=fb-nytimes&smtyp=cur&_r=0

Don't pay much attention to "the hedgehog" in my nick, it's just because "Sik" was already taken =/ By the way, Sik is pronounced like seek, not like sick.

Apple may care about their customers' "privacy" only insofar as it affects Apple's bottom line. But make no mistake, Apple is delighted to pry into every aspect of your phone themselves, if it makes them more money.

If your bottom line depends so much on loyal customers ready to pay premium prices, there are a lot of things you don't do.

You don't use non-premium materials in user accessible areas.

You don't sell the stuff in shoddy stores.

And you certainly make sure people trust your company.

For Apple, their image is everything. Has anyone done a scientific test showing that Apple devices outlast devices from other manufacturers?

I think at least techies know that Apple devices often are NOT faster than devices from other manufacturers. Their software might be more optimized (console effect), but apart from that, the components used are often not cutting edge: high midrange to low high-end at best.

They might still be market leaders when it comes to design, but competitors have caught up a lot (Microsoft especially), and design is subjective.

They tend to brute-force clever gadgets to market, finding uses for things others have left to rot in their R&D. But that is a dangerous game: people grow tired of gadget fads. (I am sure "force touch" is cool and all, but after having a fingerprint reader on my Android device that actually works quite well and is useful, I have to say... I don't use my phone nearly often enough to be hyped about gadgets like that.)

They might still have the edge on intuitive software... not a fan myself, but I can see that point. Still...

Apple is a brand more than anything else. They sell good-looking machines with clever software and some clever gadgets built in. That alone would most probably not be enough to keep so many people in their walled garden for so long.

Apple has an image to lose. Without that image, it will lose A LOT of business. Besides the walled-garden problem and the cost of switching a whole IT department from Apple to MS, there are always alternatives in all areas of their business. Apple cannot compete on price, and in some cases not on functionality either (usability, maybe...).

If past experience tells us one thing, it's this: yes, Apple is just as evil as any other big corporation. But as soon as their evil ways become public, they try to change gear. And they always seemed to follow through, at least until the general public forgot about it.

And the general public still hasn't forgotten about the whole NSA stuff, Snowden, and similar sh*t. So Apple, being good at marketing as they prove time and time again, will continue striving for more privacy until either they cannot improve it further (the unhackable phone; let the whining of government agencies worldwide begin) or the general public loses interest.

I've split off the discussions of Donald Trump from this discussion, it is now over here.

Please keep this discussion on topic to the Apple/FBI situation as originally asked.

What part of the iPhone has encryption?

Are we talking about the password, or when in a call, or something like that?

What part of the iPhone has encryption?
Are we talking about the password, or when in a call, or something like that?

All the user data is encrypted. It is encrypted using a cryptographically secure key and a cryptographically secure algorithm. The key is generated from the user's PIN, the phone's hardware ID, and a cryptographically secure unique random number embedded in a layer of silicon as part of the Technical Protection Measures (TPM) built into the CPU.

To brute-force the cryptographically secure key used to encrypt the user data would take, on average, several thousand lifetimes of the universe (hence the adjectives). Of course, you could get lucky and find the right key in only dozens of expected lifetimes of the universe.

The hardware ID and the embedded random number are fixed and unique to the phone in question, so to crack the key you only need to brute-force the 4(?)-digit PIN on the device itself, which should take mere seconds of compute time. Unfortunately, there are additional TPMs that prevent such simple brute-forcing, since the PIN is the weak link: a 5-second delay between each unsuccessful PIN attempt, automatic wiping of the user data after a certain number of unsuccessful attempts, and no way to enter PINs except by pressing the phone's touchscreen manually.

The court order in question compels Apple to produce a version of its operating system that removes those additional TPMs, reducing the attack to simply brute-forcing a 4-digit PIN. The problem is that once such an OS image exists, it can be cracked, propagated, and installed on any iPhone, and everyone's data is open to abuse by any agent with ulterior motives.

Some people are confident that signed images will prevent distribution of the cracked OS. After all, it's worked spectacularly well for copy protection in games.

It's possible to use a microscope and start slicing through the TPM chip to try to find the random number used to generate the key, but at a very high risk of permanently destroying it. The silicon was designed with that attack vector in mind.
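The entanglement of PIN, hardware ID, and embedded random number described above can be sketched in Python. This is a rough illustration of the idea, not Apple's actual scheme; the function name, the use of PBKDF2, and the iteration count are all invented for the example:

```python
import hashlib

def derive_key(pin: str, hardware_id: bytes, device_secret: bytes) -> bytes:
    """Mix the user's PIN with per-device values so the resulting
    key can only be computed on this physical device."""
    material = pin.encode() + device_secret
    # A deliberately slow KDF makes every guess costly,
    # even when run on the device itself.
    return hashlib.pbkdf2_hmac("sha256", material, hardware_id, 100_000)

# The same PIN on two different devices yields two different keys,
# so an attacker cannot precompute keys without the device secrets.
key_a = derive_key("1234", b"hardware-id-A", b"\x01" * 32)
key_b = derive_key("1234", b"hardware-id-B", b"\x02" * 32)
assert key_a != key_b
```

The point is that the short, guessable PIN is never used as the key directly; it is stretched together with secrets that never leave the silicon.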

Stephen M. Webb
Professional Free Software Developer


So this is only useful for remote data then, not phones they actually got their hands on? If there are only 4 digits to brute-force, the delay is only 5 seconds, and the one enforcing the delay is the TPM chip, it seems like all you need is something to plug in instead of the screen, sending finger inputs every 5 seconds? With 10,000 combinations that's 10,000 × 5 seconds at most, so 50,000 seconds, or 833 minutes, or about 14 hours. Sounds plenty fast enough for any court case?
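The arithmetic above checks out; as a quick sanity check, reproducing the numbers from the post in plain Python:

```python
combinations = 10 ** 4   # 4-digit PIN: 0000 through 9999
delay = 5                # seconds enforced between attempts

worst_case_seconds = combinations * delay
print(worst_case_seconds)                    # 50000 seconds
print(worst_case_seconds // 60)              # 833 minutes
print(round(worst_case_seconds / 3600, 1))   # 13.9 hours
```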


You missed the bit where the phone wipes itself after a number of incorrect attempts.

Tristam MacDonald. Ex-BigTech Software Engineer. Future farmer. [https://trist.am]

Can't they disassemble the physical hardware, and then clone the flash memory (or whatever memory device it has), and then crack a clone of it without fear of self-erasure?

Can't they disassemble the physical hardware, and then clone the flash memory (or whatever memory device it has), and then crack a clone of it without fear of self-erasure?

I don't know how the iPhone does it, but in general cryptographic material is stored in an HSM (hardware security module) that is specifically designed not only to prevent people from just cloning its contents, but also to actively destroy itself if it detects any attempt to mechanically break into it (e.g. by releasing a dissolving chemical if you try to disassemble it, or something like that, but more elaborate). That makes it quite hard to clone, although not impossible for a sufficiently resourceful agency, I guess.

And if done right, without knowing the HSM's contents you have no chance of breaking the device's encryption, because the encryption keys are derived from the user's password based on the contents of the HSM (which could be anything), so you can't brute-force the password. It doesn't really matter that you can safely clone the rest of the data: that data is worth zero bits without the encryption key to decrypt it. In other words, unless you can learn the HSM's contents, all attempts to enter a PIN must go through it.
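To put numbers on why the HSM's contents matter: with the device cooperating, an attacker only has to guess the PIN; without the device, they must also guess the embedded secret. Assuming, purely for illustration, a 4-digit PIN and a 256-bit secret:

```python
pin_space = 10 ** 4      # guesses needed via the device (worst case)
secret_bits = 256        # size of the assumed embedded secret

# Offline, every candidate PIN must be tried against every
# candidate secret, so the search spaces multiply.
offline_space = pin_space * 2 ** secret_bits

print(f"{pin_space:,}")           # 10,000 guesses: trivial
print(f"{offline_space:.3e}")     # ~1.158e+81 guesses: hopeless
```

That factor of 2^256 is the entire difference between a 14-hour attack and one that outlasts the universe.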

To be honest, though, I find it quite unlikely that this phone's security could really stand up to a government-funded agency with physical access to it. That is pretty much the hardest possible threat model, and I doubt a commodity phone could survive it. This is quite clearly a political battle to claim precedent for future cases; it is, after all, much more convenient for security agencies to be able to brute-force random people's passwords than to surgically extract encryption keys from the phone's security module.

“If I understand the standard right it is legal and safe to do this but the resulting value could be anything.”

clearly a political thing
Even more so as the whole thing is pretty pointless. There is nothing new to be learned from that phone:
  1. The people who were killed are already dead, as are the terrorists. It is unlikely that they will be able to shoot more people from their graves.
  2. This happened 3 months ago. Whichever accomplice that might have planned a subsequent attack the next day would have done it by now. That didn't happen.
  3. Whoever might be behind the attack (the "commanding guy", if you will) has either left the country, lives under a different name in a different city, or maybe never even set foot in the country. That kind of person has a dozen fake identities, and there's no reasonable chance of finding him anyway. If there is any doubt about that, look how many identities were found for Tarek Belgacem after he was shot.
  4. All phone calls and/or SMS including metadata of innocent people are eavesdropped, processed with voice recognition, and recorded anyway (and at least the metadata is stored indefinitely). This means, of course, that phone calls and/or SMS made by terrorists are also eavesdropped and recorded. So it is absolutely clear who that terrorist couple talked with and when, and what they talked about.
  5. If you just ask Google, they can even tell you the exact location of the phone over the last half year, thanks to their tracking users with fake hotspots. I wouldn't be very surprised if a government agency did the same anyway (so you likely don't even need to ask Google).

This topic is closed to new replies.
