FBI Refuses to Say Whether It Bought iPhone Unlocking Tech 'GrayKey' (vice.com)
88 points by _o_ on April 13, 2018 | 62 comments


It seems safe to assume that they did purchase it -- or that they will, in the near future (perhaps they haven't technically "purchased" it yet if, for example, the PO or payment hasn't yet completed).

I mean, $30,000 for the "unlimited" one-year license? Seriously, why wouldn't they purchase it? If local and state police agencies have bought this device, I think we can all safely assume the FBI has as well.

Hell, I wouldn't be surprised to hear that the FBI recommended the purchase of this device to local and/or state agencies.


>>Hell, I wouldn't be surprised to hear that the FBI recommended the purchase of this device to local and/or state agencies.

Like they need any recommendations, they have cases open :). Not every case is a middle-aged guy minding his own business who gets his phone searched at JFK. A lot of crimes are thefts, drugs, sex offenses, and the evidence is on the phone; they have search warrants but can't access it. Or couldn't. There are law enforcement conventions and businesses that cater to them, so they already know. Not to mention Twitter, FB forums and so on.

I'm with Apple, trying to make this unbreakable, but if cops figure it out, and they have warrants, kudos to them. Maybe Apple already is, but they should be buying all the zero-days out there. If they don't sell for $1 million, offer $5 million. Or $15 million, and still no one at Apple would have to skip a meal to cut costs https://finance.yahoo.com/quote/AAPL/key-statistics?p=AAPL


Law enforcement using a throwaway account here. Why is this such a mystery? Of course they have it. We're all scrambling to find the 15 grand to get a license. (It's $15K for a web-based license, $30K for a standalone license.) Cold cases are being re-opened because we can now access devices we had shelved. I guess I'm just confused why it's such a big deal?


> We're all scrambling to find the 15 grand to get a license

Well, there's that part.

Apparently lots of my tax dollars are going towards purchasing a temporary license for a temporary solution to a permanent problem that isn't really a problem at all?

Though I must admit, one day those things are going to look great in some museum. They tell a really interesting story. Hopefully the price tag is included on the exhibit description.


Let's not forget that it isn't just one police agency wanting to buy these.

It's every single department, in every single city, in every single state.

That's thousands, if not tens of thousands, of departments buying these when they'll all be useless as soon as Apple patches the exploit.

Seems like a whole lot of wasted money. But who knows, they might just catch enough "big bad Marijuana dealers" to offset the cost.

That's what the LEO really means by cold cases. Nothing important. Just more drug busts.


> Nothing important. Just more drug busts.

People dying because of drug overdoses seems like a pretty big deal to me, but then again (thank God) I've never had a family member or close friend succumb to a drug addiction.


I don't think it's very charitable of you to assume that, just because people don't believe police handling of drug crimes is effective, they don't also think the opiate crisis in the US is a big deal (I'm assuming that's what you're referring to).

If you don't believe that a solution works, then further effort on a non-functioning solution isn't a big deal, even if you feel the problem is a problem.


Many drug overdoses come as a result of it being illegal. You can't be sure of the potency, and there is the incentive to use different active ingredients (higher potency, thus being easier to smuggle, but also increasing the risk of overdose).


To be fair, a lot of crime evidence is in the phones. Given that we want cops to investigate crimes and jail offenders, you can't blame them for doing everything possible--within the law. $15k is nothing; labs cost money, detective time costs money, and so on.


> Apparently lots of my tax dollars are going towards purchasing a temporary license

Depending on where you live, maybe 16 cents on the dollar is going to “defense.” And I’d imagine that $15k per license is a tiny fraction of that.

https://www.cbpp.org/research/federal-budget/policy-basics-w...


[flagged]


I would usually write you off as someone who self-identifies as law enforcement with a throwaway account (ironic, since no upstanding citizens have anything to hide!), but a real question:

What departmental protections will exist for someone like yourself for using this without a properly authorized court order? Self restraint? Professional integrity?

If you cannot tell from the tone of people here, if I may be so bold as to express a sentiment I find on behalf of others: I find your industry's zeal for not properly controlling tools within legally mandated limits disgusting, specifically because I often hear pleas for better and newer tools and never "we will strive to double down on our civic duty and better exercise our control and use of dangerous tools."

To be fair, you learn from the best: anecdata from friends tells me the defense intel industry's response post-Snowden (with asshats misusing fervently denied tools for illegal citizen monitoring, but you can blow that off too) was to increase background screening of employees, not to fix and control monitoring system access. That would be ludicrous. So I am not surprised: if they can't get it right, neither can you.


> many of our cases are from drug overdose deaths where opening the deceased person's phone may allow us to find out who delivered the lethal dose

An alternative way of looking at it is that they died from a drug overdose and you are using that as a reason to invade their privacy. Something about that doesn't sit quite right with me - I know if I had family who died in those circumstances I wouldn't want you poring through their life.


Doesn't seem like conducting forensics "over the web" is a good way to control the custody chain or prove that evidence wasn't planted.


The data from the device isn't being transmitted "over the web". The data stays secure on the device. Once the device is opened, the defense has the right to also examine the phone and determine if "evidence was planted".


Granting access to outside computers means you can't prove that outside meddling hasn't happened. I would expect a certified forensics lab to maintain a strict firewall for all electronic devices.


I guess the prospect of outside meddling will seem sufficiently unlikely that juries will not view it as much of a hindrance. After all, even with a chain of custody, there are innumerable opportunities for evidence to be planted. You just have to get one or two people who are willing to lie about things. In the US, the standard is "reasonable doubt," and the faint chance that evidence is tampered with during this unlock process will not likely trip the "reasonable" part of that standard, even if it does add some non-zero amount of doubt.


It's a black box that you don't know quite how it works, and you are required to give a third party access to it at all times. I don't know how you can state with certainty that nothing else on the phone was changed.


For me personally, the problem is with trust. I'm not confident that these devices won't encourage the agencies that have them to "go fishing" on the phones of every person they arrest just because they have these shiny new toys.

I also don't trust these agencies will be held accountable when they abuse these devices.

Sally Suburb has a warrant for unpaid tickets? Better check her phone while we have her!

I recall a certain incident, which I'm sure wasn't isolated. I believe it was somewhere in California, where police essentially treated obtaining and trading nudes from people's phones as a game.

Using the threat of force to obtain nude images of women you've pulled over sure has a rapey feel to me. Forwarding those images to your friends sounds an awful lot like revenge porn as well. Apparently the police in that area don't agree, though.

I don't believe any of those officers received any actual punishment beyond the standard "take a paid vacation while we let this blow over in the media".

I know, I know, it was just a few of those mythical and rare "bad apples".. but until the good apples who supposedly outnumber the bad start cleaning house, and stop standing up for those "bad apples" to protect the image of their "brotherhood", trust will continue to be an issue.


> I know, I know, it was just a few of those mythical and rare "bad apples".. but until the good apples who supposedly outnumber the bad start cleaning house, and stop standing up for those "bad apples" to protect the image of their "brotherhood", trust will continue to be an issue.

This is the problem with the "bad apples" phrase. It seems to me people forget the line is "a few bad apples spoil the bunch"; it's categorically NOT "a few bad apples sure are annoying as heck", as if merely acknowledging that bad apples exist is enough to signal you're doing something about the problem.

Police departments across the country seem to have a hard time getting this, as do many of their supporters, to the point that "it's a few bad apples" has become an utterly worthless phrase, IMO.


No mystery, just a morality tale: if you can f8ck someone, do it, whether it's legal or not.

The FBI wants backdoors, doesn't get backdoors - as far as we know at least - and uses illegal devices to brute-force mobile phones. There's no grey area here: either you respect people's privacy or you don't. Circumventing the essence of the law one way or another sends a strong signal, and it is the wrong kind of signal IMHO.


My guess is that Apple will find a way to secure their own copy of GrayKey using a shell company and reverse engineer the exploit. Like others have said, it's a cat and mouse game.

This seems to be a software exploit if it doesn't require opening up the iPhone. There are more sophisticated hardware techniques (one was "decapping" the chip and reading the data out so you can try passcodes elsewhere), but I believe Apple is finding mitigations for those as well.


> This seems to be a software exploit...

Yes but it certainly is exploiting a major hardware(?) flaw in the Secure Enclave Processor and/or the secure boot chain. The SEP is supposed to hold the decryption key, seeded at manufacturing time and unreadable at a hardware level, and block brute forcing with a chip-enforced time delay. And the whole system including the SEP is supposed to have a secure boot chain so only Apple can authorize new firmware.[1]

The FBI could break into the 5c because the brute forcing was only a weaker software limitation at that point. But since the A7 the SEP was supposed to handle it.

[1] https://www.apple.com/business/docs/iOS_Security_Guide.pdf


Wow, so how is it possible for them to do a hardware exploit just through the Lightning port?

From the look of this device, you just plug in the lightning cable and never open up the phone.


Based on the reported time-to-unlock it's pretty clear they're brute-forcing the passcode. The phone has an escalating lockout time after failed attempts, so they must have some way of evading that.

I've read a few theories of how that might work, from timing attacks to carefully-sequenced reboots of the phone to try to confuse the lockout mechanism.
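To see why evading the lockout matters so much, here's a toy Python model (my own sketch built from the delay schedule in the public iOS Security Guide, not anything known about GrayKey's internals) comparing exhaustive search of a 4-digit space with the delays enforced versus bypassed:

```python
# Toy model, NOT Apple's implementation: time to exhaust a 4-digit
# passcode space with and without iOS-style escalating lockout delays.
# Schedule per the iOS Security Guide: 1 min after 5 failed attempts,
# 5 min after 6, 15 min for 7-8, 1 hour for every attempt from 9 on.

ATTEMPT_COST_S = 0.08  # ~80 ms per guess, per the whitepaper

def delay_after(n_failed: int) -> int:
    """Delay in seconds imposed after the n-th failed attempt."""
    if n_failed < 5:
        return 0
    if n_failed == 5:
        return 60
    if n_failed == 6:
        return 5 * 60
    if n_failed in (7, 8):
        return 15 * 60
    return 60 * 60  # one hour from the 9th failed attempt onward

def total_time(n_guesses: int, delays_enforced: bool = True) -> float:
    """Seconds to make n_guesses attempts (delays apply between attempts)."""
    t = n_guesses * ATTEMPT_COST_S
    if delays_enforced:
        t += sum(delay_after(i) for i in range(1, n_guesses))
    return t

print(f"delays enforced: {total_time(10_000) / 86_400:,.0f} days")   # ~416 days
print(f"delays bypassed: {total_time(10_000, False):,.0f} seconds")  # ~800 seconds
```

Under these assumptions, bypassing the lockout turns a year-plus search into a few minutes, which matches the reported GrayKey unlock times far better than any honest front-door attempt would.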


That worked in the past, but Apple got hip to it and specifically notes that the SEP delay counter survives reboots. Maybe they missed something again, but it seems less likely.

My money is on a set of vulns similar to what CTS-Labs found in AMD’s secure coprocessors last month.[1][2]

[1] https://thehackernews.com/2018/03/amd-processor-vulnerabilit...

[2] https://blog.trailofbits.com/2018/03/15/amd-flaws-technical-...


Well, it's technically a software exploit. They are clearly loading custom firmware onto the phone which the secure boot chain is supposed to prevent. On top of that the SEP coprocessor has its own secure boot chain and is supposed to be even more locked down.

I guess I can't speak to how the SEP implements its security features, i.e. which pieces are physically burned into hardware or just implemented in firmware with a secure boot chain. But either way this exploit points to a major flaw in the SEP, bigger than your average OS vuln.


Most people seem to have missed the reported fact that a security researcher now working as an executive at Grayshift was previously an employee at Apple. The exploit being used in the GrayKey device may have been exported directly from within Apple's walls. Perhaps this man was involved enough with iOS security that he a) built in a backdoor; or b) discovered a vulnerability, and instead of disclosing it, left Apple to profit off the exploit; or c) gained enough knowledge of the systems to be able to reverse engineer an exploit from scratch, after leaving Apple.

The fact that the company producing the GrayKey has an executive who is a security researcher and used to work for Apple is such an obvious, damning piece of evidence. I don't understand why this connection hasn't been more widely reported and investigated.


Well, he'd better be ready to flee the USA if he used his prior knowledge, because a civil suit is guaranteed and criminal charges are very likely. Of course, once he makes tens or even hundreds of millions, finding a country where Apple doesn't have a subsidiary shouldn't be too hard.


> gained enough knowledge of the systems to be able to reverse engineer an exploit from scratch, after leaving Apple

While the other two are suspicious, I don't think the last one is. If you're good enough to come up with a clean-room implementation, then I don't see any foul play here.


Isn't that by definition _not_ a clean room? You have knowledge about the device that you wouldn't be able to derive just from looking at the device itself.


No, that's what the other two were. I'm saying there's the possibility that you got hired at Apple because you were smart, and you continue working on iOS even after you leave, but without relying on any knowledge of the internals. For example, say they rewrote some portion of code that you were familiar with: in that case, you should be free to take a look, since you no longer have any inside information on it.


What was his position at Apple?


There are some really interesting noninvasive side channel attacks out there e.g. differential power analysis which is trickier but not impossible with a battery powered device


Since the DMCA prohibits the circumvention of access controls, couldn't Apple litigate the heck out of GrayKey?


Section 1201 (the annoying part of the DMCA saying you can't circumvent copyright protection systems) states in subsection (e)...

> This section does not prohibit any lawfully authorized investigative, protective, information security, or intelligence activity of an officer, agent, or employee of the United States, a State, or a political subdivision of a State, or a person acting pursuant to a contract with the United States, a State, or a political subdivision of a State.

So it would probably go nowhere in court

https://www.law.cornell.edu/uscode/text/17/1201


You can sue anyone for anything.

On paper, sure, there's merit to it; in practice it's probably an exercise in futility while the exploit is unpatched and GrayKey is useful to law enforcement.


I would think the best use of Apple's time is reverse engineering (if the exploit is not already assumed/known) the device and releasing an update rendering it useless. If that isn't possible, perhaps litigation would be the next step. However simply making it 'not work' for LE would be the lowest hanging fruit (under many assumptions about the complexity of this exploit).


There are criminal penalties for violation of the DMCA as well, including jailtime.


According to the iOS Security whitepaper[0]:

Each device has a unique 256-bit AES key called the "UID", and a programmable "device group ID" called the "GID".

The UID is "fused" and the GID "compiled" into the Application Processor and Secure Enclave during manufacturing, but no software or firmware can access them. The firmware can only see results of encryption and decryption, and the keys are accessible only to the AES engine's silicon. They are not available via JTAG or other debugging interfaces.

On some later chips the Secure Enclave generates the UID itself.

Apart from the UID and GID, the Secure Enclave can also generate new keys using an RNG. See also: Krypton[1].

(see page 12)

Passcodes are "entangled" with the device's UID, so brute-force attempts must be done using the Secure Enclave (or with an electron microscope?).

Each attempt has an iteration count calibrated for 80ms, which would mean an average of ~11 hours to brute force a 6-digit pin[2].
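As a hedged illustration of what "entangled" likely means in practice (hypothetical construction and parameters; Apple's exact KDF isn't public at this level of detail), the passcode can be fed through a key-derivation function salted with the device's UID, so every guess must run on-device:

```python
import hashlib
import os

# Hypothetical sketch of UID "entanglement" -- NOT Apple's actual scheme.
# The idea: the passcode alone is useless off-device, because the derived
# key also depends on a 256-bit secret fused into the silicon.
DEVICE_UID = os.urandom(32)  # stand-in for the fused, unreadable AES UID

def derive_key(passcode: str, iterations: int = 100_000) -> bytes:
    # PBKDF2 keyed with the UID as salt. The iteration count is what would
    # be calibrated so each guess costs roughly 80 ms on the device.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), DEVICE_UID, iterations)

key = derive_key("123456")
assert derive_key("123456") == key  # deterministic on this device
assert derive_key("123457") != key  # wrong passcode, wrong key
```

Since the real UID never leaves the AES engine's silicon, an attacker can't extract it and run this loop on a GPU farm; the only option is asking the device itself, one calibrated-cost guess at a time.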

iOS also has longer delays for multiple attempts: 1 minute after 5 attempts, 5 minutes after 6, 15 minutes for attempts 7-8, and 1 hour for each attempt after 9. The paper later mentions that devices with the Secure Enclave will enforce the longer delays, including after reboots, but this doesn't seem to be the case for GrayKey.

(see page 15)

GrayKey claims to crack an iPhone (with 4-digit pincode?) in around ~2 hours, but more than 3 days for 6-digit pincodes. Which might work out to ~1s per guess?[3].

If you use an alphanumeric passcode, or a custom numeric code, you likely don't have to worry about these unlockers.

A random 10-digit pin will take an average of about 12 years 8 months to crack[4].

[0] https://www.apple.com/business/docs/iOS_Security_Guide.pdf

[1] https://krypt.co

[2] 6-digit pin, 80ms/guess: 1e6 * 80 / 1000 / 60 / 60 / 2 = 11h 7m

[3] 4-digit pin, 1s/guess: 1e4 * 1000 / 1000 / 60 / 60 / 2 = 1h 23m

[3] 6-digit pin, 1s/guess: 1e6 * 1000 / 1000 / 60 / 60 / 2 = 5d 18h 53m

[4] 10-digit pin, 80ms/guess: 1e10 * 80 / 1000 / 60 / 60 / 24 / 365 / 2 = 12Y 8M 6d
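A quick sanity check of the footnote arithmetic above (average crack time = half the keyspace times the per-guess cost):

```python
# Reproduces footnotes [2]-[4]: average brute-force time for a random
# numeric PIN is (10^digits * seconds_per_guess) / 2.

def avg_crack_time_s(digits: int, seconds_per_guess: float) -> float:
    """Average seconds to brute-force a random `digits`-digit PIN."""
    return (10 ** digits) * seconds_per_guess / 2

print(avg_crack_time_s(6, 0.08) / 3600)         # ~11.1 hours  [2]
print(avg_crack_time_s(4, 1.0) / 3600)          # ~1.4 hours   [3]
print(avg_crack_time_s(6, 1.0) / 86_400)        # ~5.8 days    [3]
print(avg_crack_time_s(10, 0.08) / 31_536_000)  # ~12.7 years  [4]
```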


This is strange... I posted a theory here on how this might work and the post has disappeared completely while showing zero points on my comments page. A single downvote wouldn't make it not show, correct?

Does HN censor potential security disclosures?

All I said was it was probably using techniques like voltage and timing analysis for instance as described here:

https://www.coursera.org/learn/hardware-security/lecture/2Ug...


I still see that comment, fwiw.


If you are referring to this comment[0] I see it just fine.

[0] https://news.ycombinator.com/item?id=16827734


Thanks - duplicate story, I didn't realise I was in the other thread.


What's up with these stories? It's been known for a while now that the FBI and other _american_ agencies have backdoors into every cellphone: https://wikileaks.org/ciav7p1/#ANALYSIS


If you're going to downvote me, at least state your reason.

Here's another article from the same source about the FBI using 'classified' tools: https://motherboard.vice.com/en_us/article/7xdxg9/fbi-hackin...


Probably because they are independently discovered vulnerabilities, specifically not backdoors. That term implies something done intentionally.


How it can be solved:

Apple announces a $10 million bounty to reveal the exploit. I'd guess within hours they'd have it, probably from GrayKey engineers (might be hard to claim given NDAs).


Hey, if any GrayKey employees wanna split the bounty, let me know...


Is anyone selling a charging dock modeled after this yet?


Dear Apple staff reading this: Apple's continued silence on this matter is making me lose trust in the safety of my iPhone. I want to know which iOS version protects me against the exploit used by GrayKey - or to know that none currently does.


The article says even the latest version of iOS is vulnerable. Apple hasn't patched it yet.

But if you have a strong enough password it won't work against you. The exploit only lets them try to guess your password more quickly. It doesn't let them bypass your password.


In all seriousness, what are you going to do about it with your loss of trust?


Buy a Pixel? Google has been doing really good work with Android encryption.


As someone who has never used an iPhone and has always stayed within the Android ecosystem, I doubt any device on the Android market can compete with what Apple has in terms of security and privacy at the moment.


I would guess the Pixel is more secure. Google Project Zero is first-rate at finding and mitigating vulnerabilities. The Pixel also gets monthly updates. I have a Pixel 2 XL, love the phone, and highly recommend it.


I'll leave Apple phones for the Librem 5.


Apple won't be able to tell you that until they get and reverse engineer a GrayKey.


I assume all versions do not have the fix until told otherwise.


And those iPhones already in police custody will not get the update, so it's too late for them... and their owners.


I don't really care whether or not the FBI bought this device or another. What I want to know is what's Apple's response to all of this?

iOS 11 seems to have almost purposeful security weaknesses. I'm willing to give Apple the benefit of the doubt here, but only if they fix whatever flaws these guys and Cellebrite are using to break into iOS 11 iPhones.

Both those decryption devices seem to rely on iOS 11, so it must be a new change, which means it shouldn't be too hard for Apple to figure out which one of its recent changes caused this weakness in security.


I agree.

I'm fine with the FBI going about the letter of the law with a warrant to obtain evidence.

I'm also fine with Apple hardening their devices against such attacks, eventually the "GrayKey" technology will become commodity and we'll need to protect our information from casual thieves.

However, I take a strong stance against mass surveillance and obtaining (any/all, including metadata) information without due process. This includes getting pulled over and having police search my phone, or as a US citizen, re-entering my country and having my phone searched.


> iOS11 seems to have almost purposeful security weaknesses

Can you expand on this?

> Both those decryption devices seem to rely on iOS11

Why do you say this? The article says

> based on GrayKey marketing material, that the tool can unlock even the iPhone X, Apple’s most recent phone, as well as devices running iOS 11, the latest Apple mobile operating system.



