it would simply brute-force the encryption until it found words that it recognized.
It is a simple task to create encryption that this approach could never break, no matter how many millions of years you threw at it.
But the encryption used would not be the kind built into most email clients, because it requires both ends of the conversation to cooperate, much like the one-time pads used during the Second World War.
That is precisely the issue with brute force: it works only on dictionaries of known words. If the password is made of non-words or non-existent ones and is long enough, I seriously doubt brute force has much chance. Besides, it works only if the number of attempts is unlimited and the response time fast enough. A very simple defence against brute force is a slow response time that gets slower with each failed attempt.
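A minimal sketch of that defence in Python; the doubling schedule and the one-hour cap are illustrative choices, not a prescription:

```python
def lockout_delay(failed_attempts: int, base: float = 1.0,
                  cap: float = 3600.0) -> float:
    """Seconds the server should wait before answering the next attempt.

    The delay doubles with every failure (1 s, 2 s, 4 s, ...) and is
    capped, here at one hour, so a brute-force run slows to a crawl.
    """
    return min(base * (2 ** failed_attempts), cap)

# After 10 failures an attacker already waits over 17 minutes per guess.
```

The point is that the cost curve is set by the defender, not the attacker: even unlimited attempts become useless when each one costs minutes.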
No one seems to have touched on the point that it is the US government trying to force Apple down this route. If this is allowed to go ahead, how does this protect iPhone users outside the US? Will this backdoor only be created on Apple devices sold within the US?
[h]FBI ASKS U2 HOW THEY MANAGED TO GET INTO EVERYONE'S IPHONE[/h] OSCAR DE BOER FEBRUARY 18, 2016
The FBI has asked Irish band U2 for help getting into a locked iPhone, as they seemed to manage it without a problem.
FBI spokesperson Frank Hanratty has personally asked Bono and the rest of the gang for help breaking into the iPhone of a terrorist in the US, after Apple refused to help them break into iPhones, citing privacy issues.
"We wanted to ask Bono and the rest of U2 for help with this, because they managed to get into everyone's iPhones without asking anyone's permission.
Plus, not only did they manage to somehow break into and insert their own software onto millions of iPhones, they also made it very difficult to remove the offending piece of music, so these guys really are the experts here."
Bfnn tried to reach Bono for comment, but U2 are said to be unavailable at this time because they are furiously tapping at keyboards and occasionally saying "I'm in."
You don't have to brute-force the lower-level encryption if you can brute-force the passcode. The phone will decrypt the lower-level disk encryption for you if it thinks you are its owner.
The idea that this would only be used for this case, and maybe some other important things later, is almost laughable. Once Apple demonstrates that they can do it, they will have a queue of ten thousand requests from police everywhere who want to see who some guy convicted of minor drug offenses was chatting with, just like AT&T and Sprint have to handle: an entire department staffed with hundreds of people whose only job is to package up police information requests.
a) Apple has a reputation to uphold (yeah, that comes under the “Market Strategy” excuse…).
b) Apple may not even be capable of doing it… Just because they put the encryption in the device doesn’t mean they can decrypt it… after all, that’s kinda the idea.
c) Could Apple create a hack to allow a brute-force attack on the iPhone’s PIN? Probably… But who PAYS for it? I’m guessing here (as are most other pundits on this subject), but Apple “might” be able to disassemble the phone, create a custom chipset (ROM/PROM … A9X42 … who knows) and basically build a new phone with iOS 9.2Zeta … but let’s see…
1) re-design a special set of circuits to override/replace the existing encrypted gateway
2) write a “Zeta” version of iOS with zero security features
3) retro-fit this to an existing iPhone… without compromising the current contents
At best that will activate the home screen. Decrypt existing data? Uh, no, that STILL requires a key…
How much would Apple spend on #1-#3 above? A couple hundred grand? More? Are the Feds gonna pay for it? Hell no.
Now how long will it take to figure out the PIN?
AND assuming all that does happen,
Is there really anything worth a damn? I mean, we know who did it, we know how they did it, they are dead (and yes, unfortunately, so are others).
And IF there is anything helpful (which personally I doubt), who is to say they didn’t apply their own encryption at a FILE level?
All Apple would have done is break the DEVICE level… So now we are months in the future, hundreds of thousands or more in money spent,
and the Feds STILL have nothing more than they do today, and the victims’ families still have no more information either.
Yeah, there is an ample supply of products available for secure storage and messaging. Check Bruce Schneier’s recent list https://www.schneier.com/blog/archives/2016/02/worldwide_encry.html for some inspiration.
Write your own encryption software using Xojo for iOS, where a swap of bytes is done before AES-256 encryption, Base64 encoding, and SMS sending, and any eavesdropper will have a tough nut to crack.
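A rough sketch of that pipeline, in Python rather than Xojo for illustration. The AES-256 step is deliberately left to a vetted crypto library (rolling your own cipher is a bad idea), and note that the byte swap adds essentially no cryptographic strength by itself; all the real security rests on the AES layer:

```python
import base64

def swap_bytes(data: bytes) -> bytes:
    """Swap adjacent byte pairs; a trivial pre-scramble, not real security."""
    out = bytearray(data)
    for i in range(0, len(out) - 1, 2):
        out[i], out[i + 1] = out[i + 1], out[i]
    return bytes(out)

def prepare_message(plaintext: bytes) -> str:
    """Byte-swap, then (in the real scheme) AES-256-encrypt, then Base64."""
    swapped = swap_bytes(plaintext)
    # AES-256 step omitted in this sketch; in practice insert a call to a
    # vetted library (e.g. Xojo's Crypto module or PyCryptodome) here.
    return base64.b64encode(swapped).decode("ascii")
```

The swap is its own inverse, so the receiver simply Base64-decodes, AES-decrypts, and applies `swap_bytes` again to recover the plaintext.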
IMO Apple should be forced to decrypt the phone; if they keep refusing, they are guilty of helping those criminals.
I think that the citizens of the country should be protected from these violent creeps. They don’t respect the privacy of their victims, so why should the privacy of those criminals be more important?
The FBI didn’t ask Apple to open up the security built into all iPhones, just the decrypted information from that one specific phone.
I wonder how Tim Cook would have reacted if his family had been murdered.
No. It would set a dangerous precedent that could jeopardize the privacy of all citizens. The laws in the USA guarantee citizens the right of privacy and do not guarantee that law enforcement can infringe on these rights. My understanding of this issue, which may be wrong or incomplete, is that the FBI wants the phone unlocked so they can see what else may be on it. In other words, they are on a fishing (phishing?) expedition. For me, that is not good enough to warrant forcing Apple to potentially violate the privacy protections built into the iPhone, unless there were a way to guarantee that it would only affect that particular iPhone and no other.
There is the famous quip from Benjamin Franklin that those who would give up essential liberty to obtain a little temporary safety deserve neither liberty nor safety. A good quote, but here’s a far more recent one:
“There is nothing new in the realization that the Constitution sometimes insulates the criminality of a few in order to protect the privacy of us all.” Supreme Court Justice Antonin Scalia
And before you get on me about how I would feel if it were my family, I lost a close friend in the recent rampage in Paris. While I hate the concept of terrorism and the associated violence, I value my freedoms more.
I’m sure you all know I blogged about this very topic.
First, let’s talk about what the FBI is really asking for here. They are asking Apple to make a special version of iOS that does not have the delays between passcode attempts, so that a brute force attack can be carried out more efficiently. They are even willing to allow Apple to do all of this in-house. We already have laws on the books in the US (which Apple has cited) that protect Apple from having to do something that is not part of its normal business practice in order to help the government. There are even Supreme Court cases that support Apple, so Apple is on pretty firm ground legally. It’s very unlikely they will capitulate in this case. Apple’s reasoning is that this will be the first of many requests. That’s an understandable concern.
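Back-of-the-envelope arithmetic shows why removing the delays matters so much. Apple has stated that the key derivation is calibrated so a single passcode attempt takes roughly 80 ms in hardware; with the artificial delays and the ten-attempt wipe removed, exhausting a short numeric passcode space is fast (the 80 ms figure is Apple's published estimate, the rest is simple multiplication):

```python
ATTEMPT_TIME_S = 0.08  # ~80 ms per try, per Apple's iOS security documentation

def worst_case_hours(digits: int) -> float:
    """Hours needed to try every numeric passcode of the given length."""
    return (10 ** digits) * ATTEMPT_TIME_S / 3600

# A 4-digit PIN falls in about 13 minutes; a 6-digit one in under a day.
```

A long alphanumeric passphrase, by contrast, pushes the same arithmetic out to centuries, which is why the delays are the only thing protecting typical short PINs.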
It is within Apple’s power to help the FBI in this one particular case because this phone is an iPhone 5C. Newer phones, the 5S, 6 and 6S, handle security within something called the Secure Enclave. This is like a computer inside a computer with its own OS. While Apple is understandably sketchy on the details of the innards of the Secure Enclave, I would be surprised if Apple designed it to be updatable externally. I think there’s a good chance that Apple cannot help with the newer iPhones even if they wanted to do so.
Apple’s concern is that this would eventually lead to one or more governments demanding that iPhones have a true backdoor. Tim Cook is right when he says that once you let that genie out of the bottle, the bad guys will use it as well. The tsunami of cybercrime that would follow would be so large that the police would be unable to cope with it. No one would trust their phones anymore. But the worst part is that a true backdoor wouldn’t even solve the problem. Let’s say Apple went along with this and built such a backdoor. Once such a thing existed, the bad guys would switch to using their own custom apps to encrypt messages. It would then become an arms race. Say Apple logs all keyboard entry; the bad guys switch to their own custom entry system that transmits only the coordinates of the keys pressed. Imagine a canvas control with your own entry layout, storing nothing more than the coordinates where the taps occurred.
The old saying is that there’s no such thing as a free lunch. Our personal privacy is important and it comes at a price. That price is that the bad guys can use the same technology to hide their communications. They use the same cars, the same roads, the same computers, the same networks, etc. As I blogged about recently, there are other ways for law enforcement to track down bad guys that don’t violate our right to privacy. At home I have a set of kitchen knives. A bad guy could break into my house, take the knives and use them to hurt someone. Does this mean I can no longer have kitchen knives? Of course not. Life is about risk management. You drive your car to the market knowing that there’s a chance you could die in an accident or kill someone along the way. That doesn’t stop you from doing it nor does it prohibit you from owning a car.
The right to privacy is an important principle of any free society. However, principles only mean something if you stand by them when it’s inconvenient. This is one of those times. Potentially giving up everyone’s right to privacy carries with it far greater risk than we could ever gain from getting access to a bad guy’s iPhone.