Court Order to Apple to Unlock San Bernardino iPhone May Unlock Hackers

A judge’s order that Apple cooperate with federal authorities in the San Bernardino attack investigation may have serious unintended consequences. There are no easy answers. Once more, a broad dialog is required.

Previously I opined about how a dialog should occur between policy makers and the technical community over encryption. The debate has moved on. Now, the New York Times reports that federal magistrate judge Sheri Pym has ordered Apple to facilitate access to the iPhone of Syed Rizwan Farook, one of the San Bernardino attackers. The Electronic Frontier Foundation is joining Apple in its fight against the order.

The San Bernardino fight raises both technical and policy questions.

Can Apple retrieve data off the phone?

Apparently not. According to the order, Apple is required to install an operating system that would allow FBI technicians to make unlimited password attempts without the device delaying them or deleting any information. iPhones can delete all personal information after a certain number of authentication failures.

You may ask: why doesn’t the judge just order Apple to create an operating system that doesn’t require a password? According to Apple, the password used to access the device is itself a key-encrypting key (KEK): it decrypts the key that in turn decrypts the stored information. Bypassing the password check therefore doesn’t get you any of the data; the FBI still needs the password.
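The KEK arrangement described above can be sketched in a few lines. This is a toy illustration, not Apple’s design: the XOR “wrap” stands in for real AES key wrapping, and real devices also mix a hardware-bound UID into the derivation. The point it demonstrates is that bypassing the password check only gets you the wrapped blob, not the data key.

```python
import hashlib
import secrets

def derive_kek(password: str, salt: bytes) -> bytes:
    # Derive a key-encrypting key (KEK) from the user's password.
    # (Toy stand-in: real devices also mix in a hardware-bound UID.)
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

salt = secrets.token_bytes(16)
data_key = secrets.token_bytes(32)       # the key that actually encrypts storage
kek = derive_kek("1234", salt)
wrapped = xor_bytes(data_key, kek)       # "wrap" the data key under the KEK (toy XOR wrap)

# Bypassing the password check yields only `wrapped`; without the
# password-derived KEK, the data key -- and hence the data -- stays opaque.
assert xor_bytes(wrapped, derive_kek("1234", salt)) == data_key
assert xor_bytes(wrapped, derive_kek("0000", salt)) != data_key
```

Only the correct password reproduces the KEK, so even an operating system with no password prompt at all would still be staring at ciphertext.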

What Apple can do is install a new operating system without the permission of the owner. There are good reasons for it to have this ability. For one, a previous installation may have failed, or the copy of the operating system stored on a phone may have been corrupted. If technicians couldn’t install a new version, the phone itself would become useless. This actually happened to me.

The FBI can’t build such a version of the operating system on its own. As is best practice, iPhones validate that all operating systems are properly digitally signed by Apple. Only Apple has the keys necessary to sign images.

With a new version of software on the iPhone 5c, FBI technicians would be able to mount a brute-force attack, trying all passwords until they found the right one. This won’t be effective on later-model iPhones because their hardware slows down queries, as detailed in this blog.
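A brute-force attack on a short numeric password is cheap once the delay and wipe protections are removed. The sketch below is illustrative, with made-up parameters (a 1,000-iteration PBKDF2 check standing in for the device’s unlock check): it exhausts all 10,000 four-digit codes in seconds. The hardware-imposed delays mentioned above are exactly what turn this from seconds into years.

```python
import hashlib
import time

def unlock_attempt(password: str, salt: bytes, target: bytes) -> bool:
    # Each attempt re-derives the key and compares; real hardware would
    # add escalating delays (and possibly wipe) between attempts.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 1_000) == target

salt = b"example-salt-123"
secret = "7353"                                # the code we pretend not to know
target = hashlib.pbkdf2_hmac("sha256", secret.encode(), salt, 1_000)

start = time.perf_counter()
for candidate in ("%04d" % n for n in range(10_000)):   # all 4-digit codes
    if unlock_attempt(candidate, salt, target):
        print(f"found {candidate} in {time.perf_counter() - start:.1f}s")
        break
```

Even a modest fixed delay of 80 ms per attempt would stretch the worst case to over 13 minutes; escalating delays after each failure, as later iPhone hardware enforces, make exhaustive search impractical.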

Would such a capability amount to malware?

Kevin S. Bankston, director of New America’s Open Technology Institute, has claimed that the court is asking Apple to create malware for the FBI to use on Mr. Farook’s device. There’s no single clean definition of malware, but a good test of whether the O/S the FBI is asking for is in fact malware is this: if this special copy of the O/S leaked from the FBI, could “bad guys” (for some value of “bad guys”) also use the software against the “good guys” (for some value of “good guys”)? Apple can write into the O/S a check of the device’s serial number. Bad guys could not modify that number without invalidating the signature the phone checks before loading. Thus, by this definition, the software would not amount to malware. But I wouldn’t call it goodware, either.
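The serial-number check works because the serial is covered by the signature. Here is a minimal sketch of that idea, with HMAC standing in for Apple’s real public-key signature and a hypothetical serial number: the image is signed together with the target serial, and a device recomputing the signature over its own serial rejects anything not built for it.

```python
import hmac
import hashlib

SIGNING_KEY = b"stand-in for Apple's private signing key"

def sign_image(image: bytes, serial: str) -> bytes:
    # Sign the OS image together with the target device's serial number.
    # (HMAC is a toy stand-in for Apple's public-key signature scheme.)
    return hmac.new(SIGNING_KEY, image + serial.encode(), hashlib.sha256).digest()

def device_will_boot(image: bytes, serial: str, signature: bytes) -> bool:
    # The device recomputes the signature over the image plus ITS OWN serial,
    # so an image signed for another device fails verification.
    return hmac.compare_digest(sign_image(image, serial), signature)

image = b"special unlock build"
sig = sign_image(image, "F2LDEMOSERIAL")       # hypothetical target serial

assert device_will_boot(image, "F2LDEMOSERIAL", sig)      # target device boots it
assert not device_will_boot(image, "OTHER-SERIAL", sig)   # every other device refuses
```

Changing either the image or the embedded serial invalidates the signature, which is why a leaked copy of such a build would be useless against any other phone.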

Is a back door capability desirable?

Unfortunately, here there are no easy answers, only trade-offs. On the one hand, one must agree that the FBI’s investigation is impeded by the lack of access to Mr. Farook’s iPhone, and as other articles show, this case is neither the first, nor will it be the last, of its kind. As a result, agents may not be able to trace leads to other possible co-conspirators. A Berkman Center study claims that law enforcement has sufficient access to metadata to determine those links, and there’s some reason to believe that. When someone sends an email, email servers between the sender and recipient log that a message was sent from one person to another. A record of phone calls is kept by the phone company. But does Apple keep a record of FaceTime calls? Why would it, if that meant a constant administrative burden, not to mention additional liability and embarrassment, when (not if) it suffers a breach? More to the point, having access to the content on the phone gives investigators clues as to what metadata to look for, based on what applications were installed and used on the phone.
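To make the metadata point concrete, here is a small sketch using Python’s standard `email` module on a made-up message: the envelope headers alone reveal who talked to whom and when, without reading the body at all.

```python
from email import message_from_string

# A hypothetical raw message, as an intermediate mail server might log/relay it.
raw = """\
Received: from mail.example.org by mx.example.net; Tue, 16 Feb 2016 09:15:02 +0000
From: alice@example.org
To: bob@example.net
Date: Tue, 16 Feb 2016 09:14:58 +0000
Subject: lunch?

See you at noon.
"""

msg = message_from_string(raw)
# Who talked to whom, and when -- extracted without touching the body.
metadata = {h: msg[h] for h in ("From", "To", "Date")}
print(metadata)
```

This is exactly the kind of record that servers along the delivery path retain as a matter of course, which is why metadata analysis can map social links even when content is unavailable.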

If Apple had the capability to access Mr. Farook’s iPhone, the question would then turn to how that capability would be overseen. The rules about how companies handle customer data vary from one jurisdiction to another. In Europe, the Data Privacy Directive is quite explicit, for instance. The rules are looser in the United States. Many worry that if U.S. authorities have access to data, so will other countries, such as China or Russia. Those worries are not unfounded: a technical capability knows nothing of politics. Businesses fear that if they accede to U.S. demands, they must also accede to other governments’ demands if they wish to sell products and services in those countries. That means there are billions of dollars at stake. Worse, other countries may demand more intrusive mechanisms. As bad as that is, and it’s very bad, there is worse.

The Scary Part

If governments start ordering Apple to insert or create malware, what other technology will also come under such rules? It is plain as day that any rules that apply to Apple iPhones would also apply to Android-based cell phones. But what about other devices, such as televisions? How about refrigerators? Cars? Home security systems? Baby monitors? Children’s toys? And this is where it gets really scary. Apple has one of the most competent security organizations in the world. It probably understands device protection better than most government clandestine agencies. The same cannot be said for other device manufacturers. If governments require those manufacturers to provide back-door access, it would be tantamount to handing the keys to all our homes to criminals.

To limit this sort of damage, there needs to be a broad consensus as to what sorts of devices governments should be able to access, under what circumstances that access should happen, and how that access will be overseen to avert abuse.  This is not an easy conversation.  That’s the conversation Apple CEO Tim Cook is seeking.  I agree.

How to speak the truth and yet lie? Ask General Alexander

Old joke in the industry: the difference between a salesperson and a marketing person is that the marketing person knows when he’s lying. Which is General Alexander?

Let’s appreciate that the head of a spying agency is in a tough spot.  Allies and citizens of the U.S. alike are outraged, making an actual dialog difficult.  Leaders, however, must address hard issues head on and truthfully; and they must demonstrate command of the subject matter, or we waste our time.

Let’s go through some of the General’s statements:

“the assertions… that NSA collected tens of millions of phone calls [in Europe] are completely false”.

– From a BBC article

Maybe, but he and the president have in the past drawn a distinction between so-called “metadata” (which the rest of us just call “data”) and the content of calls. And so maybe the NSA doesn’t have access to the calls themselves, but he has not denied that it has access to who people called, the time and date they called, and for how long. What is the truth?

Yesterday The Washington Post dropped another Snowden bombshell, indicating that the NSA was intercepting Google customer traffic by tapping into their communications lines.  The Guardian had previously reported that GCHQ was tapping fiber cables.  Alexander’s response, this time?

This is not NSA breaking into any databases. It would be illegal for us to do that. So, I don’t know what the report is. But I can tell you factually we do not have access to Google servers, Yahoo servers. We go through a court order.

– From CNN

Except in this case, the NSA is not accused of breaking into servers, but rather of tapping communications off of fiber cables. By answering a charge that wasn’t made, the general either doesn’t understand the issue and therefore cannot meaningfully inform the President or the public, or he does understand the truth and is intentionally prevaricating. What is necessary is a public debate over the policy issues relating to surveillance, and over when it should and should not be authorized. The people leading that dialog should be truthful and informed.

I’m sure the general is aware that everyone has their day of reckoning.  It’s time for his.  The president needs to find a new director of the NSA who can intelligently advance an honest discourse.

Who owns your identity?

“On the Internet, nobody knows you’re a dog.” Right? Not if you are known at all. Those days are gone. As if to prove the point, one of my favorite web sites is on the wrong side of this issue. An actress unsuccessfully sued imdb.com for lost wages after the site included her age. There is a well-known axiom in Hollywood that starlets have a half-life, and age is something best kept secret. IMDB countered that what matters is not an actress’s age but her ability to play a certain age.

My point is this: she sued and was unable to have information about her removed. Is age something that you believe should be private? I do. I especially do for people born after 1989, for whom a birthday and a home city can lead to someone guessing their Social Security Number.
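Why does a birthday plus a home city help? Before randomization in 2011, the first three SSN digits (the “area number”) were tied to the issuing state, and the remaining digits were assigned in a fairly predictable order. The sketch below uses a few historical area-number ranges (treat the exact ranges as illustrative) to show how much knowing the issuing state alone shrinks the guessing space; knowing the birth date narrows it far further.

```python
# Illustrative historical area-number allocations (pre-2011 scheme).
AREA_NUMBERS = {
    "New Hampshire": range(1, 4),      # 001-003
    "New York": range(50, 135),        # 050-134
    "California": range(545, 574),     # 545-573
}

def search_space(state: str) -> int:
    # Candidates remaining once the issuing state is known:
    # area numbers for that state x group numbers (01-99) x serials (0001-9999).
    return len(AREA_NUMBERS[state]) * 99 * 9999

FULL_SPACE = 10 ** 9                   # all possible 9-digit SSNs

for state in AREA_NUMBERS:
    n = search_space(state)
    print(f"{state}: {n:,} candidates ({FULL_SPACE / n:.0f}x smaller)")
```

Because group and serial numbers were also issued roughly sequentially by date, researchers showed that birth date plus birth state could narrow the space to a few hundred guesses for some cohorts, which is exactly the risk for those born after 1989.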

But what about other physical attributes one might consider private? “He has a mole that you can only see if he’s naked.” How about illness? “This actor cannot lift his arm due to a stroke.” Once the information is out there, there’s no way to get rid of it. And this is in the UK, which is subject to the European Data Privacy Directive. The situation is considerably bleaker for your personal information in the United States.

Related to this is the Right to Be Forgotten. In Europe, regulators are considering new rules saying that you have a right to have information about you removed. This has some American firms in an uproar, arguing that a lack of transparency only increases risk and inefficiency. But what are the limits? What about the actress who doesn’t want her age known? How did her age provide for market efficiency?

Are bad iPhone maps a security problem?

A while ago I talked about business models and how they impact security.  The key thing then was that Apple had a direct path to the consumer, which drove update rates of iOS very quickly, in comparison to Android.  Implicit in all of that was that consumers would find a reason to upgrade to the latest software.

Now we see a new version 6 of iOS that has what can only be described as a miserable replacement for Google Maps, as well as a number of reported problems with WiFi connectivity. All of a sudden, the tables are turned. Are the 200 new features in iOS 6 worth risking one’s ability to use WiFi or to have accurate mapping information? Note that the question makes no reference to security. That’s because consumers don’t care about that.

So, here’s the thing to watch, and Google will be watching very closely: what is the adoption rate of iOS 6 as compared to its predecessor? The converted have already moved over. Now it’s time for the rest of us. Will we or won’t we? I have already decided to wait for a “.0.1” release of iOS 6, as my iPhone works fine as is, and none of the new features seem interesting enough to risk breaking WiFi or my maps. Note again, I’m not even mentioning security.