
Will Only Outlaws Be Able to Have Smartphone Encryption?

Now that Apple and Google have announced that they will incorporate encryption in smartphones by default, the question is how long law-abiding Americans will be allowed to continue to have encryption at all.

In case you missed it, Apple announced on September 17 that future editions of the iPhone would have encryption turned on by default, in a way that leaves Apple itself unable to access the encrypted data. “Apple’s new move is interesting and important because it’s an example of a company saying they no longer want to be in the surveillance business — a business they likely never intended to be in and want to get out of as quickly as possible,” writes Chris Soghoian, Principal Technologist and Senior Policy Analyst for the American Civil Liberties Union’s Speech, Privacy, and Technology Project. “Rather than telling the government that they no longer want to help the government, they re-architected iOS so they are unable to help the government.” The following day, Google announced that future versions of the Android smartphone operating system would have encryption turned on by default as well.

Predictably, the FBI and law enforcement had kittens. “The notion that someone would market a closet that could never be opened – even if it involves a case involving a child kidnapper and a court order – to me does not make any sense,” said FBI director James Comey, who went on to invoke the terrorism that could surely befall the U.S. as a result. “Two big tech providers are essentially creating sanctuary for people who are going to do harm,” agreed Ron Hosko, a former assistant director of the FBI’s criminal investigative division, in an interview with Marketplace. And pulling out the big guns, “Apple will become the phone of choice for the pedophile,” John J. Escalante, chief of detectives for Chicago’s police department, told the Washington Post. “The average pedophile at this point is probably thinking, I’ve got to get an Apple phone.”

Yep, terrorism, kidnapping, and pedophiles. They got the trifecta.

Company executives at Apple and Google told the New York Times that the government had essentially brought this on itself with incidents like the Edward Snowden revelations, and that it was increasingly difficult for American companies to compete overseas under the perception that the U.S. government had its fingers in everything.

“The head of the FBI and his fellow fear-mongerers are still much more concerned with making sure they retain control over your privacy, rather than protecting everyone’s cybersecurity,” writes Trevor Timm in the U.K. paper The Guardian, after offering a line-by-line critique of Comey’s statement. Security experts pointed out that the government still had many other options by which it could legally request access to people’s electronic data, that the FBI didn’t cite any examples of cases where encryption would have prevented it from solving a case, and that one case cited by Hosko turned out to be irrelevant.

So the next question becomes, at what point might the federal government attempt to outlaw encryption again? Or mandate a back door?

In case you were born sometime after MTV, at one point in time, anything more powerful than 40-bit encryption was actually classified as a munition – you know, like bombs and missiles — and illegal to export from the U.S., and not all that easy to get hold of even inside the U.S. In fact, a guy named Philip Zimmermann got himself into a peck of trouble when he developed Pretty Good Privacy (PGP), intended to be Everyman’s data encryption. While it was fine inside the U.S., copies of it surfaced internationally, and for several years it looked like Zimmermann might face charges, which made him a cause célèbre in the computer community.
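For a sense of just how low that 40-bit ceiling was, here’s a back-of-the-envelope sketch in Python. The guess rate is an assumption picked purely for illustration, not a benchmark of any real attacker’s hardware:

```python
# Rough brute-force comparison of key sizes. The trial rate below is
# an illustrative assumption, not a measurement of real hardware.
GUESSES_PER_SECOND = 10**9  # assume a billion key trials per second

for bits in (40, 56, 128):
    keyspace = 2**bits                       # number of possible keys
    seconds = keyspace / GUESSES_PER_SECOND  # worst-case search time
    years = seconds / (60 * 60 * 24 * 365)
    print(f"{bits}-bit key: {keyspace:.2e} keys, ~{years:.1e} years")
```

At that assumed rate, a 40-bit keyspace is exhausted in about 18 minutes, a 56-bit (DES-era) keyspace in a couple of years, and a 128-bit keyspace in something like 10^22 years. The export rule, in other words, capped legal crypto at the breakable end of that scale.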

In 1993, the Bill Clinton White House went further and proposed the Clipper Chip, an encryption system that included a back door so that law enforcement organizations could still read any data encrypted by the device. Which, of course, they’d only use if you were a bad guy. But by 1996, partly due to the enormous wave of protest against the notion — and partly due to technical issues, such as bugs that were found in it (by a guy named Matt Blaze, who’s still around these days, commenting on the Apple/Google encryption flap) — the government had dropped the project. At the same time, the Clinton White House relaxed the rules on greater-than-40-bit encryption.
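The flaw Blaze found is instructive even today. In simplified terms: the chip validated its escrow field (the LEAF, the part the government needed in order to decrypt your traffic) with only a 16-bit checksum, so a rogue sender could brute-force a bogus field that still passed the check in roughly 65,000 tries, using the cipher while rendering the back door useless. Here’s a toy Python sketch of that idea, with a hypothetical stand-in for the chip’s classified checksum function:

```python
import os
import hashlib

def checksum16(leaf: bytes) -> int:
    """Hypothetical stand-in for Clipper's (classified) 16-bit check."""
    return int.from_bytes(hashlib.sha256(leaf).digest()[:2], "big")

# The receiving chip only verifies that the 16-bit checksum matches,
# so random garbage passes after ~2**16 = 65,536 attempts on average.
expected = checksum16(os.urandom(16))  # value a receiver would accept
tries = 0
while True:
    tries += 1
    if checksum16(os.urandom(16)) == expected:
        break
print(f"forged a passing (but useless) LEAF after {tries:,} tries")
```

Sixteen bits of integrity is an afternoon of work, which is part of why the design didn’t survive scrutiny.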

These days, encryption is readily available, but generally you have to know about it and how to turn it on. What Apple and Google are doing is selling devices with it already turned on — and, in response to the increasing number of government requests for user data, they will no longer even have access to that data themselves.

(This does, of course, mean that if you lose your encryption key, you’re hosed.)
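It’s worth sketching why the vendor ends up locked out along with everyone else. What follows is a simplified illustration of the general approach, not Apple’s or Google’s actual design: the disk key is derived from the user’s passcode with a deliberately slow key-derivation function and a salt that never leaves the device, so there is no copy of the key, and no escrow, for the vendor to hand over.

```python
import hashlib
import os

# Simplified sketch of passcode-derived device encryption; not the
# actual Apple or Google scheme. The key exists only when the user
# types the passcode, so the vendor has nothing to hand over.

def derive_key(passcode: str, salt: bytes) -> bytes:
    # A deliberately slow KDF makes guessing short passcodes expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt,
                               iterations=200_000)

salt = os.urandom(16)  # stored on the device, never sent to the vendor
disk_key = derive_key("correct horse battery staple", salt)

# Without the passcode, all anyone (vendor included) can do is guess:
assert derive_key("123456", salt) != disk_key

# And if the owner forgets the passcode, the key is unrecoverable.
```

Real designs go further, for instance entangling the passcode with a key fused into the phone’s hardware so that guessing has to happen on the device itself, but the upshot is the same: no passcode, no plaintext. Hence the warning above about losing your key.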

So what other effects can we expect from the Apple/Google decision?

  • Courts are still trying to figure out whether an encryption key is more like a physical key or the combination to a safe — something you have or something you know — so they can decide whether you can be compelled to give it up. Law enforcement organizations are meanwhile taking people to court to force them to reveal encryption keys, and sometimes they win. Conceivably, with encryption turned on by default, this could happen a lot more often.
  • Having encryption be the default could also eliminate the “why are you encrypting it if you don’t have anything to hide?” presumption.
  • Of course, bad guys have pretty much figured encryption out, even when it’s not the default. To be blunt, Apple and Google’s actions simply mean that regular people will have the same capabilities as the bad guys. And in an era where companies and individuals alike are regularly losing laptops, disk drives, and smartphones full of personal data — and then getting fined because that data wasn’t encrypted in the first place — having encryption as the default simply makes sense.

So what would happen if the government were to outlaw encryption, or mandate a back door? The Electronic Frontier Foundation, which lives for this sort of thing, has a nice long list of possible repercussions.

Realistically, though, would outlawing encryption even be practical in this day and age? Look at it this way — if encryption were made illegal, all the personal data on all the devices that get lost or stolen would then be accessible. It would make the Target breach look like a picnic. (The ACLU’s Soghoian pointed out that in 2011, the FBI was encouraging people to encrypt their data to keep it out of the hands of criminals.)

And if you believe that bad guys wouldn’t continue to be able to use encryption, or that they would obligingly stick to the kind with a back door, please poke out your right eye. In the same way that bad guys continue to get access to illegal firearms today, bad guys would still get access to encryption, one way or another. Sorry, FBI, but that genie is out of the bottle.

It isn’t clear whether the government is going to try to outlaw encryption again, or try to mandate a law enforcement back door. There is some talk of Congress enacting a law, but in the wake of the Edward Snowden revelations, few Congressional representatives want to touch it, according to Bloomberg Businessweek. Still, it’s something we need to watch out for — though it looks like computer vendors are increasingly unlikely to help, according to Iain Thomson in the Register. “It’s unlikely Apple or Google is going to bow down to the wishes of government and install backdoors in their own products,” he writes. “This would be disastrous to sales if found out, and there are increasing signs that the tech sector is gearing up for a fight over the issue.”

