The NSA hack proves Apple was right to fight the FBI

Aug 21, 2016, 23:10 IST

Tim Cook, CEO of Apple, visits an Apple store where third-grade children from PS 57 James Weldon Johnson Leadership Academy are learning how to code through Apple's 'Hour of Code' workshop program on December 9, 2015, in New York City. Cook said he hoped that teaching coding to children would become standard in education throughout the United States. Andrew Burton/Getty

After the unprecedented leak of hacking tools and exploits stolen from the US National Security Agency's elite hacking unit, some privacy advocates see the breach as clear vindication of Apple in its fight with the FBI earlier this year.

"The component of the government that is supposed to be absolutely best at keeping secrets didn't manage to keep this secret effectively," Nate Cardozo, a senior staff attorney with the Electronic Frontier Foundation, told Business Insider.

In February, a judge ordered Apple to help the FBI unlock an iPhone that was used by Syed Rizwan Farook, one of two attackers who killed 14 people in a December terrorist attack in San Bernardino, California. That order set off a vigorous debate between law enforcement officials seeking evidence and technologists worried over broader implications for personal privacy.

While the company's legal team fought the order, Apple CEO Tim Cook published a letter arguing against being forced to build a so-called "backdoor" that would subvert the encryption protecting not only the shooter's phone but also those of millions of other Apple smartphone users.


Most in the technology community rallied around Apple at the time, arguing that weakened encryption might help government investigators, but it would also make customers vulnerable to hackers.

Now, with a massive top-secret archive of some of the NSA's own exploits having been leaked online, it appears they were right.


"The NSA's stance on vulnerabilities seems to be based on the premise that secrets will never get out. That no one will ever discover the same bug, that no one will ever use the same bug, that there will never be a leak," Cardozo said. "We know for a fact, that at least in this case, that's not true."

The government eventually backed down from its fight with Apple in late March, after investigators said they were able to unlock the shooter's phone with the "assistance of a third party." It never disclosed who that was or how it broke into the phone.


Exactly how the FBI got into the phone is yet another case of the government holding on to "zero days," software exploits that are completely unknown to companies and users. When such exploits are found, they are typically disclosed to vendors so they can fix the problem, used by hackers to break into systems more easily, or sold on the black market.

But Cardozo believes the FBI's exploit of the San Bernardino shooter's iPhone 5C, its still-unknown exploit of the Tor web browser in another case, and the NSA's apparent hoarding of exploits that have now been made public raise a larger issue about the legality of government hacking.

"When the government finds, creates, or discovers a vulnerability in a system, there are essentially two things they can do: They can disclose it, or they can use it," he said. "But the rules around that are completely broken."

There are some guidelines for how the government is supposed to handle vulnerabilities, known as the Vulnerabilities Equities Process, a framework meant to outline how and when a vulnerability should be disclosed to an affected company if the broader security risk of keeping it secret outweighs the reward it could yield.

But the VEP is just non-binding guidance created by the Obama administration, not an executive order or a law, and it has no legal standing.


"We need rules, and right now there aren't any," Cardozo said. "Or at least none that work."
