The FBI Should Disclose the Vulnerability to Apple—Just As Soon as It Has Another
April 22, 2016 8:00 am (EST)
Yesterday, Federal Bureau of Investigation Director James Comey revealed that the FBI had paid more than he will earn in salary over the remainder of his time at the FBI to break into the iPhone of San Bernardino shooter Rizwan Farook. Quick research and math by Reuters puts that figure at $1.34 million. By any metric, that is a lot of money. It's also $1.34 million more than Apple would have been willing to pay for it.
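The back-of-envelope math behind that figure is simple; the sketch below plugs in approximate numbers for Comey's salary and remaining term, not Reuters' exact inputs:

```python
# Rough reconstruction of the Reuters-style estimate. Both inputs are
# approximations: the FBI director's salary was roughly $185,000 a year in
# 2016, and Comey's ten-year term, which began in September 2013, had about
# seven years and four months left as of April 2016.
annual_salary = 185_000
years_remaining = 7 + 4 / 12

price_floor = annual_salary * years_remaining
print(f"Implied minimum price: ${price_floor:,.0f}")
# Prints roughly $1.36 million, in line with the reported $1.34 million-plus.
```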
Unlike many other Silicon Valley firms, Apple has no bug bounty program. A researcher who finds a vulnerability in an Apple product will get, at most, a polite thank you (possibly on a public website). And while Apple is an outlier for having no program at all, most other companies are unwilling to pay anywhere near what the FBI paid, even for critical vulnerabilities.
Instead, companies typically offer payouts in the low tens of thousands of dollars: enough to motivate a researcher to "do the right thing" rather than sell a vulnerability on the black market, where it could likely fetch many times that amount. Payouts are calibrated so that tech companies learn of vulnerabilities once they are discovered, not to motivate their discovery in the first place. Companies don't just want to avoid paying out large sums for vulnerabilities; they also want to avoid having to fix their code.
From a public policy perspective, the current setup looks a lot like a market failure. Cybersecurity overall would improve if more vulnerabilities were discovered, companies were pressured to fix them, and firms were (eventually) motivated to write more secure code in the first place. That's not happening right now.
The FBI and other federal agencies that purchase vulnerabilities could play a role in making this market function. Instead of sitting on the vulnerability it just bought, the FBI should put out a million-dollar bounty for another one targeting phones currently on the market. Each time it finds a vulnerability and begins exploiting it, it should plan to disclose it within six months and get to work finding a new one. Rinse and repeat.
Such a program could cost a significant amount of money, but according to the latest statistics from the Office of Management and Budget, the federal government currently spends nearly $5.6 billion on "Shaping the Security Environment." Carving off a small portion of that to actively find vulnerabilities for the purpose of securing the ecosystem would likely be a wise investment. After all, in addition to its counterterrorism mission, the FBI also has a counterintelligence mission, and plenty of foreign spies surely want access to the intel on the iPhones of thousands of government workers.
Under current policy, the federal government, in the words of Cyber Czar Michael Daniel, is strongly biased toward disclosing vulnerabilities. Based on press reports, we know that of the roughly 100 vulnerabilities that went through the Vulnerabilities Equities Process in the last year, only two were retained. Yet the purpose for which the government discovers them is exploitation; the benefit to the security of the ecosystem is a secondary effect.
If the equation were flipped and the FBI and other federal agencies recognized a responsibility to find vulnerabilities so they could be fixed, not just exploited, the benefits to the ecosystem would likely be far greater. While the government might only need to retain two vulnerabilities, it might discover and disclose 200 or more of the kind that are not easily or incidentally found. Over time, Apple and other companies could learn to write code with fewer vulnerabilities in the first place.