Heartbleed Shows Government Must Lead on Internet Security

The U.S. government must step in to fill the leadership vacuum

For much of the past two years, two thirds of all Web sites were susceptible to having their memory extracted by remote attackers—memory containing private information, passwords and encryption keys. The bug, called Heartbleed, was the most serious Internet security flaw ever found. Heartbleed attacks would not have shown up in most sites' logs, so we cannot be sure how widely it was exploited or what might have been leaked.

When the flaw came to light earlier this year, the White House made an unusually clear and direct statement that no part of the U.S. government had known about Heartbleed before it was disclosed, heading off the outcry that would have ensued had the National Security Agency been withholding knowledge of so severe a vulnerability. But the federal government does not get off so easily in this incident. It is guilty of not providing the leadership that could have averted the crisis in the first place—and that will be needed to avert the next one.

The leadership gap has arisen largely because more and more security software is “open source”—code that is made and shared widely by programmers for the common good. Heartbleed was caused back in 2011 by an error in code submitted by a German Ph.D. student to an encryption package called OpenSSL. It was a common type of error, but somehow nobody spotted it. The flawed code made it through OpenSSL's vetting process and was adopted into the official OpenSSL version, where it sat unnoticed for two years.
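The error class behind Heartbleed is simple to state: the code echoed back a "heartbeat" message using a length field supplied by the peer, without checking that length against the bytes actually received, so it read and returned adjacent memory. A minimal sketch of that bug class follows—a simulation in Python rather than OpenSSL's actual C code, with illustrative names (`echo_buggy`, `echo_fixed`) and byte values:

```python
# Simplified simulation of the Heartbleed bug class: echoing back a
# peer-supplied length without validating it.  (Illustrative only;
# OpenSSL's real code is C operating on raw heap memory.)

# `MEMORY` stands in for the process heap: the 4-byte payload the
# peer really sent, followed by unrelated secret data next to it.
MEMORY = b"ping" + b"SECRETKEY"

def echo_buggy(memory: bytes, claimed_len: int) -> bytes:
    """Trusts the claimed length, so it over-reads into adjacent data."""
    return memory[:claimed_len]

def echo_fixed(memory: bytes, claimed_len: int, actual_len: int) -> bytes:
    """Checks the claimed length against the bytes actually received."""
    if claimed_len > actual_len:
        return b""  # silently discard the malformed request
    return memory[:claimed_len]

# The attacker sent a 4-byte payload but claims it was 13 bytes long.
print(echo_buggy(MEMORY, 13))     # leaks the adjacent b"SECRETKEY"
print(echo_fixed(MEMORY, 13, 4))  # b"" -- oversized request rejected
print(echo_fixed(MEMORY, 4, 4))   # b"ping" -- honest request echoed
```

The fix, as in the actual OpenSSL patch, amounts to a single bounds check: discard any request whose claimed length exceeds the payload actually received.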


Open-source software such as OpenSSL is supposed to be good for security because everyone is free to read and analyze the code. Open code maximizes the odds that somebody, somewhere, will find a bug before it burns end users. Open-source advocate Eric S. Raymond famously called this Linus's law: “Given enough eyeballs, all bugs are shallow.” That's good news, if you have enough eyeballs.

But OpenSSL suffers from a major eyeball shortage. The project's Web site lists a core team of three people, and its annual budget is less than $1 million. Another million or two spent on a security audit might well have prevented Heartbleed. OpenSSL security, however, is a public good with the attendant funding problems: once it exists, no one can be prevented from benefiting from it, so many hope for a free ride on someone else's dime.

Government often pays for public goods such as basic scientific research. But government did not invest in the security of OpenSSL. Despite spending billions a year on cybersecurity and declaring “cyber” a national priority, the feds did not offer even a few million dollars to bolster this core security infrastructure.

Government also failed to provide authoritative, concrete advice after Heartbleed was made public, when users and small-site operators across the Net were wondering what to do. Although it offers such advice to people faced with natural disasters or physical safety risks, government left users stranded when Heartbleed showed up. Most companies, meanwhile, did little more than warn users to change their passwords.

Somebody needs to take the lead in funding and coordinating audits of infrastructure, organizing useful disclosures of vulnerabilities to the public, and providing accessible advice and guidance for users and operators of small Web sites. Existing entities perform some of these functions—for example, in the aftermath of Heartbleed, the Linux Foundation and several tech companies pledged support, including funding, for open-source security—but a central organization should unify efforts, identify unaddressed issues and present clear information to the public. If neither government nor private companies step up, then we need an independent institution dedicated to serving the security needs of end users.

We will be fighting the security battle for a long time, and nothing can make us entirely safe. But better institutions can make these crises less frequent, less serious and less confusing to users. With some leadership, and a modest investment, we could have a champion for user security.
