Friday, May 09, 2014

Today's Gem O'Truth: Nothing Is F-R-E-E On The Internet!

Before we get to the post o'the day, this blogger encountered a good followup to Jane Mayer's "Reagan's Benghazi" in The Daily Beast and you can read the followup here. Now — for those of you who changed your password(s) in response to the Heartbleed bug — meet the two guys named Steve who didn't see Heartbleed comin' out of the tunnel. Stephen Henson helps write OpenSSL for web site security and Steve Marquess keeps the project's lights on; developers far and wide grabbed this code for their sites because it was F-R-E-E! And those poor souls recently received e-mail messages directing them to change their passwords because some (many?) black-hat hackers could have exploited the Heartbleed flaw on sites far and wide and collected login names, passwords, Social Security numbers, and what have you: one click at a time. Each time the heart of a site responded to those clicks, a piece of personal information leaked out to the black-hats. Open source software was F-R-E-E, what could go wrong with that? If this is (fair & balanced) self-delusion, so be it.

[x BuzzFeed]
The Internet Is Being Protected By Two Guys Named Steve
By Chris Stokel-Walker


It was the last thing Steve Marquess and Stephen Henson wanted to hear. In 2006, three years into a struggle to get a key component of OpenSSL validated as secure by the U.S. government, they received bad news: Their code needed more work. OpenSSL is the default encryption engine used by much of the internet, and the government was adamant that any program it gave approval to would be stringently tested. Marquess, a consultant for the Department of Defense, had given years of his life and his whole project’s budget to getting this approval — the government’s official money had run out six months after the project began in 2003.

“We kept getting requirements to make silly changes,” explains Marquess, now a 59-year-old biker who has traded government work for equally stressful 40-hour-or-more weeks in the shadow of Sugarloaf Mountain near Adamstown, MD. “And we kept making them.”

Marquess was mostly acting as a liaison between the government and his sometime partner — and the genius behind the upkeep of the OpenSSL code itself — Stephen Henson, a reserved, reclusive 46-year-old Brit with a Ph.D. in graph theory mathematics who lives in Staffordshire, England. “Everyone was preparing to walk away” from the project because of the difficulties, writes Henson via email. This is his first and only public comment on OpenSSL since the Heartbleed bug — a routine coding error that triggered the largest security breach in the history of the human race, compromising passwords and sending companies and governments scrambling — became known to the general public on April 7.

Early in the morning of Thursday, June 15, 2006, Marquess and Henson were sent a near-impossible task. The Cryptographic Module Validation Program, a joint U.S.–Canadian validation body that fell under the auspices of each country’s government, wanted the team to make a raft of complicated code changes to meet the requirements for accreditation under its security standard. And it had to be done fast. “If we didn’t do it by Monday morning, they’d reject our validation, we’d have to start over again, and it’d take another three years,” says Marquess, bitterness rising in his voice at the memory. “Now this is a huge amount of work — days of silly, pointless work. And this pissed Steve Henson off.”

Henson was on his summer vacation in Great Yarmouth, a seaside resort in Norfolk, England. He had nothing but an HTC Hurricane cell phone, a laptop, and a frustratingly slow internet connection for company. Like Marquess, he was incensed: After several years struggling with government bureaucracy for little to no pay, he wasn’t about to give up now. As Marquess puts it, “At that point in time, completing the project became a matter of stubbornness.”

Henson sent off an email to Marquess around 4 a.m. “I was irritated by this ultimatum, couldn’t sleep, and decided to use this time to see if I could get a solution,” Henson explains.

“And he got it done,” Marquess says quietly. Henson worked through the night, and sent off a preliminary solution that could work. Big government was placated. “That’s the kind of guy you want at your back.”

The events of that weekend brought the two men close together with a bond that is just as strong today as it ever was — despite the fact that they’ve never met in person. “My skills, such as they are, lie in coding. I am not a businessman,” Henson writes. “Steve Marquess is far better at that side of things than I am. In short he handles the things I cannot and vice-versa.”

Before that June weekend and since, companies and government departments have benefitted from OpenSSL’s free price and constant updates, often without giving back. Overwork and understaffing — two things that have been cited as the main causes of the Heartbleed bug, which suddenly brought OpenSSL and its gatekeepers to the world’s attention — aren’t news to Steve Henson and Steve Marquess. But thanks to Heartbleed, everyone else is beginning to understand what the duo have known for a while: Something needs to change, and goodwill and fond words alone won’t cut it. Right now significant parts of the internet’s cryptographic security rely on a tiny handful of people who are already stretched to the limits. If that fails, the modern world as we know it could cease to work as it should.

Open-source software has been a boon for all, giving users access to high-powered free versions of commercial software, no strings attached. Businesses and governments have noticed the benefits and warmly embraced open source. Companies like Barclays have been able to cut spending on software by 90% by changing their program allegiances. The U.S. government is picking up the pace of development of its own open-source programs, while the U.K.’s minister in charge of technology implementation across government believes significant savings [PDF] can be made by using free tools.

If it weren’t for security toolkits like OpenSSL, your personal information would theoretically be vulnerable every time you log into Instagram or Gmail, or enter your credit card details on Netflix or Etsy. Thirty billion e-commerce transactions were estimated [PDF] by consultancy CapGemini to have been carried out last year; a significant share of those were handled in part by OpenSSL.
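To give a concrete sense of what “handled in part by OpenSSL” means, here is a minimal sketch of how an application might use the library’s BIO interface to open an encrypted connection and fetch a page. It is illustrative only: the host name is a placeholder, error handling is pared to the bone, and it uses the classic pre-1.1.0 API that sites of this era would have linked against.

/* Illustrative sketch only: a tiny HTTPS client built on OpenSSL's BIO
 * interface (classic pre-1.1.0 API). The host is a placeholder and error
 * handling is minimal. Build with: cc client.c -lssl -lcrypto */
#include <stdio.h>
#include <openssl/ssl.h>
#include <openssl/err.h>

int main(void) {
    SSL_library_init();                 /* register ciphers and digests */
    SSL_load_error_strings();

    SSL_CTX *ctx = SSL_CTX_new(SSLv23_client_method());
    if (!ctx) { ERR_print_errors_fp(stderr); return 1; }

    /* A BIO chain that handles both the TCP connect and the TLS handshake. */
    BIO *bio = BIO_new_ssl_connect(ctx);
    BIO_set_conn_hostname(bio, "example.com:443");   /* placeholder host */

    SSL *ssl = NULL;
    BIO_get_ssl(bio, &ssl);
    if (!ssl) { fprintf(stderr, "no SSL pointer\n"); return 1; }
    SSL_set_mode(ssl, SSL_MODE_AUTO_RETRY);

    if (BIO_do_connect(bio) <= 0) {     /* connect and negotiate TLS */
        ERR_print_errors_fp(stderr);
        BIO_free_all(bio);
        SSL_CTX_free(ctx);
        return 1;
    }
    printf("negotiated cipher: %s\n", SSL_get_cipher(ssl));

    /* Everything written or read past this point travels encrypted. */
    BIO_puts(bio, "GET / HTTP/1.0\r\nHost: example.com\r\n\r\n");
    char buf[1024];
    int n;
    while ((n = BIO_read(bio, buf, sizeof buf)) > 0)
        fwrite(buf, 1, (size_t)n, stdout);

    BIO_free_all(bio);
    SSL_CTX_free(ctx);
    return 0;
}

A real client would also load a trust store and verify the server’s certificate before sending anything sensitive; that step is omitted here to keep the sketch short.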

The program’s roots start in Australia in 1995 with the development of a cryptographic protocol implementation called SSLeay, created by Tim Hudson and Eric Young (the “eay” stands for Eric A. Young). There were ways of encrypting information passed from person to person and peer to peer online before SSLeay, but strict export laws in the U.S., where those progenitors were developed, meant that unless you lived in the 50 states or a dependent territory, you couldn’t access them. In fact, a quirk of the U.S. legal system meant that cryptography was, until the late 1990s, placed on the U.S. Munitions List [PDF], alongside semi-automatic firearms and tanks.

“We actually had no choice in terms of a new implementation from scratch,” says Tim Hudson. “At that time you simply couldn’t license full-strength cryptographic toolkits from any of the inventors of the technology — U.S. companies could not export it. So if you were non-U.S.A.-based you either had to go with deliberately weakened security systems or write your own.”

Beginning that year, Hudson and Young wrote their own SSL implementation, and for three years supported its development before they moved into the paid sector at RSA Security in late 1998. That left a gap that needed filling.

On Dec. 18, 1998, Ben Laurie, who had been involved in the upkeep of SSLeay, sent an email to subscribers of the Apache-SSL mailing list with the subject line “[ANNOUNCE] New version of SSLeay.” Laurie wrote that he and Stephen Henson, a fellow SSLeay coder, were looking for advice, suggestions, and a name for a new version of SSLeay they were continuing in the absence of Young and Hudson. One response suggested “OpenSSL” as the name. It’d take a few weeks — and a new year — until Henson and Laurie felt willing to reveal any more about their project.

On Thursday, Jan. 7, 1999, Laurie sent out another email. This one, titled “ANNOUNCE: OpenSSL (Take 2)” and complete with some ASCII art, declared the start of the OpenSSL project, “a collaborative effort to develop a robust, commercial-grade, fully featured, and Open Source toolkit implementing the Secure Sockets Layer (SSL v2/v3) and Transport Layer Security (TLS v1) protocols with full-strength cryptography world-wide.”

OpenSSL took on the 165,000 lines of code that formed SSLeay and began rapidly building it out over the coming decade. During this time, the number of users of OpenSSL increased too — including some within the highest levels of government. The Defense Advanced Research Projects Agency (DARPA) and the U.S. Department of Homeland Security have both in the past confirmed their use of OpenSSL. Big companies and government clients became comfortable with the workings of OpenSSL, and in the early 2000s further solidified its role as a crucial cornerstone of the internet’s infrastructure.

As Steve Marquess gradually drew down the work on his government contract in 2009, he still had strong connections to those — particularly Steve Henson — he had met through the OpenSSL project. When they came to him that year asking for help with consulting contracts, Marquess went one better: He set up the OpenSSL Software Foundation (OSF) for the explicit purpose of raising revenue to fund OpenSSL’s development. Unlike many entrepreneurs, Marquess wasn’t hoping for a bounteous exit or public acclaim. For him, OpenSSL was a passion, one he could focus on in retirement: “My daughter’s graduating from school, the house is paid off, I don’t have to worry about starving anymore, and I’m thinking what can I do to help these guys? So I say I’ll do the only thing I know how to do: hustle a small business. That’s why I created OSF: for the explicit purpose of raising revenue.”

“Steve Henson’s always been there for me,” he continues. “I feel like I’m doing a good thing for OpenSSL and him right now. You can’t expect the people coding OpenSSL to starve. But that’s what tended to happen and that’s what, when I first met Steve Henson, was happening to him.” (According to one source, before the foundation was created, Henson earned around $20,000 a year.)

The OpenSSL Software Foundation has never received more than $1 million in income in a given year. It survives mostly through for-hire contracts with big companies. These can range from ad hoc arrangements earning $250 an hour to longer-term work for hire over the course of several years. A fraction of the OSF’s income is donations from supporters and well-wishers. From that cache of money, running costs for the foundation such as outsourcing validation testing (which runs into hundreds of thousands of dollars a year) and new servers and equipment — a recent server upgrade in Germany cost $8,200 alone — are taken out. After that, there’s not much remaining.

“The OpenSSL Foundation has some very devoted people,” says Matthew Green, an assistant research professor at Johns Hopkins University and an outspoken critic of OpenSSL. “It just doesn’t have enough of them, and it can’t afford enough of them.”

The talent pool from which the foundation can draw is shallow to begin with. As Chet Wisniewski, a senior adviser at Sophos Security, explains it: “You need someone who’s both a programmer and a cryptographer, which narrows the field down to a very small number.”

The fact that OpenSSL pays next to nothing constrains things further. Those who do help Henson out often juggle coding with full-time paying jobs elsewhere. Others can’t code for OpenSSL: Their employment contracts prohibit it, so they simply act as advisers. That leaves Henson responsible for 60% of the code commits (or sets of changes to the source code), twice as many as the next-most-prolific developer, Andy Polyakov, who receives some compensation. Most of the code added in the past few years has been approved by Henson, or tapped out on his own keyboard.

The come-and-go, casual nature of the group means that hierarchies aren’t formalized. Marquess can’t say exactly how many people help out with its development at any one time, but directs me to a list on the foundation’s website naming seven active contributors. He points out that until April 23 the list was out of date — and included at least one person who is deceased.

As a result, OpenSSL’s code is a slurry of cobbled-together snippets that work — but only just. It’s strewn with developers’ comments to one another, sandwiched between slashes. Some of them are aesthetic, like, “BIG UGLY WARNING! This is so damn ugly I wanna puke … ARGH! ARGH! ARGH! Let’s get rid of this macro package. Please?” Some are outright petrifying, like the comment that reads, “EEK! Experimental code starts.” They’re unflinchingly honest, yes, but they give an insight into the chaotic nature of the code that makes the program.

“OpenSSL’s code isn’t clear,” says Kenny Paterson, a professor of information security at Royal Holloway, University of London, who’s been working in cryptography research since 2000. “It’s a rat’s nest, full of stuff that’s been outmoded.”

This stems in part from how its current funding structure affects its priorities: For now, OpenSSL’s development lives and dies by the OSF’s commercial income, almost all of which comes from putting in new features, rather than maintaining the old. The current setup means, Steve Marquess readily admits, that “the fundamentals of OpenSSL are being neglected. No one is hiring us to maintain the current code base.”

And of course mistakes can happen. When they do, there currently isn’t the money to pay a person, never mind a team, to go through the 456,332 lines of source code with a fine-tooth comb to find them.

There are plenty of people looking at the OpenSSL code, including professor Paterson, whose Information Security Group (ISG) at Royal Holloway is incentivized by the University of London to conduct research into OpenSSL and to spot bugs. But Paterson points out that there are significantly fewer people writing the code within the OpenSSL project than there are on the outside looking in, scanning it for weaknesses — and the former are adding more stuff in, not taking stuff out and cleaning up the contradictions contained within the code base. The code base has roughly tripled in size since late 1998 and early 1999, when SSLeay became OpenSSL; on average, 1,575 new lines of code have been added each month for the past 15 years.

Some 90,000 such lines of extraneous code have been removed in a week by the coders of LibreSSL, a renegade offshoot of OpenSSL, without affecting the way it runs. Within such a confusing maze of code, a simple mistake like Heartbleed was bound to be difficult to discover.

“You and I can look at that code all day long and we’re not going to find the Heartbleed flaw,” says Sophos Security’s Wisniewski. “These teams are very small and barely funded.” (One concern is that big government does have the money, may have found the exploit, and could have kept quiet about what it knew.)

Marquess was well aware of the situation OpenSSL faced when, last December, he pulled his Subaru truck into the parking lot of the Woodstock Inn, a biker bar situated alongside Route 125 in Woodstock, Md., that advertises itself as the home of good food, good music, good friends, and good times.

The man Marquess was meeting, Matthew Green, the Johns Hopkins professor and critic of OpenSSL, was hardly a good friend. The tunes played in the bar weren’t all that great. As for good times, in the daytime lull when the two men met, the bar was pretty much deserted. And for what it’s worth, Green wasn’t that impressed with his $10 burger, either.

Marquess and Green’s meeting coincided with an allegation that the U.S. government had tampered with OpenSSL. The Guardian and the New York Times had recently published details — based on documents leaked by Edward Snowden — of an NSA decryption program, code-named BULLRUN, designed to circumvent and weaken the encryption software and standards that kept people safe online. Their conversation moved from BULLRUN to the state of OpenSSL’s funding.

“Steve agreed with me that it was ridiculous and we needed to make the [OpenSSL] toolkit better,” says Green, “but then he told me how bad things were. It turns out there was only really one full-time developer, and that explained a lot of the problems.”

“There’s a cumulative effect,” says Paterson. “There’s water building up behind the dam and with Heartbleed, the dam has burst.” In his opinion, those coding the SSL implementation aren’t blameless, though: “They are under-resourced,” he says, “but they don’t do themselves any favors.”

In the weeks since the dam did overflow, some have argued that taking OpenSSL private would have encouraged investment and reduced the risk of a trivial error unlocking an enormous amount of private data. That would run counter to the beliefs of those involved in OpenSSL, though.

“Open-source projects are a fascinating phenomenon, and OpenSSL is almost a stereotypical example,” explains Marquess. “A handful of people get together, and they scratch their own itch. They write code because it pleases them. Because it’s open source and people find it useful, they build collaborative community forums where people can exchange ideas.”

Another major criticism laid against the group’s door is that the code is programmed in C, a programming language that has little to no built-in error checking. Had OpenSSL been coded in a safer language, goes this line of reasoning, the Heartbleed bug would never have occurred. That could well be true, but it’d be a gargantuan task that would involve translating nearly half a million lines of code, built up over nearly two decades, without introducing any more errors. All humans are fallible. OpenSSL, out of necessity, just relies on fewer humans than is ideal.

When a computer pings a server using OpenSSL, it asks for a “heartbeat,” which includes a minuscule amount of information, to prove that both sides are still working. So far so normal. But a coding error — one that most experts, including the man who submitted it for checking, agree was a pretty basic mistake — meant that a nefarious user could force the server to return data beyond its minuscule heartbeat: up to 64K of memory, enough for 65,536 characters of plain text, or the Gettysburg Address 44 times over.

What’s in that memory, no one knows. Often it’s gibberish — totally useless information. But sometimes it’s something important, like security keys, passwords, or personal details. It’s the luck of the draw. Of course, mechanizing the exploitation of the Heartbleed bug increases dramatically the chance of a hacker lucking upon something someone would regret losing. Doing that often enough could result in building up a decent database of private and personal information. What made the Heartbleed bug so concerning was that any incursion on a vulnerable computer system was undetectable: Hackers could have been exploiting the hole for months or years, and no one would be the wiser.
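To make that mechanism concrete, here is a small, self-contained C sketch of the unchecked-length pattern at the heart of the bug. It is not the actual OpenSSL source; the structure, the function names, and the planted “secret” are invented for illustration. The vulnerable handler copies however many bytes the requester claims to have sent, while the fixed handler first checks that claim against the payload actually received.

/* A simplified, self-contained sketch of the unchecked-length pattern behind
 * Heartbleed. NOT the actual OpenSSL source; names and buffers are invented.
 * The "secret" is deliberately placed in the same array as the payload so the
 * demo itself stays well-defined while still showing the leak. */
#include <stdio.h>
#include <string.h>

/* A heartbeat request: the sender supplies a payload and *claims* its length. */
struct heartbeat {
    unsigned short claimed_len; /* 16-bit length field, controlled by the requester */
    const char *payload;        /* pointer to the received payload bytes */
    size_t actual_len;          /* how many payload bytes really arrived */
};

/* Vulnerable pattern: echo back 'claimed_len' bytes without checking it against
 * the real payload size, so whatever sits in memory after the payload leaks out. */
static size_t respond_vulnerable(const struct heartbeat *hb, char *out, size_t out_cap) {
    size_t n = hb->claimed_len;
    if (n > out_cap) n = out_cap;       /* only bounded by the response buffer */
    memcpy(out, hb->payload, n);        /* copies past the real payload */
    return n;
}

/* Fixed pattern: refuse to copy more than the payload actually contains. */
static size_t respond_fixed(const struct heartbeat *hb, char *out, size_t out_cap) {
    size_t n = hb->claimed_len;
    if (n > hb->actual_len) return 0;   /* bounds check: drop bogus requests */
    if (n > out_cap) n = out_cap;
    memcpy(out, hb->payload, n);
    return n;
}

int main(void) {
    /* Pretend server memory: a 5-byte payload followed by something sensitive. */
    char memory[] = "HELLOsecret-password";
    struct heartbeat hb = { .claimed_len = 20, .payload = memory, .actual_len = 5 };

    char out[64] = {0};
    size_t leaked = respond_vulnerable(&hb, out, sizeof out - 1);
    printf("vulnerable handler returned %zu bytes: %.*s\n", leaked, (int)leaked, out);

    memset(out, 0, sizeof out);
    size_t ok = respond_fixed(&hb, out, sizeof out - 1);
    printf("fixed handler returned %zu bytes\n", ok);
    return 0;
}

The fix eventually applied to OpenSSL amounts to the same check: a heartbeat whose stated payload length exceeds the record actually received is silently discarded.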

Near midnight on New Year’s Eve 2011, OpenSSL accidentally introduced a bug into the heartbeat routine. The code change, which had first been drafted a few weeks earlier by a German developer named Robin Seggelmann, went through a series of rewrites, according to publicly available correspondence, before getting the OK from Stephen Henson. It stood in the code base for more than two years before being found.

On April 1 of this year, Google informed OpenSSL about the vulnerability, which one of its researchers, Neel Mehta, had uncovered. Finnish security company Codenomicon would soon dub it “Heartbleed.” The OpenSSL team informed some companies running OpenSSL of the breach and encouraged them to patch the hole in their systems. Mehta claimed a $15,000 bounty for finding the error, and promptly donated it to the Freedom of the Press Foundation, a charity that develops and supports open-source encryption software to help journalists work securely.

Cryptographic experts soon realized this was a big deal. Codenomicon, recognizing the importance of the bug, took the unusual step of setting up a website explaining what the exploit it called Heartbleed did in layman’s terms; that was picked up by the press, who spread it far and wide — and not always accurately. Many reported that OpenSSL is the default encryption engine for an estimated two-thirds of the internet’s websites. But not all websites use an encryption layer, and not all of them were vulnerable to the Heartbleed bug. One GitHub user scanned 10,000 websites on April 8, half of which had no SSL encryption at all. Of those that did, 15% were exposed to the vulnerability.

Since then the press and public have demanded to know who these people are and how Heartbleed happened. (Some more conspiratorially minded commentators have alleged the bug was a result of purposeful negligence.) Marquess and Henson and the rest have for the most part hidden from the press and attention. As custodians of the internet, they are answerable to their users. But they do see things a bit differently: They don’t have anything to hide, but they also don’t want to give themselves up to scrutiny. They weren’t elected and they didn’t seek the limelight.

“You have this collection of things that didn’t happen,” explains Matthew Green. “You had somebody putting in a bug in the first place, somebody failing to catch it when it was put into the code, you have every other developer in the world failing to catch it later, and then you have nobody running machine tests to detect these conditions that might have done something about it, at least until this year.”

“This has been a wake-up call and a transformative event,” says Marquess. He’s still working 10–12 hour days, just as before; he sounds tired, his larynx loose. At times he loses names mid-sentence.

“It’s fair to say the group has not been tightly organized,” he admits, picking his words carefully. “Many of these guys to this day have never met face-to-face,” he explains. “When you work in those conditions absolutely the only thing you know about that person is the quality of his work. And this group are the best of the best of the best.

“I go out and I sign contracts with companies with my signature on it for large amounts of money, and I’m committing to it,” Marquess explains. Emphatically, he adds, “I put my signature on that. And he and I have no legal documents between us; we never have. It’s all been a virtual handshake, if you will. [Henson] tells me he’ll do something and how long it’ll take, and he has never failed to meet a deadline or a deliverable. The guy’s like a rock.”

“The biggest thing to learn here is that OpenSSL is now part of our critical infrastructure,” says Paterson. And like most other elements of critical infrastructure around the world — highways, medical services, and banks — OpenSSL needs a secure source of funding and some real attention.

Steps need to be taken to ensure that a critical part of our online world won’t just collapse in on itself if either Steve Marquess or Stephen Henson walks away tomorrow.

“We need effectively applied manpower,” Marquess tells me. He notes he chooses his words carefully, and repeats the term several times during our conversations. “I want a recurring funding source that will pay for several full-time SSL developers.”

I ask how much that’d be in an ideal world. Marquess points me to several similar open-source coding projects as a yardstick: the Apache Software Foundation ($905,732 total revenue in fiscal year 2012–13), the Linux Foundation (annual income of $6.25 million as of April 2014), and the Mozilla Foundation ($311 million in 2012). But, I repeat, how much does OpenSSL need?

“A few million a year would do grandly,” he says. He sounds tired. “There should be half a dozen guys working full-time, plus support.”

On Thursday, April 24, the Linux Foundation announced it was establishing the Core Infrastructure Initiative, a multimillion-dollar project with the backing of big name companies such as Google, IBM, Facebook, and Microsoft, “to fund open source projects that are in the critical path for core computing functions.” Some $3.9 million will be spent over the next three years, with big companies each paying at least $100,000 into the annual funding pot. First up on the list of core infrastructure that needs improving, its organizers say, is OpenSSL.

“The Core Infrastructure Initiative looks very promising and I hope to see great things come of it,” Marquess says. Though the details have yet to be sorted out, that cash will be of great help. That said, especially because it will be split among multiple projects, it will also probably not be enough.

In the nearer term, OpenSSL is taking its own definitive steps toward self-sustainability. For several weeks rumors have been afoot that OpenSSL would add a second full-time developer to its ranks.

Though Marquess is quick to say that talk is cheap, and nothing has yet been committed to paper, he has exclusively confirmed to BuzzFeed that one of the project’s current part-time contributors, who presently holds a day job, will likely be able to devote his or her efforts to improving OpenSSL on a full-time basis within weeks.

More immediately, Marquess says, discussions are at an advanced stage to transition Stephen Henson into a position where he is solely tasked with the maintenance and improvement of OpenSSL, rather than juggling the philanthropic development with lucrative commercial work. “This’ll free Steve up to concentrate on the upkeep of the OpenSSL code base,” says Marquess.

That reorganization, in one fell swoop, would double the capable full-time productivity of the OpenSSL team, and mitigate the risk of code holes like Heartbleed that stem from overwork and understaffing. But the full-time staff is not the only sector of the internet’s security custodians that will be growing. Another developer capable of managing the complexities of the online cryptography program has this morning joined the OpenSSL team on a part-time basis, BuzzFeed can exclusively reveal. Matt Caswell, a U.K.-based developer and cryptography expert, was invited to join the project by the seven current active team members.

It’s all been a huge shot in the arm for OpenSSL and the OpenSSL Software Foundation, and Marquess doesn’t want to stop there. “As appropriate funding becomes available,” he writes in an email, “the OpenSSL team will expand to be bigger, better, and more effective.”

Stephen Henson follows up with his own email to the group. “Changes are coming,” he writes, “big changes for the better.”

And yet both men remain devoted to the project’s independent, open-source identity. That will stay even in the face of big money. Long hours for little remuneration would be preferable to being in hock to a single supporter.

“It’s not even acceptable to me to rely entirely on funding from any one specific interest, whether they attach conditions or not,” says Marquess. “That would in and of itself be an undue influence.” Ω

[Chris Stokel-Walker is a 24-year-old freelance writer for BuzzFeed, The Economist, The Sunday Times, and The Magazine, based in the UK. He received a BA (English Literature) from the University of Newcastle-upon-Tyne.]

Copyright © 2014 BuzzFeed, Inc.



This work is licensed under a Creative Commons Attribution 4.0 International License.

Copyright © 2014 Sapper's (Fair & Balanced) Rants & Raves