It's easy to assume that hackers work way above our pay grade. Electronic intruders must be able to exploit vulnerabilities in the software we use because they're evil geniuses, right?
That may be the case in some very sophisticated attacks, experts say, but in others, not so much. Programmers -- the people who create the software -- don't write all their code from scratch, instead borrowing freely from others' work.
The problem: they're not vetting that code for security flaws. Working more as code assemblers than as writers, programmers source about 80 percent to 90 percent of the code in any given software application from third parties, many experts estimate. Sometimes programmers buy code from other companies, and sometimes they use open-source code that's free for anyone to use.
The problem affects all software, which means everything from the mobile apps on your smartphone to your favorite website to the programs you run on your computer. Everything except for the operating system on a device or computer is likely composed of building blocks of code rather than created wholly new, said Chris Wysopal, co-founder and executive at software security company Veracode.
The priority for all those highly paid programmers is speed, not security, Wysopal said. His company, which assesses software for businesses, released a report Tuesday analyzing its own clients' habits when it comes to software use. Veracode found 6.9 million flaws in more than 200,000 inspections of code used by its clients over the last year. Those clients fixed 4.7 million of the flaws. While in-house programmers likely wrote some of that code, industry numbers suggest the vast majority of it came from elsewhere.
"That's the trend -- to reuse as much code as possible," Wysopal said. It speeds up production time and lets software programmers work on solving new problems instead of reinventing the wheel. "Everything is good about that except for the inheriting-vulnerabilities part," Wysopal said.
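The "inheriting vulnerabilities" problem Wysopal describes can be sketched in a toy example: when two apps reuse the same flawed helper, both carry the same hole. All of the function names and the naive "sanitizer" below are invented for illustration, not taken from any real library:

```python
# Toy sketch of vulnerability inheritance: a flawed shared helper
# (a naive input "sanitizer") is reused by two different "apps",
# so both inherit the same weakness. All names are hypothetical.

def shared_sanitize(user_input: str) -> str:
    # Flawed third-party helper: strips "<script>" only once, so a
    # nested payload like "<scr<script>ipt>" survives the cleanup.
    return user_input.replace("<script>", "")

def app_one_comment(text: str) -> str:
    # First app reuses the shared helper for comment rendering.
    return "<p>" + shared_sanitize(text) + "</p>"

def app_two_profile(bio: str) -> str:
    # Second, unrelated app reuses the very same helper.
    return "<div>" + shared_sanitize(bio) + "</div>"

payload = "<scr<script>ipt>alert(1)</script>"
# Both apps emit the reconstructed tag, because the flaw lives in
# the one component they share.
print("<script>" in app_one_comment(payload))   # True
print("<script>" in app_two_profile(payload))   # True
```

Fixing the helper once would fix both apps -- which is also why a single disclosed flaw in a popular component can expose thousands of programs at the same time.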
Feds and flaws
Lowest ranking among the industries Veracode checked for security flaws was the federal government. "Part of the reason for this is that the government still uses older programming languages," Veracode researchers wrote in the report. That might not come as a surprise to those following news of multiple breaches of federal government workers' personal records, which compromised the Social Security numbers of millions of current and former federal workers and revealed sensitive personal information on everyone who has applied for a security clearance.
The problem of flawed source code is bad enough that Veracode has made a business out of checking software components for problems, and other companies are similarly offering to vet software components for those speed-hungry programmers. One of those companies is Sonatype, and its chief technology officer, Joshua Corman, says he's on the side of the programmers hitting Ctrl-V, the keyboard shortcut for "paste." Are programmers lazy? No, Corman says, just efficient.
"The best way to put this is the time value of money," he said. "You want to spend your unique talent pool on different problems." Some companies are using services like Sonatype and Veracode, and some are hiring security "fellows" whose paychecks are dependent on finding security flaws in code. Corman's company provides a repository of open-source code, but it also focuses on finding and eliminating problems in the code. In fact, Corman went so far as to check out a major government project for flaws to see if it was vulnerable to hackers.
That project was Healthcare.gov, the website rolled out by the Obama administration to get people signed up for the health insurance mandated by the Affordable Care Act. The website was notoriously buggy when it first went live, and Corman decided to look at the building blocks used by the government contractors who built it to see if hackers might have an avenue into it.
He looked at the third-party code accessed by the developers and concluded it contained some vulnerabilities. But he wasn't sure if those flaws made it into the website's final code. Nonetheless, this news alarmed lawmakers, Corman said. Eventually, those lawmakers learned that federal law doesn't explicitly require software programmers contracted by the government to vet code they didn't write themselves. A proposed fix to the problem -- a bill called H.R. 5793 -- would have required software developers to give the government a list of third-party code, assurance that all the code was free of known flaws, and a guarantee to fix any vulnerabilities that come up later.
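The bill's core requirement -- hand over a list of third-party components and show none has a known flaw -- resembles what dependency-audit tools do. A minimal sketch of that kind of check, with made-up component names, versions, and advisory entries standing in for a real vulnerability database:

```python
# Minimal sketch of the audit H.R. 5793 contemplated: take a list of
# third-party components and flag any with known flaws. The component
# names, versions, and advisories here are invented for illustration.

# Hypothetical advisory database mapping (component, version) to a flaw.
KNOWN_FLAWS = {
    ("examplelib", "1.2.0"): "EX-2014-0001: remote code execution",
    ("samplecrypto", "0.9.1"): "EX-2014-0002: weak default cipher",
}

def audit(components):
    """Return (name, version, advisory) for each known-bad entry."""
    return [
        (name, version, KNOWN_FLAWS[(name, version)])
        for name, version in components
        if (name, version) in KNOWN_FLAWS
    ]

# A toy "list of third-party code" like the one the bill would require.
bill_of_materials = [
    ("examplelib", "1.2.0"),    # vulnerable version -- gets flagged
    ("samplecrypto", "1.0.0"),  # patched version -- passes
]
for name, version, advisory in audit(bill_of_materials):
    print(f"{name} {version}: {advisory}")
```

Real tools work the same way in outline, matching a project's declared dependencies against published advisories, but against live databases of disclosed vulnerabilities rather than a hand-written dictionary.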
Rep. Edward Royce (R-Calif.) introduced the bill in December at the end of the congressional session. The bill never made it to a vote, and Corman said he thinks it might be better suited for an executive order. Other industries have problems with faulty source code, too, according to Veracode's research. Retail and hospitality companies that use Veracode to vet their software had a poor track record with their efforts to encrypt data, for example. Again, this isn't surprising news given the breaches of customer information revealed by major retailers like Target and Home Depot over the past year.
The pace of software development is only speeding up, meaning the problem is harder to keep up with, Wysopal said. "New languages and new environments to write code in are continuously being invented, and companies want to push software out the door as quickly as possible," he said. But speed doesn't have to sacrifice security, he argued. "They don't need to be mutually exclusive. If you build security processes in or if you require vendors to build it in, you can still go fast," Wysopal said. But, he noted, "It can't be an afterthought."