Want Better Security? Bridge the Gap Between Dev and Ops

Computer security guru Gary McGraw is famous for pushing developers to take responsibility for building secure software. The operations side can only do so much with buggy applications and flawed systems, he has said. It's up to "the guys who build stuff for a living" to stop thinking about security as a feature, and to begin seeing it as an emergent property of a whole system.

McGraw, CTO of software quality management provider Cigital, still believes that bad software is the root of the IT security problem (not to mention all evil), but at this year's RSA Security Conference, held recently in San Francisco, he spread the blame around.

"The guys who do security in the real world tend not to be software people," he told attendees during his "How to Break Code" session at the conference. "They're operations people. They're doing work that is very necessary, but they haven't been very helpful to the software guys, who really are trying to figure out how to build things to be secure."

There is a chasm between development and operations, McGraw said, and it's up to the security community to bridge the gap. How? By talking candidly and in detail about software exploits.

"A lot of people say, How could you possibly talk about software exploits?" he said. "That is so irresponsible! You're going to teach a whole new generation of people how to hack better. Well, that's exactly what they told Dan Farmer a decade ago."

Back in 1995, security expert Dan Farmer released the Security Administrator Tool for Analyzing Networks (SATAN), which was designed to automatically find security holes in computer systems. Farmer, who co-created the tool with Wietse Venema in his spare time, said that he designed it to help network administrators find and close security vulnerabilities before hackers could exploit them. Farmer was widely criticized for developing what was considered an ideal hacker tool, and he lost his job at Silicon Graphics when his bosses heard about his plans to give the program away.

"That was the state of the practice back then," McGraw said. "If you ran a penetration tool against your own network, you got fired. Today, if you fail to run network penetration and testing tools against your own network, what happens to you? You get fired. We've made a big shift on the network side."

But McGraw said that he hasn't seen the same shift on the development side. He cited as an example last year's controversy over a malware course taught by assistant professor John Aycock in the University of Calgary's Department of Computer Science. The school came under fire for the course, which was designed to give students an opportunity to study malware techniques, along with countermeasures and "benevolent" forms of malware. The course also covered legal and ethical questions, as well as general computer and network security issues.

"When John Aycock announced that he was going to teach the course, all the security vendors jumped on him," McGraw said. "They railed at him, in fact. How can you possibly teach a course about viruses and malicious code! That's irresponsible, and you're going to make a whole new generation of people who can do that! That attitude is as wrongheaded today as it was in 1995."

"We've absolutely got to talk about this stuff," he added. "We've got to learn about it so that we don't build defenses that are complete baloney, which we're doing today. A lot. It makes us feel good when we spend money on security, but if it doesn't work, who cares?"

In fact, talking about security disasters could serve as a kind of "bait" to get more people interested in security, he said.

"The world loves to talk about stuff that breaks," McGraw observed ruefully. "NASCAR fans watch it for the crashes. It's something that gets people excited. Talking about engineering cars and blueprints is boring. Nobody wants to talk about that. They want to talk about the crashes."

To attendees who doubted the NASCAR example, McGraw pointed to two of his own books. (He's written seven so far.) "Building Secure Software" (Addison-Wesley, 2001) did "really well for a little book on computer security" when it hit the shelves. But his latest book, "Exploiting Software: How to Break Code" (Addison-Wesley, 2004), outsold it on its first day.

"It's just a fact of life: People would rather talk about the way stuff breaks than the way stuff doesn't break," McGraw said. "We in the security community need to use that as the bait to get people interested in thinking about how to build things right. We need to use that to our advantage."

Along the way, it might be a good idea to walk a mile in the other guy's shoes.

"How many of you have been through, or perpetrated on someone else, a security review?" McGraw asked his audience. "Over the past ten years, corporations have embraced the notion of a security review. Here's how that looks from the developers' perspective: You spend a lot of time and effort building this incredible thing, and you're knocking yourself out, and you hit your date, and then they bring in the security weenies to tell you why your baby is ugly. Well guess what, when we do that, the guys who busted their humps to get the thing done don't like us anymore. We shouldn't be surprised that developers run when they see us coming. Clearly, we need to talk to them about how to do it right in the first place."