NYT: The Threat Is Complexity Itself

Thursday, September 13, 2007

Everybody knows hackers are the biggest threat to computer networks, except that it ain’t necessarily so. Yes, hackers are still out there, and not just teenagers: malicious insiders, political activists, mobsters and even government agents all routinely test public and private computer networks and occasionally disrupt services. But experts say that some of the most serious, even potentially devastating, problems with networks arise from sources with no malevolent component. Whether it’s the Los Angeles customs fiasco or the unpredictable network cascade that brought the global Skype telephone service down for two days in August, problems arising from flawed systems, increasingly complex networks and even technology headaches from corporate mergers can make computer systems less reliable. …

“We don’t need hackers to break the systems because they’re falling apart by themselves,” said Peter G. Neumann, an expert in computing risks and principal scientist at SRI International, a research institute in Menlo Park, Calif. Steven M. Bellovin, a professor of computer science at Columbia University, said: “Most of the problems we have day to day have nothing to do with malice. Things break. Complex systems break in complex ways.” When the electrical grid went out in the summer of 2003 throughout the Eastern United States and Canada, “it wasn’t any one thing, it was a cascading set of things,” Mr. Bellovin noted.

That is why Andreas M. Antonopoulos, a founding partner at Nemertes Research, a technology research company in Mokena, Ill., says, “The threat is complexity itself.” Change is the fuel of business, but it also introduces complexity, Mr. Antonopoulos said, whether by bringing together incompatible computer networks or simply by growing beyond the network’s ability to keep up. “We have gone from fairly simple computing architectures to massively distributed, massively interconnected and interdependent networks,” he said, adding that as a result, flaws have become increasingly hard to predict or spot. Simpler systems could be understood and their behavior characterized, he said, but greater complexity brings unintended consequences. “On the scale we do it, it’s more like forecasting weather,” he said. …

In the case of Skype, the company — which says it has more than 220 million users, with millions online at any time — was deluged on Aug. 16 with login attempts by computers that had restarted after downloading a security update for Microsoft’s Windows operating system. …
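
The pattern behind the Skype outage is what engineers call a thundering herd: huge numbers of clients, all rebooted by the same patch cycle, trying to log in at the same moment, with each synchronized retry re-creating the wave. Skype’s client code is not public, so the sketch below shows only the standard generic defense, randomized exponential backoff; the login_with_backoff function, the attempt_login callable and all of the parameters are hypothetical stand-ins, not anything from Skype.

    import random
    import time

    def login_with_backoff(attempt_login, max_retries=8, base=1.0, cap=300.0):
        # attempt_login is a hypothetical callable standing in for whatever
        # request the real client makes; True means the login succeeded.
        for attempt in range(max_retries):
            if attempt_login():
                return True
            # "Full jitter": wait a random time in [0, base * 2^attempt],
            # capped at `cap` seconds. The randomness spreads out clients
            # that all failed at the same instant, so the servers see a
            # trickle of retries instead of another synchronized wave.
            time.sleep(random.uniform(0, min(cap, base * 2 ** attempt)))
        return False

Without the jitter, every client computes the same wait after a shared failure and the herd simply reassembles a few seconds later; the random spread is the part that actually breaks the cycle.
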
As computer networks are cobbled together, said Matt Moynahan, the chief executive of Veracode, a security company, “the Law of the Weakest Link always seems to prevail.” Whatever flaw or weakness allows a problem to occur compromises the entire system, just as one weak section of a levee can inundate an entire community, he said.

This is not a new problem, of course. The first flight of the space shuttle in 1981 was delayed minutes before launching because of a previously undetected software problem. The “bug heard round the world,” as a former NASA software engineer, John R. Garman, put it in a technical paper, came down to a failure that would emerge only if a certain sequence of events occurred — and even then only once in 64 times. He wrote: “It is complexity of design and process that got us (and Murphy’s Law!). Complexity in the sense that we, the ‘software industry,’ are still naïve and forge into large systems such as this with too little computer, budget, schedule and definition of the software code.”

In another example, the precursor to the Internet known as the Arpanet collapsed for four hours in 1980 after years of smooth functioning. According to Dr. Neumann of SRI, the collapse “resulted from an unforeseen interaction among three different causes” that included what he called “an overly lazy garbage collection algorithm” that allowed the errors to accumulate and overwhelm the fledgling network.
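
Dr. Neumann’s “overly lazy garbage collection” has a well-documented mechanism behind it. In the post-mortem (Eric Rosen, RFC 789), a failing memory bit produced three copies of a single routing update carrying the 6-bit sequence numbers 8, 40 and 44, and the protocol’s circular “which is newer?” test put those three in a cycle, so the collector could never settle on one copy to keep. The sketch below reproduces that comparison as the RFC describes it; it is an illustration of the published account, not actual Arpanet code.

    # RFC 789 ordering: update b supersedes update a if b is "later" on the
    # 64-entry circular sequence-number space (within half the ring ahead).
    def newer(b, a):
        return (a < b and b - a <= 32) or (a > b and a - b > 32)

    print(newer(40, 8))   # True: 40 supersedes 8
    print(newer(44, 40))  # True: 44 supersedes 40
    print(newer(8, 44))   # True: 8 supersedes 44 -- the cycle closes
    # Because each copy looked newer than one of the others, garbage
    # collection never discarded any of them, and the nodes kept flooding
    # all three updates until the network's buffers were exhausted.

Three values out of sixty-four, one bad bit, and four hours of downtime: exactly the kind of unforeseen interaction among causes that Dr. Neumann describes.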

Where are the weaknesses most likely to have grave consequences? Every expert has a suggestion. Aviel D. Rubin, a professor of computer science at Johns Hopkins University, said that glitches could be an enormous problem in high-tech voting machines. “Maybe we have focused too much on hackers and not on the possibility of something going wrong,” he said. “Sometimes the worst problems happen by accident.” …

Making systems strong enough to recover quickly from the inevitable glitches and problems can keep disruption to a minimum. …

The best answer, Dr. Neumann says, is to build computers that are secure and stable from the start. A system with fewer flaws also deters hackers, he said. “If you design the thing right in the first place, you can make it reliable, secure, fault tolerant and human safe,” he said. “The technology is there to do this right if anybody wanted to take the effort.” …

Dr. Neumann, who has been preaching network stability since the 1960s, said, “The message never got through.” Pressures to ship software and hardware quickly and to keep costs at a minimum, he said, have worked against more secure and robust systems. “We throw this together, shrink wrap it and throw it out there,” he said. “There’s no incentive to do it right, and that’s pitiful.”

Reference: http://www.nytimes.com/2007/09/12/technology/techspecial/12threat.html
