One day, in a discussion with a client, I noticed that something they said resonated with things other clients have said over the years. Dozens of conversations over the years have gone something like this:
"So, you plan to re-install the worm-infected systems from clean media, right?"

"No. We get hit by viruses all the time and nothing really bad has ever happened. We'll just delete the worm files, whack the key from the registry, and go back about our business."

"Do you know what the worm did after it contacted the overseas IRC remote-control channel?"

"No."

"How do you know nothing bad happened?"

If you're a client and you recognize this conversation, don't feel bad; I'm not quoting you. I've had this same basic conversation with many other clients, so you're in good company (I'm not quoting them, either!). Just about every other professional consultant in the information security world that I've ever spoken with has similar "war stories."

People who make their living managing Information Technology shops need to have basic logic and reasoning skills, and for the most part, they do. Oddly, with respect to one particular class of problems -- those for which the solution is perceived to be expensive -- circular reasoning seems to be very popular. When managers don't like the answer that they know, and that the entire cadre of professional security consultants the world over agrees, is the right answer to a particularly painful problem, suddenly you can get whiplash trying to keep up with the coming and going in circles.

The simple fact is, if somebody "0wnz your box, d00d!", no matter the particulars of how they came to own it, you have a very difficult time assuring the security of that system unless you re-install from pristine media. Yes, there exist a few techniques and a few tools that might help you recover certain types of systems under certain circumstances. Would you like to experiment with those techniques on your production systems today? What if the box in question is the PC on your desk? Do you trust the cleanup tool enough to know it didn't leave behind a keystroke logger that the bot downloaded over IRC?
Do you mind if someone outside the organization gains access to your bank account login while your staff are learning the forensic techniques they need to find it? I thought not!

I've had an email signature block around for years -- a quote from physicist Richard Feynman. It's a bit long, and I don't use it often. Sometimes, when the national debate on some topic or another has degenerated into nonsense, I quietly attach it without comment at the bottom of my emails for a few days. I was deeply moved by this passage from Feynman's observations, attached as an appendix to the final report of the Rogers Commission, which investigated the 1986 Space Shuttle Challenger accident. It serves to remind me of the sometimes accidental, sometimes unconscious, but nonetheless ever-present hubris of a bureaucracy. I work against this hubris at every turn, steadfast in my belief that organizations are made up of people, and most people want to do the right thing. When presented with the facts in a relaxed setting -- outside of the office, away from the deadline pressures and the promotion risks and office politics -- I'd guess almost all of the managers at NASA would have agreed with Feynman, and with a number of NASA engineers, that jets of burning gas shooting out of leaky, brittle O-rings and pointed at a giant tank of hydrogen were a bomb waiting to go off. And finally, after a surprising number of launches sporting extraordinary luck, it sadly did.
Now, in most organizations, nobody will die from a worm attack. Hospitals, air traffic control, dispatch centers, train control networks, nuclear power plant control centers, various other utilities, and the enormous DoD networks are notable and important possible exceptions, of course. By the way, all of those industries are documented to have suffered worm attacks within the last few years. Certainly I don't mean to over-dramatize the case; it's just that Feynman elegantly cut through mountains of red tape to reveal the rotten core of the decision-making process that led to the first space shuttle disaster. Arguably, he explains the second disaster, from which NASA is still reeling, as well. Information Technology decisions require clear thinking. Circular arguments have no place in them.

Epiblog: I had just got to the end of this essay when I went to look up one last reference, and, like a bolt from the blue, I was struck by a most remarkable coincidence. You see, I was looking for a good reference on the different types of fallacies in reasoning when I browsed one I've had on my shelf for years: Clear Thinking: A Practical Introduction by Hyman Ruchlis. I bought it on a sale table several years ago, and occasionally look up a section on one reasoning fallacy or another. In the section on "circular reasoning," Mr. Ruchlis includes this same Feynman quote!

Additional information on the Challenger accident can be found at the Federation of American Scientists web site.
"We have also found that certification criteria used in Flight Readiness Reviews often develop a gradually decreasing strictness. The argument that the same risk was flown before without failure is often accepted as an argument for the safety of accepting it again. Because of this, obvious weaknesses are accepted again and again, sometimes without a sufficiently serious attempt to remedy them, or to delay a flight because of their continued presence."

-- Richard P. Feynman (1986), "Personal observations on the reliability of the Shuttle," included as an appendix to the Report of the Presidential Commission on the Space Shuttle Challenger Accident (known informally but widely as "the Rogers Commission" report)