
Clear Thinking & Information Security

I noticed one day, in a discussion with a client, that something they said resonated with things other clients have said over the years. Literally dozens of conversations have gone something like this:
"So, you plan to re-install the worm-infected systems from clean media, right?"

"No. We get hit by viruses all the time and nothing really bad has ever happened. We'll just delete the worm files, whack the key from the registry, and go back about our business."

"Do you know what the worm did after it contacted the overseas IRC remote control channel?"

"No."

"How do you know nothing bad happened?"
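The "whack the key from the registry" plan in that exchange underestimates how many places a bot can hide once it has had remote control of a machine. Here is a minimal, purely illustrative sketch -- a toy dictionary standing in for a real registry and task list, with hypothetical entry names -- showing why deleting the one obvious autorun entry proves very little:

```python
# Illustrative sketch only: a simulated compromised host, not a real
# registry scan. The paths are real autostart locations; the entries
# (worm.exe, dropper.exe, etc.) are hypothetical examples.
compromised_host = {
    r"HKLM\Software\Microsoft\Windows\CurrentVersion\Run": ["worm.exe"],
    r"HKLM\Software\Microsoft\Windows\CurrentVersion\RunOnce": ["dropper.exe"],
    r"HKLM\System\CurrentControlSet\Services": ["svch0st (bogus service)"],
    "Scheduled Tasks": ["updater task -> keylogger.exe"],
}

def naive_cleanup(host):
    """Delete only the obvious Run-key entry, as in the conversation above."""
    host[r"HKLM\Software\Microsoft\Windows\CurrentVersion\Run"] = []
    return host

def remaining_persistence(host):
    """Everything the 'cleanup' left behind."""
    return {location: entries for location, entries in host.items() if entries}

cleaned = naive_cleanup(dict(compromised_host))
leftovers = remaining_persistence(cleaned)
print(f"{len(leftovers)} persistence locations survived the cleanup")
```

In this toy model three of four persistence locations survive; on a real compromised system you do not even know the full list of locations to check, which is exactly the argument for re-installing from clean media.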
If you're a client and you recognize this conversation, don't feel bad -- I'm not quoting you. I've had this same basic conversation with many other clients, so you're in good company (I'm not quoting them, either!). Just about every other professional consultant in the information security world I've ever spoken with has similar "war stories."

People who make their living managing Information Technology shops need basic logic and reasoning skills, and for the most part, they have them. Oddly, with respect to one particular class of problems -- those for which the solution is perceived to be expensive -- circular reasoning seems to be very popular. When managers don't like the answer they know to be right -- the answer the entire cadre of professional security consultants the world over agrees is the right answer to a particularly painful problem -- suddenly you can get whiplash trying to keep up with the coming and going in circles.

The simple fact is, if somebody "0wnz your box, d00d!", no matter the particulars of how they came to own it, you will have a very difficult time assuring the security of that system unless you re-install from pristine media. Yes, there exist a few techniques and a few tools that might help you recover certain types of systems under certain circumstances. Would you like to experiment with those techniques on your production systems today? What if the box in question is the PC on your desk? Do you trust the cleanup tool enough to know it didn't leave behind a keystroke logger that the bot downloaded over IRC? Do you mind if someone outside the organization gains access to your bank account login while your staff are learning the forensic techniques they need to find it? I thought not!

I've had an email signature block around for years -- a quote from physicist Richard Feynman. It's a bit long, and I don't use it often.
Sometimes, when the national debate on some topic or another has degenerated into nonsense, I quietly attach it without comment at the bottom of my emails for a few days.

I was deeply moved by this passage from Feynman's observations, attached as an appendix to the final report of the Rogers Commission, which investigated the 1986 Space Shuttle Challenger accident. It serves to remind me of the sometimes accidental, sometimes unconscious -- but nonetheless ever-present -- hubris of a bureaucracy. I work against this hubris at every turn, steadfast in my belief that organizations are made up of people, and most people want to do the right thing. Presented with the facts in a relaxed setting -- outside the office, away from the deadline pressures and the promotion risks and the office politics -- I'd guess almost all of the managers at NASA would have agreed with Feynman and a number of NASA engineers that jets of burning gas shooting out of leaky, brittle O-rings, pointed at a giant tank of hydrogen, were a bomb waiting to go off. And finally, after a surprising number of launches sporting extraordinary luck, it sadly did.

"Personal observations on the reliability of the Shuttle" "We have also found that certification criteria used in Flight Readiness Reviews often develop a gradually decreasing strictness. The argument that the same risk was flown before without failure is often accepted as an argument for the safety of accepting it again. Because of this, obvious weaknesses are accepted again and again, sometimes without a sufficiently serious attempt to remedy them, or to delay a flight because of their continued presence." By: Richard P. Feynman (1986) "Personal observations on the reliability of the Shuttle" Included as an appendix to: Report of the PRESIDENTIAL COMMISSION on the Space Shuttle Challenger Accident (known informally but widely as "The Roger's Commission" report)
Now, in most organizations nobody will die from a worm attack. Hospitals, air traffic control, dispatch centers, train control networks, nuclear power plant control centers, various other utilities, and the enormous DoD networks are notable and important possible exceptions, of course -- and by the way, all of those industries are documented to have suffered worm attacks within the last few years. I certainly don't mean to over-dramatize the case; it's just that Feynman elegantly cut through mountains of red tape to reveal the rotten core of the decision-making process that led to the first Space Shuttle disaster. Arguably he explains the second disaster, from which NASA is still reeling, as well.

Information Technology decisions require clear thinking. Circular arguments have no place in the process.

Epiblog: I had just got to the end of this essay when I went to look up one last reference, and was struck, like a bolt out of the blue, by a most remarkable coincidence. You see, I was looking for a good reference on the different types of fallacies in reasoning when I browsed one I've had on my shelf for years: Clear Thinking: A Practical Introduction by Hyman Ruchlis. I bought it on a sale table several years ago, and occasionally look up a section on one particular reasoning fallacy or another. In the section on "circular reasoning", Mr. Ruchlis includes this same Feynman quote!

Additional information on the Challenger accident can be found at the Federation of American Scientists web site.

