Thursday, June 29, 2006

Recently there has been a spate of incidents in which U.S. federal government agencies reported data theft or loss, particularly of data which could enable identity theft. The losses include the contact information and Social Security numbers of, literally, millions of federal employees and contractors. Most of these recent incidents were the result of stolen laptops, USB key fobs, or other computer hardware, although at least two involved unspecified intrusions (electronic theft of the data following a break-in to an online system). In the past several months, as the reports of stolen servers, hard drives, laptops, and USB key fobs have mounted, I've seen only two disclosed instances of an intrusion (in one case apparently targeted) which resulted in the theft of identity data: one concerning 1,502 people at the Department of Energy (Energy ups security efforts after loss of employee data) and one concerning 26,000 people at the Department of Agriculture (U.S. Department of Agriculture hacked).

Despite the sparse reports of such intrusions, we know that government PC systems are not uniquely protected from these threats. Although it hasn't been reported, there is ample reason to believe that significant data loss has also occurred over the past several years through worm, botnet, spyware, trojan, and rootkit infestations. Such malware routinely scans the infected PC and any mounted network drives or shares, and uploads files and data into the arms of organized crime. This type of loss is harder for organizations to detect and remains underreported as a result. However, it has undoubtedly resulted in many more exposures of similar magnitude than have thefts of laptops. Many tens of thousands of computers in government agencies are infected with worms, bots, adware, spyware, viruses, trojans, and rootkits every year, and the infection rates of many government agencies are not radically different from those of private industry.

Why do we see so few reports about data loss from these types of large-scale intrusions? The difference is that when a laptop is stolen, a piece of government-owned equipment goes missing. This produces a few unique circumstances that malware infections don't produce. Missing hardware:

- can't be ignored due to strict property accounting requirements,
- can't be denied due to the loss of a physical device,
- and is more easily understood by all levels of oversight and management.
Wednesday, June 28, 2006
Within a few years it's possible that encryption will be the norm in government data storage, and probably in large organizations, too. The historical inevitability of this process was given a boost recently. The OMB has provided guidance requiring federal agencies to take the security of desktop and laptop systems more seriously (see: OMB Sets Guidelines for Federal Employee Laptop Security) in the wake of recent disclosures of several massive losses of data which could lead to identity theft. Here are a few stories describing recent incidents which have prompted the concern and gained the attention of the OMB:

- Navy Finds Data on Thousands of Sailors on Web Site
- Afghan market sells US military flash drives
- FTC Loses Personal Data on Identity-Theft Suspects
- US veterans' data exposed after burglary
- Veterans Affairs warns of massive privacy breach
- Officials: Veterans Affairs Department Ignored Repeated Warnings on Data Security
- Latest Information on Veterans Affairs Data Security

Additional background reading on the recent OMB security guidance: OMB targets desktop hole in cybersecurity

Before we leap headlong into encrypting everything in the government, however, we should really ponder the technology and its other implications. Earlier this week, President Bush chastised the North Koreans, who have been preparing to test an ICBM (Intercontinental Ballistic Missile), saying that it is worrisome that a "non-transparent regime" is developing such a capability. Transparency in government is a valued characteristic of modern democratic governments. Consider, however, that even in a modern democracy there exists a tension between disclosure and transparency on the one hand, and the desire of government organizations to restrict information flow for a variety of purposes on the other. Also this week, the disclosure of further domestic spying activity highlighted that very issue.
More directly, even one of the agencies hit by the recent data thefts ran aground on the sand bar of public-relations spin control run amok. Source: Theft of vets' data kept secret for 19 days.

At least some organizations will opt to encrypt most data in most databases, most documents, and most filesystems, because it will be easier and cheaper to comply with directives like this by defaulting to encrypted storage for everything than it will be to analyze this mountain of content to determine whether it should be encrypted or not. (Most of the stolen data that upsets people is personnel data, which is "sensitive but unclassified," for example.) Although this may help prevent massive losses of data like those seen recently, it might also reduce transparency in government. It may well be legitimately more difficult and expensive to satisfy a FOIA (Freedom of Information Act) request for organizations which rely on office documents and distributed (ad hoc) content creation and storage. Most policy-setting organizations do exactly that. The recent OMB guidance is a welcome step in helping to limit the damage. (It should also be noted that encrypted storage doesn't completely solve this problem, as people tend to leave passwords lying about in plain-text files to help them access their protected data, and passwords can be cracked with common tools, given sufficient CPU power and time.)

Congress should consider the implications of encryption, as a response to data-theft problems, upon the desirable characteristic of transparency in governance, and should attempt to mitigate the potential damage to transparency before it occurs. It might require that all encrypted archives be searchable, for example, similar to the way email applications search encrypted mail files. Some thought on this issue would undoubtedly produce a few basic guidelines which would help preserve transparency in governance.
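The point that passwords can be cracked with common tools, given enough CPU time, is easy to demonstrate. Here is a minimal, illustrative sketch in Python; the hash function, alphabet, and password are assumptions chosen for brevity, and real cracking tools do the same work far faster with dictionaries and optimized search:

```python
import hashlib
import itertools
import string

def crack_md5(target_hash, alphabet=string.ascii_lowercase, max_len=4):
    """Brute-force a short lowercase password from its MD5 hash."""
    for length in range(1, max_len + 1):
        for candidate in itertools.product(alphabet, repeat=length):
            word = "".join(candidate)
            if hashlib.md5(word.encode()).hexdigest() == target_hash:
                return word
    return None

# Hash of the (weak, hypothetical) password "key" -- recovered almost
# instantly, since the search space for short lowercase words is tiny.
stolen = hashlib.md5(b"key").hexdigest()
print(crack_md5(stolen))
```

A short, simple password falls in a fraction of a second on commodity hardware; only length and character variety push the search time out, which is why an encrypted archive guarded by a weak password offers little real protection.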
Technorati Tags: Afghanistan, arms control, Army, data loss, data security, encryption, identity theft, North Korea, OMB, rootkit, security, spyware, transparency, Trojan, USB, USDA, veterans affairs, virus, worm
Friday, June 16, 2006
A new zero-day exploit of Microsoft Excel has me pondering a standard bit of security advice: "be careful what you click." This meme survives to be repeated at nearly every outbreak, yet it simply isn't very effective. You've probably seen a story or blog post about this already, but in case you haven't, here's the alert from the Microsoft TechNet blog which got me thinking: Reports of new vulnerability in Microsoft Excel

"In order for this attack to be carried out, a user must first open a malicious Excel document that is sent as an email attachment or otherwise provided to them by an attacker. (note that opening it out of email will prompt you to be careful about opening the attachment) So remember to be very careful opening unsolicited attachments from both known and unknown sources."

Many online articles and blog postings repeated this advice unquestioningly. Some folks even praised it, including the respected security professional Brian Krebs. In his post about the issue at the Security Fix blog, he says it's "always good advice" that one be very careful opening unsolicited attachments. Recently, similar advice was given to users of various instant-messaging systems, as a "worm" affected users of Yahoo's system. In fact, the "worm" required the user to click it, meaning that its spread couldn't possibly achieve the "every vulnerable machine got hit" levels of a real, automatically propagating network worm. However, these instant-message viruses and email viruses can affect large numbers of systems in a short amount of time.

A year or so ago I saw an outbreak of an email virus hit 1.5% of the systems at a large customer. It hit so many people (over 500) so fast (within an hour or two) that we at first thought it was exploiting an automatic-execution hole in the email client. In fact, it had just been a little more clever than average at social engineering: tricking people into clicking it. I briefly interviewed a few of the victims, some of whom were trained IT professionals who spent a lot of time during the course of the year explaining to users that they shouldn't click unexpected attachments. Well, the virus in question was somewhat clever. It nearly always appeared to be from someone you know. It sent an attachment which appeared to be a spreadsheet (it was instead an executable virus). It used cleverly mundane subject lines. Nearly all of the victims had received a virus pretending to be a spreadsheet which appeared to be from someone they regularly received spreadsheets from via email. How careful must people be? Scanning a file first wouldn't have protected the victims against zero-day threats like the current Excel threat.

We give the same advice to people about web surfing. Be careful where you surf, be careful what you click. It doesn't work there, either. Corporate and home PCs alike see anywhere from 1% to 20% ambient levels of adware and spyware infestation. But the web is a treasure trove of useful and wonderful things you might never discover if you don't, sometimes, click with essentially reckless abandon. The sentiment is pure, but most users are not able to easily tell what to click from what to avoid. Most people can filter out only the most rudimentary email viruses or phishing attempts at a glance. I've given this advice myself many times, trying to carefully explain how to tell good emails from bad, and good free downloads from bad. I think in general the advice hasn't been helpful to most people most of the time. High levels of ongoing infestation from adware and spyware, widespread damage from instant-message "worms," and rampant identity theft all tell us that the advice isn't working.
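If users can't reliably spot a disguised executable, the filtering has to happen before the message reaches them. One cheap gateway-side heuristic catches the exact trick described above: a document-style name with an executable extension bolted on. This is an illustrative sketch, not a complete policy; the extension lists are assumptions, and a real gateway would check content types as well as names:

```python
# Extensions are illustrative assumptions, not an exhaustive policy.
EXECUTABLE_EXTS = {".exe", ".scr", ".pif", ".com", ".bat", ".vbs"}
DOCUMENT_EXTS = {".xls", ".doc", ".ppt", ".pdf", ".txt"}

def looks_like_disguised_executable(filename):
    """Flag names like 'budget.xls.exe' that pair a document extension
    with a trailing executable extension, or any directly executable name."""
    parts = filename.lower().rsplit(".", 2)
    if len(parts) == 3 and "." + parts[2] in EXECUTABLE_EXTS:
        # Double extension: document-looking inner name, executable outer.
        return "." + parts[1] in DOCUMENT_EXTS
    if len(parts) == 2:
        # Single extension: suspicious only if directly executable.
        return "." + parts[1] in EXECUTABLE_EXTS
    return False

print(looks_like_disguised_executable("budget.xls.exe"))  # True
print(looks_like_disguised_executable("budget.xls"))      # False
```

A filter like this would have stopped the "spreadsheet" virus described above without asking anything of the user, which is the point: the defense that works is the one that doesn't depend on a careful click.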
Friday, June 09, 2006
Security auditors can be a clever lot, sometimes a bit too clever. You really need to have someone on staff looking over their shoulder throughout the entire audit, from planning through probing and reporting. If you don't have someone on staff qualified to watch them, you need an independent consultant. A very sharp generalist would do, but someone experienced in security would be better. Basically, you need a check-and-balance system in place, to keep stories like the following from happening to your organization. First, the context. The auditors created a custom trojan, planted it amidst various other files on USB drives, and seeded the drives in parking lots and areas of the client's premises where they would likely be discovered by employees. Which, of course, they were. Here's what they say about the experience: Social Engineering, the USB Way
I had one of my guys write a Trojan that, when run, would collect passwords, logins and machine-specific information from the user’s computer, and then email the findings back to us. ... I immediately called my guy that wrote the Trojan and asked if anything was received at his end. Slowly but surely info was being mailed back to him. ... After about three days, we figured we had collected enough data. When I started to review our findings, I was amazed at the results. Of the 20 USB drives we planted, 15 were found by employees, and all had been plugged into company computers. The data we obtained helped us to compromise additional systems, and the best part of the whole scheme was its convenience. We never broke a sweat. Everything that needed to happen did, and in a way it was completely transparent to the users, the network, and credit union management.

Yes, you read that right. Their custom trojan emailed the client's account names, passwords, and other (presumably important) data out to the auditors' off-site email accounts. Now, unless these guys put rather a lot more effort into their custom trojan than they described, email is a plain-text protocol. So any fifteen-year-old kid with a summer job sitting on a router or an SMTP gateway at an ISP between the client and the auditors' email basket can read that email. Of course, it's possible the trojan was equipped with an X.509 certificate and encryption system, but it seems to me that if the auditors had thought of this, they would have mentioned it. It would have been a source of pride. For either forgetting to encrypt the data, or failing to mention it in their storytelling, they will undoubtedly be punished by the flood of email they are bound to get from every GSEC- and CISSP-certified security analyst on the planet.
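To see just how exposed that emailed data is, consider what actually crosses the wire. The sketch below builds a message with Python's standard email module; the addresses and credentials are invented for illustration, and the point is simply that without encryption the payload sits verbatim in the bytes every intermediate mail relay handles:

```python
from email.message import EmailMessage

# Hypothetical addresses and credentials, invented for illustration.
msg = EmailMessage()
msg["From"] = "trojan@victim.example"
msg["To"] = "collector@auditor.example"
msg["Subject"] = "collected data"
msg.set_content("login=jsmith password=hunter2")

wire = msg.as_bytes()  # the literal bytes an SMTP relay would handle
# No encryption: the credentials are readable straight off the wire.
print(b"password=hunter2" in wire)  # True
```

Anyone positioned on the path between sender and recipient sees exactly those bytes, which is why exfiltrating real client credentials over plain SMTP turns an audit into an exposure of its own.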
I don't want to be too critical, because they seem to have the best intentions, and their effort served to illustrate a point that clients often don't take seriously -- USB drives really can be dangerous, even if you don't inhale one. However, in their excitement to put the clever idea to the test, these auditors seem to have overlooked one important layer of the security cake and the important dictum, useful to all consultants: "first, do no harm." Of course, this isn't the most egregious error ever committed by an auditor. Far from it, in fact. I've personally seen auditors' laptops spewing worm traffic on a client's network. It's likely that the auditors' systems were infected by a worm on the client's network, rather than the other way around, but running three systems known to be vulnerable to the same defect that they were spanking the client for was, pardon the pun, an oversight. In the last year or so, several incidents of auditors losing valuable client data, including identity information, have been reported, notably more than one incident involving Ernst & Young.

So, have someone on your staff work closely with the auditors as a sponsor of the audit, or have an independent consultant watching over their shoulder for you. People sometimes get carried away in their exuberance to do great work, and other times are following bureaucratic procedures that just don't make sense. In either case, your sponsor should have veto power over any actions during the audit, to protect your data from accidental exposure. In case you're wondering, you don't need an "auditor for the auditor for the auditor" up an infinite chain. What we're really talking about here is a sponsor with veto power who isn't part of the audit team. This kind of outside watchdog can break the pattern of groupthink that causes people to run off with a half-baked idea and accidentally expose the data they are ostensibly trying to help you protect.