The Register reports that IntelGuardians, a penetration‑testing firm, has built on the Princeton research that showed how to extract encryption keys from a computer's RAM. They've created a device, named it “DaisyDukes,” and it plugs into a computer via the USB port. No word on whether the device comes encased in cut‑offs. It's still in beta, so anything could happen… Although, truth be told, I'm guessing they've christened the device after the TV character because the bombshell was able to wangle her way into pretty much anything.
Based on what I've read, it sounds like the new device is very similar to the one the Princeton researchers showcased last month. IntelGuardians has its own spin on the original device and research, though: they've found that passwords, not just encryption keys, remain in RAM as well. In fact, it turns out that passwords have a distinct signature from one application to another, and these signatures remain in RAM. The Princeton researchers had shown that information in RAM decays gradually, offering a small window of opportunity to steal sensitive data even after a computer is turned off; the new findings expand the types of information that can be compromised, from encryption keys to anything found in RAM. IntelGuardians is talking about seeing whether they could download chat logs with their device, for example. The applications IntelGuardians has been able to compromise to date include Thunderbird, Outlook, and AOL Instant Messenger, among others.
The news, understandably, is not making as much of a splash as the original findings did. And, just like the original findings, the security breach hinges upon finding a computer that has been left unattended and turned on, or left in sleep mode. Well, that and having the ability to boot the machine from a USB drive. Because the data in RAM will be overwritten when a computer is turned back on normally, the researchers boot the computer from their device and copy the data found in RAM as is, decreasing the chances of disturbing the original data and increasing the chances of finding something useful.
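The “passwords have a distinct signature” idea boils down to sweeping a raw memory image for printable strings, often near application‑specific markers. Here is a toy sketch of that kind of scan; the marker string (`PASSWD=`) and the synthetic “RAM” bytes are invented for illustration, not taken from any real application:

```python
import re

def find_ascii_strings(dump: bytes, min_len: int = 6):
    """Return runs of printable ASCII at least min_len bytes long,
    the way a forensic tool might sweep a raw RAM image."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, dump)]

def find_near_marker(dump: bytes, marker: bytes, window: int = 64):
    """Look for printable strings within `window` bytes after a known
    application-specific marker (the 'signature' idea)."""
    hits = []
    for m in re.finditer(re.escape(marker), dump):
        region = dump[m.end():m.end() + window]
        hits.extend(find_ascii_strings(region))
    return hits

# Toy demonstration on synthetic "memory" rather than a real dump:
fake_ram = b"\x00\x01garbage\xffPASSWD=\x00hunter2\x00more\x02noise"
print(find_near_marker(fake_ram, b"PASSWD="))  # ['hunter2']
```

A real tool would, of course, work on a full memory image copied off the machine, and would know the actual in‑memory layouts of each targeted application rather than a made‑up marker.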
So, it seems like a particularly effective way of getting around this particular problem, if you do have full disk encryption like AlertBoot on your computers, would be to disable booting from USB devices on your computers. Someone commented that this misses the point, since one could use a bootable CD or a floppy disk to get around it.
Not quite, I imagine. Remember, the purpose here is to obtain data residing in RAM. A bootable CD can't be written to; it's just the nature of the medium. As for a floppy disk, yeah, it's possible. However, last time I checked, floppy disks hold less than 2 MB. When you consider that most computers have at least 100 times that much RAM, less than one percent of the memory's contents could be copied to the floppy. And that's not taking into account the space taken up by the software on the floppy that reads and saves said data (or the effects of formatting the disk). Realistically speaking, I'd imagine the available space on a floppy would be approximately 1 MB, while most RAM approaches 500 MB on the lower end of the scale (with some machines going as high as 2 GB), representing at most a 0.2% chance of culling anything sensitive.
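The back‑of‑the‑envelope odds above can be checked in a couple of lines; the figures are the ones cited in the paragraph, not measurements:

```python
# Back-of-the-envelope: what fraction of RAM fits on a floppy?
floppy_usable_mb = 1.0   # space left after boot code and copy tool
ram_mb = 500.0           # low-end RAM size cited above

coverage = floppy_usable_mb / ram_mb
print(f"{coverage:.1%} of RAM can be captured")  # 0.2% of RAM can be captured
```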
If anything, I'd say that the point most people are missing is this: physical security still matters in this day and age. Computer security products are there as a backup against the vicissitudes of life (and lapses of common sense): antivirus software will always take a back seat to not visiting unsafe sites; encryption will always take a back seat to not getting your computer stolen in the first place.
A former programmer at Compass Bank stole a hard drive containing the information of one million customers from his employer. He tried, with an accomplice, to commit card fraud by encoding the stolen information onto blank debit cards. He had created about 250 cards, and had successfully withdrawn money using 45 of them prior to being arrested. Cognizant that banks operate security cameras and flag instances where a great deal of money is taken out of an ATM, the programmer tried to cover his tracks by wearing a disguise and taking out relatively small amounts of cash, usually $500 or so.
The crime originally occurred in May of last year, but nobody was wise to the story until the thief was sentenced last week. It turns out that Alabama is one of 11 states that do not require automatic notification of personal information breaches. However, the bank did have the good sense to let the 250 affected customers know what happened.
In instances like these, where an insider commits the crime, it is hard to point out what the company could have done to prevent the crime in the first place. You know, short of making sure you hire an honest person. (And where's the guarantee of that happening?) Off the bat I could say: well, encrypt your drives using a centrally managed encryption product like AlertBoot. This way, if some guy makes off with a hard drive and knows the passwords for decrypting the data, you can disable those passwords from your console, ensuring no one can get to the data until the drive is recovered. This would be an impossibility with a standalone full disk encryption product, since the hard drive would have to be recovered first before any settings could be changed.
However, someone willing to go the distance to commit fraud will find ways around such “problems.” For example, someone is only prompted to disable access to an encrypted drive when he is aware that the drive is missing. So, the fraudster would try to hide the fact the drive is missing: get a brand new drive and break it—give it so much juice the internal components fry; swap the drive containing the data with the brand new, but broken, drive; bring his computer to IT and claim his computer is not working. IT guy opens up the case and sees a fried hard drive. IT guy orders either a new drive or computer, perhaps gives a loaner. The last thing the IT guy is gonna do is disable access to an encrypted, but fried, drive; he’s got better things to do, you know? In fact, he may decide to just dump the thing, “knowing” the contents are “encrypted” and hence safe to dispose of.
Is there a way to get around such a setup for carrying out crime? For encryption suites with a central management console, there is, if one implements it right: require, as protocol and under penalty of getting fired, that IT staff keep the encryption in place but revoke all user access for any returned drive, regardless of why it was returned. (If you're using a standalone encryption program, sorry to say this won't work for you, since there is no central management console.) Such a protocol should be part of any good computer auditing plan.
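A minimal sketch of what such a returned‑drive protocol might look like in code follows. The console API here (`Drive`, `process_returned_drive`) is entirely hypothetical, invented for illustration; it is not AlertBoot's actual interface:

```python
from dataclasses import dataclass, field

@dataclass
class Drive:
    serial: str
    encrypted: bool
    authorized_users: set = field(default_factory=set)

def process_returned_drive(drive: Drive) -> Drive:
    """Hypothetical central-console rule: any returned drive, no matter
    why it came back (even "fried"), keeps its encryption but loses all
    user access, so a swapped drive can't be quietly unlocked later."""
    if not drive.encrypted:
        raise ValueError(f"drive {drive.serial} returned unencrypted: escalate")
    drive.authorized_users.clear()  # nobody can unlock it anymore
    return drive

# A "broken" drive comes back; access is revoked even though it looks dead.
returned = Drive("WD-12345", encrypted=True, authorized_users={"fraudster"})
process_returned_drive(returned)
print(returned.authorized_users)  # set()
```

The point of the rule is that it fires on every return, so the IT guy never has to judge whether a dead‑looking drive deserves the effort.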
BlueCross BlueShield is certainly having a busy 2008. And it's just the beginning of the year! The Dental Network, a CareFirst BlueCross BlueShield dental HMO, has announced that the private information of mostly Maryland and D.C. residents was available on the internet for a period of two weeks. The accident may have affected approximately 75,000 people by exposing names, addresses, and dates of birth.
Incidents like these are nothing new, and will continue to happen due to human error: move or save a sensitive file to the wrong computer directory (a.k.a. document folder), and all of a sudden it can be found on the internet. The principle is not unlike that of P2P software, if you're familiar with such applications. Most P2P software gives you exact control over which folders are available for sharing. Share the wrong folder, and all the files in that directory can be accessed by anyone. Likewise, a website is composed of files, which most people call web pages, sitting within specific directories. Drop your database file into the wrong directory and anybody on the internet can download it.
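The “wrong directory” failure mode boils down to a simple containment check: anything under the served root is reachable, no matter what the file is. A toy sketch of that rule (the paths are made up for illustration):

```python
from pathlib import PurePosixPath

WEB_ROOT = PurePosixPath("/var/www/html")

def is_public(path: str) -> bool:
    """A file is downloadable by anyone iff it sits under the web root."""
    return WEB_ROOT in PurePosixPath(path).parents

# A page meant to be public:
print(is_public("/var/www/html/index.html"))  # True
# One misplaced save, and the database is just as public:
print(is_public("/var/www/html/members.db"))  # True
# Kept outside the served tree, it is not reachable this way:
print(is_public("/home/dba/members.db"))      # False
```

The web server doesn't know or care that `members.db` is sensitive; location is the only policy it enforces, which is exactly why a single misplaced save becomes a breach.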
Due to the security implications, the practice of having a web server double as a database server is frowned upon; however, a lot of people do it despite the risk, usually to save resources. Their reasoning is usually that they'll be extra careful; many come to regret it.
An easy solution to the problem would be file encryption. It differs from full disk encryption in that full disk encryption scrambles any and all data found on a hard disk, whereas file encryption protects only the individual file. The latter is appropriate for those instances where a sensitive file may be copied, e-mailed, or stored on a computer connected to the internet 24/7. This way, even if firewalls and other security measures fail, the hacker cannot read the contents of the file itself.
Relying on encrypting individual files for protection, however, is no panacea. Sometimes people will forget to encrypt the file. One remedy is to encrypt all files that share a common extension, such as *.doc or *.xls. But as some security experts point out, that's no guarantee of safety, since the same information can often be found in temporary files created by the application, over which end users don't have any control.
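The encrypt‑by‑extension policy, and the temp‑file gap it leaves, can be sketched in a few lines. Note that `scramble()` below is a stand‑in (it just XORs bytes so the example is self‑contained); a real product would use a proper cipher, and the extension list and file names are invented for illustration:

```python
import tempfile
from pathlib import Path

SENSITIVE_EXTENSIONS = {".doc", ".xls"}

def scramble(data: bytes, key: int = 0x5A) -> bytes:
    """Placeholder for a real cipher: XOR is NOT encryption; it only
    marks which bytes the policy touched in this illustration."""
    return bytes(b ^ key for b in data)

def encrypt_by_extension(folder: Path) -> list:
    """Apply the policy: scramble every file whose extension is on the
    sensitive list, and report which files were touched."""
    touched = []
    for f in folder.iterdir():
        if f.suffix.lower() in SENSITIVE_EXTENSIONS:
            f.write_bytes(scramble(f.read_bytes()))
            touched.append(f.name)
    return touched

with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    (root / "report.doc").write_bytes(b"secret")
    (root / "report.doc.tmp").write_bytes(b"secret")  # app's temp copy
    print(sorted(encrypt_by_extension(root)))  # ['report.doc']
```

The temp copy ends in `.tmp`, so it sails right past the extension filter with the same contents intact, which is exactly the hole the security experts are pointing at.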
For the above reasons, the recommendation is sometimes to use both types of encryption, disk encryption as well as file encryption, each available in AlertBoot. This way, the data is protected even if the computer is stolen, and a data breach is prevented if a file is e-mailed to the wrong person.
I blogged last night (much sooner than the mainstream media, it seems; I know I didn't see too many other articles at the time) about how the National Institutes of Health had reported that a laptop with patients' data was stolen. At the time I said that it affected 2,500 people and that the blame shouldn't fall just on the researcher who lost the laptop. I also suggested that an encryption solution that allows easier auditing, like AlertBoot, might have reduced the chances of something like this happening.
Well, I woke up today to find nearly two hundred new articles covering the situation, according to Google News; and as of right now, there are lots of comments on those articles. I’d like to offer my thoughts on some comments that have popped up in all the excitement.
The Feds are incompetent. They've lost stuff without fail. Besides the NIH thing, there were the two VA incidents, the backup tapes, the… OK, so the federal government has problems. But it's nowhere close to the incident rate of private industry. I'm not saying that the government is doing a good job. However, it's only fair to point out that people are lumping various different, autonomous branches together into a single entity. If people are going to sing that tune and cry for blood, why not go after retailers? That industry is composed of different, autonomous entities, and TJX, the Gap, Hannaford, and numerous other companies have lost more records combined. Heck, some of them have single-handedly lost more records than the federal government lost, combined, over the past three years. And, if I'm not wrong, the federal government is actually bigger than the retail industry, which means it actually has a lower incidence rate of data breaches.
People shouldn't get their panties in a bunch. This is a victimless crime. Yeah… no, it isn't. For starters, there's the researcher who lost the laptop. He could be fired, demoted, have his research funds pulled, whatever. And before someone says "good riddance," remember, this is a heart researcher, not some office drone munching on doughnuts. This guy could be the one to make a breakthrough in cardio-related research. And I'm not talking about a new aerobics gizmo. (You know who's not in the people's cross-hairs? The IT guy who should have made sure the researcher's laptop was encrypted.) Plus, there's the fact that a laptop got stolen. Last time I checked, theft means there's a victim.
As to the 2500 patients/volunteers to the research? They could be victims, too. As I remarked in the previous post, readily useful information was not included, like SSNs. The stolen information includes medical diagnoses, names, and DOBs. If an insurance company gets a hold of this information, they’ve got reasons to deny coverage. Does it sound farfetched? Sure. Could it happen? It already has. Just like people Google up someone’s name when they’re set up for a blind date, insurance companies have been found to google the backgrounds of people signing up for insurance.
I’d like to say it’s unlikely to happen, but I’ve personally stumbled upon an Alcoholics Anonymous spreadsheet with names, e‑mail addresses, and phone numbers online; who knows what else is out there? It was weird for me, since I’m under the impression that AA‑related things are supposed to be anonymous, but I’ve never been to one, so I have no idea how it works.
Patient information shouldn’t be leaving the premises. I want to agree with this one. The very best method of protecting information, after all, is making sure it doesn’t leave a security perimeter. Of course, there’s always the chance the thieves will come to you, so you’d still need to use full disk encryption on computers. However, the potential of an information security breach is lowered considerably if patient data is not carried about.
However, there are many legitimate reasons why information does leave the premises. For example, let’s say we’re talking about a hospital, not a medical research center. HIPAA requires that patients’ medical information be retained for at least 6 years after their deaths. Now, chances are you’re not going to keep that around—hospitals are pretty cramped places already. It’s got to go into storage. That generally means leaving the hospital and going into the coffers of a storage company like Iron Mountain.
Or what if a researcher is going away on a week‑long conference and wants to do some work on his off‑time? VPN? Sure; it might work. And don’t tell me there’s no work done at such conferences. Some people decide to do more than sip margaritas when colleagues who can offer suggestions are around. For that matter, what if you have to share data? These armchair security zealots are suggesting people e-mail stuff?
Why’d he leave the laptop in the trunk of a locked car? I’d imagine it was for security purposes. If the researcher was not mindful of security, he’d have left the laptop in the back seat or something. Nope, this guy was security conscious to a degree. It’s just a shame he didn’t nurture that security consciousness to a mild degree of security paranoia. You know, the kind of paranoia that makes you jiggle the door knob after locking the door? Just to see if it opens?
The National Institutes of Health has reported that a laptop with patients' data has been stolen. The theft took place about a month ago, but the incident was not made public until today. Approximately 2,500 patients may be affected by the latest data security breach, since the laptop in question did not feature full disk encryption.
The information on the laptop included names, medical diagnoses, and details of the patients' hearts. Information that would be readily useful to identity thieves, such as Social Security numbers, phone numbers, addresses, and financial information, was not included. (Question: why would financial information be collected for a heart study?) Regardless, the incident is being taken very seriously because it represents a violation of the government's data protection policy and a violation of patients' privacy.
The latter is self‑explanatory. Doctor‑patient confidentiality exists for many reasons, including incentives for patients to give honest and accurate description of symptoms. The former, too, is self‑explanatory: things get stolen or lost. And when laptop computers get lost or stolen, numerous people may be affected, in the range of, say, oh, I don’t know, approximately twenty‑five hundred people.
Washingtonpost.com has a write-up of this case, and it's quite an interesting read because, if anything, it gives you a view into how that particular bureaucracy works (there's a reason it took so long to report the theft to the public). In summary, the now-missing laptop was supposed to be encrypted, but the process failed for some reason. The person using the laptop failed to follow up on the matter, and the computer was subsequently stolen from his car's trunk.
In some ways, I wish we were dealing with brain researchers so I could make a crack about laptop encryption not being brain surgery. Alas, it is not to be.
I think there are two people to blame for this information security breach. Or rather, there is someone to blame in addition to the heart researcher who was using the laptop. Clearly, a large part of the blame falls on the researcher himself. He knew his laptop was not encrypted. He was also in a better position to know what kind of data could be found on his laptop, and the potential ramifications if it were to be stolen.
However, it’s also true that there should have been some form of oversight, i.e., auditing and correcting any shortcomings. For example, with an encryption solution like AlertBoot, not only do you get an easy and centrally managed whole disk encryption system, you get a superior reporting engine, allowing you to easily perform audits on the state of the computers’ encryption and ensure nothing fell through the cracks. It stands to reason that someone other than the laptop owner would be in charge of running such reports. In the case of the NIH, it looks like this person fell asleep at the wheel.
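The kind of audit a central reporting engine makes routine can be sketched in a few lines. The machine records and field names below are invented for illustration; the one rule that matters is that anything other than a confirmed‑active status counts as non‑compliant:

```python
def encryption_audit(machines):
    """Flag every machine whose disk encryption is not confirmed active.
    'failed' and 'pending' both count: an install that errored out
    (as reportedly happened with the NIH laptop) must not read as
    compliant just because encryption was attempted."""
    return [m["hostname"] for m in machines
            if m.get("encryption_status") != "active"]

inventory = [
    {"hostname": "lab-desktop-01",    "encryption_status": "active"},
    {"hostname": "researcher-laptop", "encryption_status": "failed"},
    {"hostname": "new-loaner",        "encryption_status": "pending"},
]
print(encryption_audit(inventory))  # ['researcher-laptop', 'new-loaner']
```

Run on a schedule by someone other than the laptop owners, a report like this is exactly what would have caught the failed encryption before the laptop ever left the building.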
Vindu Goel at the San Jose Mercury News has been keeping track of a story involving the theft of a laptop computer containing the personal data of 51,000 current and former employees of Agilent Technologies. The data breach includes names, Social Security numbers, addresses, and company stock‑related information.
In their letter to employees, Agilent put the blame on a vendor, the aptly named Stock & Options Solutions, referred to as SOS in Mr. Goel’s articles. (Anyone foresee that the name would backfire on them? Show of hands, everyone.) SOS was hired to make sure that Agilent’s former money management firm, Smith Barney, correctly transferred employee stock data to Fidelity Investments, the new administrator of Agilent’s employee stock program.
I don’t know what SOS needed to do specifically but, based on subsequent events, it’s clear that they required employees’ personal data. Agilent concurred, though on the condition that SOS safeguard the data. Indeed, Agilent is claiming that the unencrypted laptop is in violation of SOS’s contract. I’d say someone’s about to get sued, especially when you consider that in similar past cases, the third party vendors’ identities were actually protected by the affected companies (like the Gap protecting their vendor last year, for instance).
The funny thing about this story is that, according to an Agilent spokeswoman, the vendor told Agilent the laptop was stolen en route to being encrypted. Now, this is not unusual; it could happen, technically. There's a reason why you ideally want to encrypt your machines before you begin to use them, download sensitive data, and whatnot. What is unusual, though, is that an SOS employee took the laptop from the East Coast to California to encrypt it. Let me repeat that: in order to encrypt the laptop, they traversed the entire continent, from sea to shining sea.
This, to me, is lunacy. I have to assume this person took a plane to reach Cali. That implies an airport, a busy place where things get stolen and lost. All. The. Time. Plus, there are all those pesky little instances where something could be lost on the way to the airport, and from the airport to the final destination. You generally want to have your laptops encrypted because you’re about to face such potential dangers, not in spite of them. It boggles the mind—is this some kind of pre‑emptive April Fool’s joke?
Maybe they should have gone with AlertBoot. Unlike whatever SOS was using, or tried to use and failed spectacularly, AlertBoot's encryption is easy to deploy and entirely web-based. So, instead of having to spend a grand getting somewhere to protect a laptop, hard drive encryption can be deployed by hopping onto the internet. And the status of encryption can be verified from a central console via powerful reporting engines, so auditing teams can stay on top of any computers that don't have data protection on them.