Wired is reporting that the password cracking program "ocl-Hashcat-plus" is now able to crack passwords up to 55 characters long. The importance of strong, unique passwords is not lost on people who use managed laptop disk encryption like AlertBoot FDE. However, at some point, one has to wonder whether lengthier passwords are the answer to data security.
Prior to the latest release, the password cracking program "ocl-Hashcat-plus" (Hashcat) had a limit on the passwords it could guess. According to the creators of the program, a 15-character limit was placed on purpose, as increasing the character count would "[result] in a decrease in performance." However, the demand for cracking longer passwords finally won out. The improvement depends on the hash algorithm that's being targeted, but "the maximum can grow as high as 64 characters or as low as 24," according to wired.com. (This does not imply that passwords shorter than 24 characters are somehow more secure. There is other password-cracking software besides Hashcat, after all.)

It is further being reported that Hashcat can achieve password-cracking speeds of eight billion guesses per second. How much damage can the software deliver? ocl-Hashcat-plus targets a much wider range of popular cryptographic products and applications, including TrueCrypt 5.0 and beyond, 1Password, LastPass, the SHA256 algorithm in the Unix operating system, and hashing operations found in the latest version of Apple's OS X operating system.

Yikes. As another metric, wired.com reports that the 14.3 million passwords that were leaked in the RockYou list can be cracked in 65 seconds.
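To put eight billion guesses per second in perspective, here is a rough back-of-the-envelope estimate, sketched in Python. The 95-character alphabet and pure exhaustive search are simplifying assumptions of mine; real crackers like Hashcat lean heavily on wordlists and mangling rules, which is exactly why long dictionary-based passwords also fall.

```python
# Rough estimate: time to exhaust a password keyspace at Hashcat's
# reported speed of 8 billion guesses per second. Assumes passwords
# drawn from the 95 printable ASCII characters and a pure brute-force
# search -- an illustration, not how real cracking runs are structured.

GUESSES_PER_SECOND = 8e9
ALPHABET = 95                     # printable ASCII characters
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

for length in (8, 10, 12, 15):
    keyspace = ALPHABET ** length
    years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
    print(f"{length:2d} chars: ~{years:.3g} years to exhaust")
```

An 8-character space falls in a matter of days at this speed, while 12 random characters already takes millions of years -- which is why the practical attacks target human-chosen passphrases rather than the full keyspace.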
A couple of passwords that were cracked using the newly released Hashcat are "thereisnofatebutwhatwemake" and "Ph'nglui mglw'nafh Cthulhu R'lyeh wgah'nagl fhtagn1," the latter of which comes from an H.P. Lovecraft story (well, that and the number 1. I guess someone's password policies required the use of letters and numbers).

Now, it could be that these two passwords were not "salted" and thus were "easy" to crack. However, when you consider that Hashcat can go through 8 billion passwords per second, and that both passwords are less than 55 characters long, it stands to reason that they could have fallen regardless of salting – especially if the same salt is applied to all passwords stored by a company.

If the above two passwords cannot stand... aren't we all doomed? The only sensible answer is to start using passwords that are even longer than 55 characters. Good luck remembering that...

A simple remedy may lie in the use of salts, though: you start salting your passwords yourself. For example, why create a new password that is longer than 55 characters when you can take your old one and stretch it out?

Take "thereisnofatebutwhatwemake" as an example. If you decide your salt is "firefly," then the password could now be "therefireflyisfireflynofireflyfatefireflybutfireflywhatfireflywefireflymake," which is 75 characters long and as easy to remember as the old password, if a bit unwieldy.

The problem? At some point, passwords are going to become too long for humans to use. It's the reason why AES-256 encryption keys are not chosen by people; instead, they're randomly generated by computers.
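The "stretch it out" trick above can be sketched in a few lines of Python. The word list and salt are just the article's example; to be clear, this is a mnemonic device for building longer passwords, not a cryptographic salt in the technical sense.

```python
# Sketch of the "salt your own password" stretching trick described
# above: interleave a memorable salt word between the words of an
# existing passphrase to lengthen it without making it harder to recall.

def stretch(words, salt):
    """Join passphrase words with a personal salt word between them."""
    return salt.join(words)

words = ["there", "is", "no", "fate", "but", "what", "we", "make"]
password = stretch(words, "firefly")

print(password)       # therefireflyisfireflynofireflyfatefirefly...
print(len(password))  # 75 characters, matching the article's example
```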
Over four million people are affected by the theft of four desktop computers from Advocate Medical Group. The computers were not protected with medical data encryption software, which runs counter to most HIPAA experts' recommendations. The one silver lining here for AMG is that there is still time left before compliance with the HIPAA Omnibus Final Rule is required (but, frankly, that hasn't stopped HHS from handing out million-dollar HIPAA fines to covered entities that should have known better).
According to chicagotribune.com, the computers were stolen on July 15. Burglars broke into an administrative building and stole four computers that contained names, addresses, SSNs, and dates of birth. The site points out that financial information and medical records were not stolen.

However, the information that was stolen is enough to fetch good money on the black market. The value of names and SSNs tends to vary, but such information can go for as little as pennies per name – which means that, even at a penny per record, the thieves could get at least $40,000 by selling the data (but probably much more. Quantity has a quality all its own, after all. Not that I make it a habit to quote despotic leaders).

The breached information goes as far back as the early 1990s. One year of free credit monitoring is being offered to people whose information was stolen.
One of the more upsetting aspects of this story is that security, in all of its forms, was severely lacking. The computers reportedly had password protection, but most people already know that it cannot be relied on to protect data. The building itself didn't have much physical security either: security cameras were present, but the office was "not equipped with an alarm." It's also apparent that the company didn't have 24/7 security staff at the time of the burglary.

Under the circumstances, it's almost as if the company believed that these desktop computers didn't require meaningful security because... they're desktop computers. You know, just like you wouldn't use a $200 bike lock on a weathered Walmart bicycle with a tattered seat and an extremely rusty chain.

The problem is, thieves are willing to steal anything if they think they can get away with it. Desktop computers are not sexy, but they are bankable – sell them for cheap or sell them for parts. And the data on them takes the same form regardless of the device: laptop, desktop, netbook, tablet, smartphone, etc.

With the HHS's Office for Civil Rights handing out million-dollar penalties for HIPAA breaches every six months or so (they can afford to be picky about whom to make an example of – there are thousands of reported data breaches each year, and growing), it's perplexing that any covered entity or business associate is willing to take a short-sighted approach to PHI protection.
A piece at fastcompany.com covers how Beth Israel Deaconess Medical Center (BIDMC) was able to prevent a data breach when a bomb went off at the Boston Marathon earlier this year. Perhaps ironically, BIDMC was only able to do so because it had experienced a significant data breach the previous year, when a doctor's laptop computer was stolen. The lack of laptop encryption on the device had led to significant changes at the hospital.

Casting aside the more tragic aspects of what happened that day, the piece shows why you want to be prepared for even the unlikely event of a HIPAA data breach.
What are the odds that a terrorist will be treated at your local hospital? Virtually nil, assuming that there isn't a physical war going on in your own backyard. And yet, that's the situation BIDMC was facing in the aftermath of the Boston Marathon bombing. Medical personnel were already busy with their valiant efforts treating victims when Dzhokhar Tsarnaev was brought in for treatment. With Tsarnaev being the central figure of the tragedy, his presence at BIDMC meant the hospital would face a challenge:

For Halamka's [BIDMC CIO] department, ensuring that systems stayed online and maintaining the privacy of patients was essential. In his prior life as a surgeon in Los Angeles, Halamka saw how journalists would try any trick in the book to get a scoop on a breaking celebrity story. From BIDMC's perspective, there was a real risk someone would attempt to steal the medical records of Tsarnaev or the victims. This would hinder the hospital's ability to provide care and risk exposing it to lawsuits.

Had the hospital not updated its security policies and technology after its data breach in 2012, who knows what would have happened? The odds of a breach could have been high; the laptop theft in 2012 occurred in an access-restricted area, after all. Assuming a breach did happen, the Office for Civil Rights would have had to launch an investigation – could anything become more high profile than this? – and I doubt that they would have let things slide because of the incredible improbability of the event.

Being prepared paid off big time for Beth Israel, although it cost them $500,000 to get there (the amount they spent cleaning up after the laptop theft).
The odds of a US hospital facing the same situation again are very low. However, the odds of a data breach occurring remain very high. Hospitals may have restricted areas, but they tend to be open environments because of the nature of what they do: when seconds can determine life or death, locking doors and making medicine closets impregnable, important as they may be, are not at the top of the list. Obviously, this means that the chances of something being stolen will remain high. It'll always be a matter of "when," not "if." "When will we experience a data breach?" is the real question medical establishments ought to be asking themselves.

While that particular bell tolls for all HIPAA covered entities, you can distance yourself from it as much as possible by implementing the appropriate and needed solutions (even if they may not be listed as "required" under the guidelines), like medical laptop encryption and other data security measures.
As the deadline for complying with the HIPAA Final Omnibus Rule looms ever closer, experts are weighing in with their opinions. The site healthcare-informatics.com has published an interview with a lawyer who notes that forensics is one of the most difficult elements of dealing with a PHI data breach and that "federal officials are very open about the fact that these heavy penalties are intended to promote encryption."
In the interview with healthcare-informatics.com, Kathryn Coburn, a lawyer, noted that one of "the most difficult elements [in addressing a data breach] is the forensics" and gave the following example (my emphasis):

[M]aybe you had 40 laptops that were stolen from a facility [that didn't get out of the building and were eventually recovered]...you have to figure out whether someone looked at the PHI. In many cases, you can prove that nobody looked at them, and then you don't have to give any notice of breach.... It's easy to say the information wasn't compromised, but if you can't prove it, you're still going to have to give notice of breach.

This is probably an oft-overlooked requirement, and possibly the leading reason why a HIPAA covered entity will announce a data breach even when it claims its PHI was protected with encryption software. (The other leading reason would be finding that the encryption that was used did not measure up to HIPAA's standards, which are essentially NIST's standards.)

Under HIPAA, you must document your reasons for an action. For example, you're not required to use encryption software to protect PHI stored on a laptop. However, you must have a very good reason for not doing so... and it must be documented. Remember, under the Security Rule, HIPAA makes encryption an addressable specification. There's very little you do not document under the Security Rule.

But the question remains: even if you have encryption installed on a computer – and one that is validated by NIST as conforming to its standards – how do you prove that it was not accessed between the time it was stolen and the day it was recovered?

With a solution like AlertBoot Full Disk Encryption, it's quite simple: you run a report. Although AlertBoot FDE is a cloud-managed FDE service, the cloud portion is used for installation, deployment, and management.
The encryption itself is free-standing, meaning it will work regardless of whether the computer is connected to the internet. The time and date of each access to the computer are also tracked, along with whether the login attempt was successful.

This data, along with the fact that AlertBoot remains a third party, absolves the client of most problems, including any suspicion of adulterating the data.
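The "run a report" idea boils down to a simple query over an access log. Here is a hypothetical illustration in Python; the log format and field names are invented for the example and are not AlertBoot's actual report schema.

```python
# Hypothetical sketch: given access-log entries of (timestamp, success
# flag), list any login attempts that fall between the theft and the
# recovery of a machine. An empty result supports the claim that the
# data was never accessed while the device was missing.

from datetime import datetime

def accessed_during(log, stolen, recovered):
    """Return log entries that fall inside the theft window."""
    return [(ts, ok) for ts, ok in log if stolen <= ts <= recovered]

log = [
    (datetime(2013, 7, 10, 9, 0), True),    # normal use before the theft
    (datetime(2013, 7, 16, 2, 30), False),  # failed attempt while missing
]
stolen = datetime(2013, 7, 15)
recovered = datetime(2013, 7, 20)

hits = accessed_during(log, stolen, recovered)
print(hits)  # one failed attempt falls inside the window
```

The point of the third party holding this log is evidentiary: the covered entity cannot be accused of editing the records after the fact.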
One of the more eye-opening remarks was the following Q&A:
Q: It seems that the number of breaches is growing significantly.

A: Oh yes, it’s dreadful. That’s why encryption is so important. And federal officials are very open about the fact that these heavy penalties are intended to promote encryption. And they don’t refer to any specific type of encryption; they do refer to the NIST [National Institute of Standards and Technology] standard.
Of course, I've already noted before that the HIPAA OCR Director said "we love encryption, and those who use encryption love it, too."

Regardless, to find a lawyer going on record saying that heavy penalties are there to promote encryption is a bit shocking. It makes me wonder why the government is going about it in such a roundabout way. Why not just require encryption? Wouldn't it be easier? And more effective, too.
There's a saying that once you've drawn your sword from its sheath, it's hard to put it back. It seems to perfectly describe the Department of Health and Human Services' (HHS) stance on monetary penalties: not content with fining Mass General Hospital $1 million in 2011 and Massachusetts Eye and Ear Infirmary and Massachusetts Eye and Ear Associates $1.5 million in 2012, the HHS has now reached a $1.2 million settlement with Affinity Health Plan.

With the Final Omnibus Rule going into effect on September 23, HIPAA covered entities should take the time to ensure that they are following HIPAA and HITECH, such as by using laptop encryption software to secure data on portable computers.
Why was Affinity Health Plan fined $1.2 million? Because they forgot to sanitize their photocopier. And I'm not referring to a lack of Purell in the machine.

According to the complaint, CBS contacted Affinity in 2010 as part of a news story on modern-day copiers. Like your car, photocopiers have a significant computerized component to them. CBS obtained used photocopiers on the market, and one of the machines' hard drives contained Affinity's data.

The HHS Office for Civil Rights looked into the situation and found that approximately 340,000 people were affected by this particular data breach. The sheer number of people affected pretty much guaranteed a fine.

In Affinity's defense, most people in 2010 didn't know that photocopiers are really computers. But then, it didn't require a rocket scientist to figure it out: a machine that scans your document and creates 15 different copies, all collated in reverse order, means images are being stored somewhere.

Ignorance is never an excuse for breaking the law, however. Hence the fine (or, if you prefer, settlement).

While the above fine has nothing to do with laptops, the Mass Eye and Ear case did: a laptop computer that was not secured with encryption software was lost, triggering the $1.5 million fine.

The message here is that HIPAA covered entities (and, beginning in late September, their business associates) must pay attention to ePHI in all its forms. The use of laptop encryption is a no-brainer. But photocopiers, CDs, DVDs, backup tapes, smartphones, USB flash drives, and any other storage media where ePHI can reside must be secured in some way.
Netbooks. They went the way of the dodo when Apple's iPad made its debut to cheers as well as jeers (I remember how everyone kept saying it was just a humongous iPhone. I guess Jobs showed them). And while netbooks might be the ugly duckling that never became a swan, there are millions of these devices out there.

Including at least one at Caledonia Home Health Care & Hospice (or rather, I know they used to have at least one... because it got stolen). The device, which had been issued by Caledonia – which I assume is a HIPAA covered entity – was not protected with HIPAA compliant encryption software.
According to the breach notification letter from Caledonia, patient information was lost when a "work issued netbook" was stolen from a nurse's house. The device contained PHI, including Social Security numbers. The police were notified of the theft.

The netbook was password-protected, and it appears that the file holding the PHI was password-protected as well. Unfortunately, password protection isn't really considered protection when it comes to digital data, and it definitely does not qualify for safe harbor under the HIPAA Breach Notification Rule.
Over the past couple of years, many HIPAA covered entities (and their business associates, BAs) have increasingly begun to deploy encryption software to protect the data on laptops and external hard drives. This is despite the fact that computers like laptops have been used in medical settings for decades. What's driving this sudden interest in encryption?

Sadly, it's not patient protection; if that were the case, encryption software would have been in use by most medical organizations decades ago. Rather, the impetus lies in self-preservation: the Department of Health and Human Services has basically made it impossible not to use it.

First, it has essentially admitted that only encrypted data will be thought of as "protected ePHI." While the HHS keeps stating that covered entities and BAs are not required to use encryption, regulations such as the Breach Notification Rule make going without it impractical.

Second, the HHS has begun to wield its power to hand out fines. With up to $1.5 million as a potential penalty, hospital administrators are beginning to take notice.
The thing is, sometimes you just can't protect a device even if you want to, and I'd guess that most netbooks fall into this category. Apple's iPads and their Android brethren are designed with security in mind. Same goes for smartphones.

Computers are not really designed with security in mind, but third-party software does an admirable job, especially now that hardware power has increased. Netbooks, on the other hand, are underpowered laptops (which, remember, were not really designed with security in mind). Because they are hamstrung hardware-wise, encryption software may directly compete for computing resources, slowing the device down to the point where it's less than useless. (What's worse than a computer that's about as useful as a doorstop? A computer that's about as useful as a doorstop and runs hot.)

What this means is that hospitals that made the (in hindsight) terrible decision to buy netbooks could be caught in a bind: they have a resource they have to use, but they can't use it if they encrypt it. There are only two solutions to this dilemma: either retire the machines or use an encryption solution that won't bog down the netbook.