AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

February 2011 - Posts

  • HIPAA-HITECH Security: Weak Encryption Is Tantamount To Data Breach

    El Emam et al. write in "How Strong are Passwords Used to Protect Personal Health Information in Clinical Trials?" that the safe harbor clause is not extended to cases where weak encryption software is used.  So, even if one were using disk encryption software to protect the contents of a notebook computer, if it's based on weak encryption, the loss of the computer will still constitute a data breach.  (I should note that AlertBoot uses AES-256, one of the strongest encryption algorithms available for commercial use.)

    Don't Take It for Granted...

    Here's the particular paragraph from El Emam et al that points out the repercussions of using weak encryption to protect PHI:

    It should not be taken for granted that the default file encryption algorithms used to protect PHI are strong. In fact, we found that emailing the ZIP files in our sample would be considered a data breach under the US Health Information Technology for Economic and Clinical Health (HITECH) Act because they all used the weak ZIP 2.0 standard. Furthermore, the emailing of files encrypted using the default encryption in Word 2003 and earlier would also be a breach under the US HITECH Act. Therefore, the simple technical act of encryption does not ensure that this was done effectively [my emphasis]

    What is weak encryption?  It's usually defined along the lines of "encryption whose key is short enough that it can be compromised in a meaningful time frame."  In other words, if I can figure out (or find, or guess) the encryption key in, say, less than a year, you've got weak encryption.

    In practice, it's easier to say that weak encryption is anything that is not deemed strong encryption, since strong encryption algorithms are few enough to list.
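    To put "meaningful time frame" in perspective, here's a minimal back-of-the-envelope sketch in Python.  The guess rate is an assumption for illustration, not a measured benchmark:

        # Exhaustive key search time as a function of key length.
        GUESSES_PER_SECOND = 1e9          # assumed attacker speed
        SECONDS_PER_YEAR = 60 * 60 * 24 * 365

        for key_bits in (40, 56, 128, 256):
            keyspace = 2 ** key_bits
            years = keyspace / GUESSES_PER_SECOND / SECONDS_PER_YEAR
            print(f"{key_bits:3d}-bit key: ~{years:.2e} years to exhaust")

    At that rate, a 40-bit key (the ZIP 2.0 era) falls in under twenty minutes and 56-bit DES in a couple of years, while 128- and 256-bit keys exceed any meaningful time frame by dozens of orders of magnitude.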

    HIPAA/HITECH Data Breach Safe Harbor

    It shouldn't be news to any HIPAA covered entities that the HITECH Act amended HIPAA, or that the amendment includes a new data breach notification requirement.  The new rules went into effect over a year ago, so if you're hearing about this now...well, get moving and secure your data.

    Also in the "not-news" category: there is a safe harbor component to the breach notification requirement.  Namely, cases where ePHI (electronic protected health information) is lost but encrypted are exempt from the notification requirement.

    There is a caveat, however.  Nowhere is it specified what type of encryption one should be using.  Instead, readers of the guidelines are referred to NIST publications regarding encryption.

    On the functional equivalent of a safe harbor:

    This guidance is intended to describe the technologies and methodologies that can be used to render PHI unusable, unreadable, or indecipherable to unauthorized individuals. While covered entities and business associates are not required to follow the guidance, the specified technologies and methodologies, if used, create the functional equivalent of a safe harbor, and thus, result in covered entities and business associates not being required to provide the notification otherwise required by section 13402 in the event of a breach. [19008  Federal Register / Vol. 74, No. 79 / Monday, April 27, 2009 / Rules and Regulations, my emphasis]

    On encryption software and NIST:

    (i) Valid encryption processes for data at rest are consistent with NIST Special Publication 800–111, Guide to Storage Encryption Technologies for End User Devices.

    (ii) Valid encryption processes for data in motion are those which comply, as appropriate, with NIST Special Publications 800–52, Guidelines for the Selection and Use of Transport Layer Security (TLS) Implementations; 800–77, Guide to IPsec VPNs; or 800–113, Guide to SSL VPNs, or others which are Federal Information Processing Standards (FIPS) 140–2 validated. [42742  Federal Register / Vol. 74, No. 162 / Monday, August 24, 2009 / Rules and Regulations]

    NIST rules out any encryption that wasn't validated under its program, so if you're using something that was contracted out to be built for you, but never validated, you're not getting safe harbor from the HITECH breach notification requirements.  Likewise if you're using outdated encryption software, like the kind found in Microsoft Word 2003 or earlier, as mentioned by El Emam et al.

    It's not just a matter of using encryption.  You've got to use the right encryption: FIPS 140-2 validated encryption.
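    As an illustration of what "the right encryption" looks like in practice, here is a minimal Python sketch using the pyca/cryptography library and AES-256-GCM, a NIST-approved algorithm.  The file name is hypothetical, and note that picking an approved algorithm does not by itself make a program FIPS 140-2 validated; validation attaches to an entire cryptographic module, not to a snippet:

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        key = AESGCM.generate_key(bit_length=256)  # 256-bit key, as in AES-256
        aesgcm = AESGCM(key)

        with open("ephi_export.csv", "rb") as f:   # hypothetical file name
            plaintext = f.read()

        nonce = os.urandom(12)                     # 96-bit nonce; never reuse per key
        ciphertext = aesgcm.encrypt(nonce, plaintext, None)

        with open("ephi_export.csv.enc", "wb") as f:
            f.write(nonce + ciphertext)            # store nonce alongside ciphertext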

    You've Got to Wonder...

    Reading the above, you've got to wonder if there are HIPAA-covered entities out there that are essentially breaking the law because they don't know any better.


    Related Articles and Sites:
    http://www.jmir.org/2011/1/e18/#ref51

     
  • HIPAA Encryption: Medical Researchers Get Some Recommendations From Colleagues

    In a paper titled "How Strong are Passwords Used to Protect Personal Health Information in Clinical Trials?" El Emam et al. note that most researchers involved in clinical trials are not properly using data security tools like encryption.  This is not necessarily because the researchers are not trying but because they're not aware that there are potential problems with what they're using.

    Researchers in Clinical Trials are Aware of the Need for Security

    Researchers conducting clinical trials are very aware of the need for data security.  Not only do they handle sensitive information on a regular basis, which would definitely be classified as protected health information (PHI), they also need to ensure "data integrity" and deal with requirements dictated by the Food and Drug Administration (FDA).  Not complying with regulations would mean not only potential fines (say, under HIPAA/HITECH) but also losses due to delays in rolling out a new drug.

    Consequently, there is a lot of effort spent on ensuring data security. The paper's authors found that most researchers who participated in a survey used encryption to secure data.  But, this did not necessarily mean that things were OK.

    Weak Passwords, Weak Encryption, Shared Passwords

    The authors of the paper concluded that clinical researchers were not up to snuff when it came to data security.  As already mentioned, it's not because they were not trying.  Rather, it's because they failed to grasp certain intricacies when it came to data security.

    One area that needed shoring up was the use of weak passwords.  El Emam et al. were able to recover the passwords to 14 out of 15 encrypted files provided by researchers.  The authors of the paper were not hackers themselves; rather, they used two off-the-rack tools ($30 and $130) that search for passwords by brute force.  All 14 recovered passwords were found in less than 24 hours.

    (The authors of the paper are aware that 15 files do not make a statistically significant sample.  However, they also note that those who provided the files were confident in the strength of their data security, so the sample is, if anything, biased towards researchers who are security-aware and actively protecting data.)

    Recovering passwords in such a short time is only possible when the passwords are weak: too short, composed of a word found in a dictionary, or drawn from the list of most often-used passwords.
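    A rough sketch, in Python, of why those off-the-rack tools succeed.  The word-list size and guess rate are illustrative assumptions:

        import math

        GUESSES_PER_SECOND = 1e9    # assumed attacker speed

        candidates = {
            "dictionary word":          250_000,   # assumed word-list size
            "8 lowercase letters":      26 ** 8,
            "10 chars, full keyboard":  95 ** 10,  # 95 printable ASCII keys
        }

        for label, space in candidates.items():
            bits = math.log2(space)
            seconds = space / GUESSES_PER_SECOND
            print(f"{label:24s} ~{bits:5.1f} bits, {seconds:.2e} s to exhaust")

    A word list is exhausted in a fraction of a second, and eight lowercase letters in a few minutes; ten characters drawn from the full keyboard would take on the order of 1,900 years at this rate.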

    The authors also found that many of the files were protected with weak encryption.  For example, Microsoft Word has a built-in ability to encrypt files.  However, older versions of the word processor (Word 2003 and earlier) use a weak form of encryption.  What this means is that, instead of figuring out the password to the encrypted file, one can search for the encryption key itself, an attack that can be easier and faster.
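    To make the "search for the key instead" point concrete, here is a small comparison; the 40-bit figure matches the RC4 key length of the default encryption in those older Word versions, and the guess rate is again an assumption:

        GUESSES_PER_SECOND = 1e9

        password_space = 95 ** 10   # a strong-looking 10-character password
        key_space = 2 ** 40         # a 40-bit encryption key

        print(f"password search:   {password_space / GUESSES_PER_SECOND:.2e} s")
        print(f"40-bit key search: {key_space / GUESSES_PER_SECOND:.2e} s")

    Roughly 1,900 years versus 18 minutes: the weak key, not the password, sets the real security level of the file.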

    Weak encryption was especially deemed a problem because files are frequently e-mailed.  E-mail is by design an insecure messaging system.

    And, last but not least, the authors of the paper found that researchers would share passwords, a data security affliction that is not unknown in less security-intensive businesses.

    Recommendations

    The authors recommend three general practices to bolster data security: strong encryption of e-mails and files, enforcing the use of strong passwords, and minimizing the sharing of passwords.
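    The first two recommendations reinforce each other when a password is stretched into an encryption key through a salted key-derivation function, which multiplies the cost of exactly the kind of brute-force attack the authors ran.  A minimal sketch, assuming the pyca/cryptography library; the passphrase, file contents, and iteration count are illustrative choices, not the paper's method:

        import base64
        import os
        from cryptography.fernet import Fernet
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

        password = b"correct horse battery staple"   # example passphrase
        salt = os.urandom(16)                        # store alongside the ciphertext

        # Each of the 600,000 iterations multiplies an attacker's work.
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=600_000)
        key = base64.urlsafe_b64encode(kdf.derive(password))

        token = Fernet(key).encrypt(b"clinical trial attachment")  # opaque bytes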

    It was also recommended that electronic data capture (EDC) systems be more inclusive, so that the security built into EDCs is accessible to "stakeholders in clinical trials" such as statisticians and other external consultants.  Such a move could eliminate the need for the above three recommendations in one fell swoop.

    And finally, the authors of the paper noted that "encryption exemptions in breach notification laws should explicitly consider the strength of the passwords that are used."

    That's an interesting proposition.  That strong encryption be used is already codified into many state data breach notification laws, in the sense that safe harbor is only granted if professionals working in the data security industry would deem a particular encryption algorithm "secure."  What is, and what is not, deemed secure is left up to the pros to figure out.  This is a smart move, since vulnerabilities pop up unexpectedly and would be impossible to anticipate in the text of a law.

    I don't think anyone has suggested doing the same for passwords, though.  Doing so could have prevented a good number of the data breaches we've seen in the past.


    Related Articles and Sites:
    http://www.jmir.org/2011/1/e18/

     
  • Laptop Encryption Software: A Key Tool In HIPAA/HITECH

    Another day, another report on the first year of the HITECH (Health Information Technology for Economic and Clinical Health) Act.  Like other similar reports, one can only conclude that the attentive use of cryptographic solutions like full disk encryption from AlertBoot would have prevented many of these breaches.

    Laptops, Theft Leading Cause of Breaches

    The report by Kaufman, Rossin & Co. notes that "theft was the primary cause of a data breach, occurring 58 percent of the time."  In other words, theft accounted for a little over half of all breaches, a figure that I assume includes hacker activity.  I expect the actual proportion to be even higher, for two reasons:

    • The report doesn't (can't, actually) include breaches involving fewer than 500 people, because these are not made public by the HHS, as I noted in "500 is a tragic number."
    • There are those instances where it's claimed that it's not known how or where something was lost.  These tend to be classified as losses but, let's face it, a good number of them must be thefts.

    On the other hand, I imagine that the proportion of breaches involving laptops will fall.  Over the past year, laptop theft was the leading cause of PHI breaches; however, when you consider that approximately 9,000 breaches occurred in that period -- and also that laptops and other electronic storage devices tend to hold records for more than 500 people -- it wouldn't be a bad guess to say the ratio of breaches involving laptops will fall, even if the total number of laptop incidents increases.

    (I could be very wrong, though.  I have to admit that my perception of the link between electronic storage and big numbers must be colored by what I read in the news, which generally deals with big numbers.  You're not making headlines if a stolen laptop had data on two people.)

    32% of Breaches Reported within the First Three Months

    One conclusion that I have not seen before concerns how fast these breaches are reported.  According to the Kaufman, Rossin & Co. report, thirty-two percent of breaches were reported within three months of occurring.  Or at least, that's how I'm interpreting it.  They could have meant "within three months of finding the breach," but that would imply that 68% of covered entities out there are not in compliance with the HITECH requirement to make the report within 60 days of becoming aware of the breach.

    Encryption Software Would Really Help

    There are many reports out there analyzing the first year of HITECH, especially the requirement to alert the HHS about data breaches involving PHI.  Each one of those reports, and the experts commenting on them, has made the same observation.

    Namely, that the use of encryption would solve a lot of these problems.  Not all of them, of course, but a lot of them.  Especially when you consider that the loss of digital storage devices tends to involve more PHI records than other forms of breaches.


    Related Articles and Sites:
    http://www.bizjournals.com/southflorida/news/2011/02/23/report-details-health-care-reform-theft.html

     
  • Drive Encryption Software: Henry Ford Has Second Data Breach, Loses USB Key

    Henry Ford Health System has alerted the general public that it suffered a data breach, its second incident in three months.  It looks like they have another HIPAA violation on their hands, this time a consequence of not using drive encryption like AlertBoot on a USB memory stick.

    Employee Loses Flashdrive with Medical Data

    The breach occurred on January 31, when a USB flash drive with information on 2,777 patients was lost by an employee.  The device has not been found.  Furthermore, it is not known to date how the device was lost.

    What is known is that there were files with names, medical record numbers, test information, and results for urinary tract infections.  As data breaches go, this one appears to be an embarrassing one for patients as opposed to being a financially calamitous one.

    Also, it only affects patients who visited Henry Ford between July and October 2010.

    Still, one cannot deny that the lost information is classified as protected health information (PHI) under HIPAA, and will require Henry Ford Health System to notify the affected patients and the HHS as well.

    Henry Ford had Breach, Knew What They Had to Do

    It was in November 2010, barely three months ago, that a laptop was stolen from Henry Ford.  Oddly enough, that device, too, had information on urology patients.  In a serious case of déjà vu, the medical facility noted that SSNs and other financial information were not included -- just medical information, such as treatments and the like, as in the more recent case -- and that laptop encryption was not used, even though it was required by the facility's policies.

    At the time, they had declined to reveal how many people were affected, a number that was eventually made public by the HHS.  I guess the people managing the breach must have noticed that, too, since this time they're being much more open.

    According to a couple of sources, there is a zero-tolerance policy on unsecured patient information at Henry Ford, and employees responsible will face either suspension or termination.

    Can't Blame Henry Ford for This One, In My Opinion

    It was easy to blame Henry Ford in the previous data breach: a laptop was stolen, which is pretty easy to encrypt, and more importantly, to keep track of.  Plus, they kind of "aided" in the robbery because the office was left unlocked (self-locking doors anyone?).  Ten seconds is all it takes for a guy to step in, grab a laptop, stuff it under his shirt, and leave.

    This latest one?  I'm willing to point my finger at the employee (unless, of course, nothing was done to educate employees about data security).  First of all, I'm sure the news of the previous breach was fresh on everyone's minds, so saving all that info to a USB stick was a poor move.

    Second, there's a good chance that this lost USB stick is not hospital property but someone's personal storage device.  As I noted yesterday, we're in an age where computer storage devices are placed as impulse-buy tchotchkes at the grocery check-out line.  How is the hospital responsible for the encryption of a personal device?  It can't be, especially if it went ahead and educated the employee about patient data security.

    (Granted, they could have been a little more proactive with a security measure like AlertBoot, where plugging a USB storage device into an encrypted computer also automatically encrypts the USB device...this way, personal device or not, it gets protected.)


    Related Articles and Sites:
    http://www.detnews.com/article/20110224/BIZ/102240471/1361/Henry-Ford-tightens-security-after-patient-data-lost
    http://www.freep.com/article/20110224/BUSINESS06/110224061/0/SPORTS03/Henry-Ford-Health-System-employee-loses-flash-drive-containing-patient-information?odyssey=nav|head

     
  • Data Security: Why Credit Card Data Needs To Be Encrypted On Your Computers

    You might wonder why credit card information should not be stored on computers (unless they're protected with encryption, such as AlertBoot).  The PCI Security Standards, for example, state that primary account numbers (PANs) shouldn't be stored unless they're secured with strong encryption.  Well, now you can find out why, and read all about it in real-speak, no less (i.e., no difficult techno-speak).

    Wired.com is carrying an excerpt of the book "Kingpin -- How One Hacker Took Over the Billion Dollar Cyber Crime Underground."  It's the story of one Max Vision (that's his real name, legally changed from Max Butler), a one-time white hat hacker turned bad.  Long story short: he's serving time for stealing two million credit card numbers from computers all over the US.

    If the rest of the book is anything like the excerpt, it will prove itself to be an interesting read.  The excerpt alone, however, shows us why saving credit card information in plaintext form is a bad idea.

    Stealing Card Info from POS Systems

    What did Max Vision do specifically, though?  He was able to log into PCs that were acting as point-of-sale (POS) systems by leveraging a vulnerability in Windows PCs:

    His scanning put him inside a Windows machine that, on closer inspection, was in the back office of a Pizza Schmizza restaurant...  it collected the day's credit card transactions and sent them in a single batch every night to the credit card processor. Max found that day's batch stored as a plain text file, with the full magstripe of every customer card recorded inside.

    Even better, the system was still storing all the previous batch files, dating back to when the pizza parlor had installed the system about three years earlier. It was some 50,000 transactions, just sitting there, waiting for him. [wired.com, my emphasis]

    He also exploited other vulnerabilities that gave him access to even more machines.  If you'd like to see how one hacker can create mayhem, this excerpt alone will open your eyes.

    Encrypting PCI Information

    Once you're done reading, you might wonder: how would encryption software have protected all these different businesses that were hit?

    Basically, encryption is a method for scrambling and unscrambling information.  Even the best hackers in the world call it a day and move on if they find files protected with strong encryption: it's virtually impossible to gain access to them.
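    In code, the scramble/unscramble idea looks like this minimal Python sketch, using Fernet from the pyca/cryptography library (the batch line is a stand-in built around a well-known test card number, not real data):

        from cryptography.fernet import Fernet

        key = Fernet.generate_key()   # must be kept out of the attacker's reach
        f = Fernet(key)

        batch = b"4111111111111111|12/14|..."  # stand-in for a day's card batch
        token = f.encrypt(batch)               # opaque bytes: what sits on disk
        assert f.decrypt(token) == batch       # only the key holder unscrambles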

    Granted, there are some methods that hackers could leverage to obtain the passwords to encrypted information.  For example, if a hacker is able to access a computer, chances are he's also able to install software such as a keystroke recorder, which logs every key that's pressed.  It would be easy to capture a username and password for any encrypted files.

    On the other hand, consider the 50,000 credit card transactions that Max Vision stole from Pizza Schmizza and promptly deleted.  One reason he could afford to delete them, I imagine, is that nobody would miss them: he had seen that no one was accessing those batch files.  And that's exactly why encryption would have protected them: files that are rarely accessed, if ever, give a keystroke logger nothing to capture.


    Related Articles and Sites:
    http://www.wired.com/threatlevel/2011/02/kingpin-excerpt/

     
  • HIPAA Encryption: Fines Not Related To Data Security Are Also Something To Think About

    From time to time I cover stories and issues where HIPAA/HITECH and medical data encryption intersect.  Today, I'm going to just observe that HIPAA/HITECH involves more than patient data security, and that the HHS is not a sleeping lion anymore.

    Cignet Health Fined for Not Cooperating

    On February 22, the U.S. Department of Health and Human Services' (HHS) Office for Civil Rights (OCR) imposed a fine of $4.3 million on Cignet Health of Prince George's County, MD.

    Of the total amount, $1.3 million was for denying patients access to their own medical files, which is a violation of the HIPAA Privacy Rule.  Under this rule, patients must be provided a copy of their medical records no later than 60 days from the original request (the baseline is 30 days, which can be extended by an additional 30 days if the patient is notified of the reason for the delay).

    The other $3 million was imposed because:

    During the investigations, Cignet refused to respond to OCR’s demands to produce the records. Additionally, Cignet failed to cooperate with OCR’s investigations of the complaints and produce the records in response to OCR’s subpoena. OCR filed a petition to enforce its subpoena in United States District Court and obtained a default judgment against Cignet on March 30, 2010. On April 7, 2010, Cignet produced the medical records to OCR, but otherwise made no efforts to resolve the complaints through informal means.

    OCR also found that Cignet failed to cooperate with OCR’s investigations on a continuing daily basis from March 17, 2009, to April 7, 2010, and that the failure to cooperate was due to Cignet’s willful neglect to comply with the Privacy Rule. Covered entities are required under law to cooperate with the Department’s investigations. The CMP for these violations is $3 million. [sunherald.com, my emphasis]

    Dang, what the heck was Cignet thinking?  They refused to respond and to cooperate?  Plus, if the OCR was able to obtain a default judgment, it implies that Cignet didn't even bother to show up in court when summoned.  That's essentially saying, "hey, I admit to whatever I'm being accused of and agree to make it legally binding."  And when Cignet did eventually deliver the patient records,

    Cignet delivered 59 boxes of records to the U.S. Justice Department, which contained not only the records of the 41 patients, but also the records for 4,500 other patients who did not request their release. [medpagetoday.com]

    As if that weren't enough to annoy the OCR.  If anything, the OCR now probably has a reason to give Cignet another ginormous fine, since Cignet has unnecessarily disclosed PHI to a party that didn't ask for it: the OCR didn't request the other 4,500 patients' information, so...Cignet really shouldn't have handed it over.


    Related Articles and Sites:
    http://www.hhs.gov/news/press/2011pres/02/20110222a.html
    http://www.hhs.gov/ocr/privacy/hipaa/enforcement/examples/cignetresolutionagreement.html
    http://www.healthleadersmedia.com/page-1/LED-262929/HHS-Issues-Civil-Money-Penalty-for-Privacy-Rule-Violations
    http://www.sunherald.com/2011/02/22/2883973/hhs-imposes-a-43-million-civil.html
    http://www.medpagetoday.com/PublicHealthPolicy/HealthPolicy/25036

     