AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

May 2010 - Posts

  • Data Encryption: School (K-12) Medical Records Are Protected By HIPAA? (Updated)

    I came across an article that left me scratching my head.  According to lahontanvalleynews.com, Tom Considine opined in his show "Who Complies" that a school was in breach of HIPAA because it was disposing of student health files by dumping them at a landfill.  Sure, it's an issue quite removed from laptop encryption software stories, but I couldn't help looking into it.

    (My second update to this post contains a link to "Joint Guidance on the Application of FERPA and HIPAA to Student Records," which should clear up a lot of issues regarding whether a student record comes under the auspices of FERPA or HIPAA.)

    I Think Tom's Wrong; FERPA Applies, Not HIPAA

    I've read Considine's words here, and I've got to wonder whether he actually looked into the issue.

    I did a quick check on-line, and found that, for the most part, school records don't fall under HIPAA.  Rather, FERPA comes into play, as detailed at privacyrights.org ("Health records kept by schools are classified as 'education records' covered by the Family Educational Rights and Privacy Act (FERPA)") and worldprivacyforum.org, which notes the same (unless we're talking about private schools; apparently, they work under a different set of rules).

    FERPA, if you're not aware, stands for Family Educational Rights and Privacy Act and is overseen by the Department of Education.  (If you've ever had to go to the school medical center during your college years, you've probably come across FERPA literature one way or another).

    FERPA Doesn't Require Encryption

    As far as I can tell, FERPA doesn't require the use of encryption for sensitive information, including health information.  This probably accounts for the rash of university-related data breaches that I've encountered in the news.  The Georgetown U. alumni database theft from a couple of years back comes to mind, for example.

    Hm.  Perhaps FERPA should look into data protection.  I'll admit I don't know whether to take this at face value, but Considine claims that "schools are targeted five times more for identify theft because students may not learn about it until years later" (my emphasis; five times more than whom, or what, is a valid question, I think).

    If this is true, considering how many schools and students we have in the US--and how few of them have adequate security for protecting assets like computers--there might be a potential minefield there that hasn't attracted much notice.

    (Update: I was about to publish this post when I noticed that databreaches.net had actually come to the same conclusion regarding HIPAA vs. FERPA.  Serves me right for not reading until the end; it could have saved me a lot of time.)

    (Update, 25 May 2011) Found an excellent resource on FERPA vs. HIPAA: the Joint Guidance on the Application of FERPA and HIPAA to Student Records has nuggets like these:

    At the elementary or secondary school level, students’ immunization and other health records that are maintained by a school district or individual school, including a school-operated health clinic, that receives funds under any program administered by the U.S. Department of Education are “education records” subject to FERPA, including health and medical records maintained by a school nurse who is employed by or under contract with a school or school district.

    Some schools may receive a grant from a foundation or government agency to hire a nurse.   Notwithstanding the source of the funding, if the nurse is hired as a school official (or contractor), the records maintained by the nurse or clinic are “education records” subject to FERPA.


    Related Articles and Sites:
    http://www.databreaches.net/?p=11952

     
  • Disk Encryption Software: Loma Linda Hospital Reports Computer Theft

    Man, coming in to work on Memorial Day is a tough deal.  At least it's given me time to catch up with data breach and security news.  For example, this blurb that escaped my notice last week: the theft of a computer from Loma Linda Hospital, containing information that I assume was not protected with hard drive encryption software.

    More Than 500 Affected, Desktop Computer Stolen

    According to pe.com, a desktop computer was stolen from an administrative office.  The computer contained patient names, medical record numbers, diagnoses, surgery dates, and the types of procedures undergone by the patients.  There is no mention of whether financial information or SSNs were present; however, seeing how the computer was stolen from the Department of Surgery, one imagines such information wouldn't necessarily be there.

    What Commentators Say

    Two comments at pe.com caught my attention.

    First, guenavere noted how the new "Healtcare bill...states personal information both financial and health will be obtained by the government. I wouldnt be surprised if this is the beginning of it." (all misspelling errors her own).  Okkkkaaaaayyyy.....

    All the more reason for using encryption software, like centrally managed encryption, on patient records, then.

    Second, a comment by tax payer asks whether password protection was used.  Actually, this also relates to a comment by wkenddad: password-protection is not really protection.

    In fact, under HIPAA, password-protection has been effectively given the status of "not really providing protection."  How else can you explain that only the use of encryption or the destruction of patient data is afforded reprieve from sending notification letters when a breach takes place?

    Or the fact that "secure" health information is literally defined as encrypted or destroyed information?

    On the Assumption that Encryption Was Not Used

    If you follow the "related articles and sites" links below, you'll notice that there is scant information on the breach.  Which might leave you asking: how do I (the author) know, or why can I assume, that an encryption program was not used to protect the contents of the stolen desktop computer?

    The answer lies partly in the above HIPAA requirement on notifications: since there was a notification, it can be assumed that encryption was not used.

    Furthermore, the state of California--where Loma Linda Hospital is located--also has similar requirements regarding breached medical information.  As I recall, they, too, give safe harbor from reporting requirements if information has been encrypted.

    I'm also operating under the assumption that no medical entity would want to burden their patients by raising a false alarm--I mean, the use of encryption software like AlertBoot would have nullified any threats possibly arising from this particular theft.

    Related Articles and Sites:
    http://www.pe.com/localnews/stories/PE_News_Local_D_nb26_information.3353e01.html
    http://www.mercurynews.com/news/ci_15165109?nclick_check=1

     
  • Laptop Encryption Software: Cincinnati Children's Hospital Breach Affects 61,000 Records

    A laptop computer was stolen from Cincinnati Children's Hospital Medical Center, resulting in the loss of 61,000 patient records.  The details surrounding the incident show that this is a clear violation of HIPAA.  If only they had used drive encryption software, they might have saved themselves a lot of money in terms of notification costs and potential fines.

    Another Theft from a Car

    According to cincinnati.com, the laptop was "stolen from a hospital employee's personal vehicle while it was parked outside the employee's home in late March."

    Password-protection was used on the stolen laptop, but the information did not make use of laptop encryption, which would have provided a far greater degree of security.  Patient names, medical record numbers, and services provided were part of the personal information that was breached.  SSNs, credit card numbers, and phone numbers were not included.

    Notification letters were sent out to several states and foreign countries.

    It was noted by the hospital spokesman that "it was appropriate for the employee to have the laptop outside the work setting."

    I...disagree.  I think the employee was allowed to have the laptop outside the work setting.  I don't know that it was appropriate under the circumstances.

    Also, I might be working with the quote out of context, but I do note that the spokesperson didn't say anything about the employee being authorized to take the data outside the work setting, although that's the implication.

    This is Clearly a HIPAA Violation

    There is no getting around it: this is a HIPAA violation, and the admission by the hospital's spokesman just confirms it.  Under HIPAA, protected health information (PHI) must be secured, preferably with encryption if one's dealing with digital data.

    While there is no requirement to encrypt PHI--a covered entity, in this case the hospital, can forgo the security measure if it believes that PHI is safe enough--the covered entity must provide a reason as to why it believes PHI is safe without encryption.

    Now, "it's inside a car" is not a legitimate reason/defense.  And, when you combine the fact that the employee was authorized to carry that laptop around, and by implication the PHI...well, you can't blame the employee for this particular breach.  It was up to the hospital to make sure that this device was protected.

    It could very well be that someone in their IT department dropped the ball.  After all, the place does have over 10,000 employees and is the fourth largest company in Cincinnati by employee count (the top three are Kroger, the University of Cincinnati, and P&G, respectively, if you're interested.  Walmart comes a distant seventh).

    Assuming the computer count is half of the number of employees, we're talking about 5,000 computers.  I don't know what the Children's Hospital's position is when it comes to disk encryption, but assuming they had a policy of complete coverage for all endpoints, it would be easy to miss a laptop or two.

    Or, it could even be a case where the user of the laptop disables the encryption in place because he or she feels "it slows things down."

    Centrally Managed Encryption Could Have Helped

    Yes, this is a plug for our AlertBoot encryption as a service, but the points I bring up are pretty salient:

    One, encryption as a managed service means a centralized database for managing encryption keys (which is an important objective under HIPAA when it comes to encryption).  This means easy key management, which means the hospital's IT department would have had an easier time with their encryption deployment.

    Two, easy audits: the built-in encryption audit report, which generates the report in real-time, shows which machines are encrypted, which ones had a problem with encryption, and which ones haven't even been touched by the encryption initiative at all.  If the stolen laptop above was one of those that fell through the cracks, the IT department would have been able to tell and resolve the issue.

    Three, proving compliance:  A recent survey from the Ponemon Institute showed that only 44% of companies would have been able to prove that they had secured their data.  Seeing how HHS has gained the power to fine covered entities up to $1.5 million for data breach incidents, the ability to prove compliance--using the same audit reports mentioned above--is not a small matter.

    Four, notifying patients.  Under HITECH-amended HIPAA, a covered entity is required to send breach notification letters to patients.  Seeing how over $150 is spent per record when it comes to patient notification and other follow-up measures (setting up toll-free lines for answering patient questions, etc.), Cincinnati Children's Hospital is looking at a potential outlay of at least $9 million.  The only way to avoid it all would have been by encrypting their PHI.

    And that's before HHS starts looking into the issue and assessing fines, if any.
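
    To make the back-of-the-envelope arithmetic explicit, here is a quick sketch using only the figures cited above (61,000 records, roughly $150 per record, and the $1.5 million HHS fine ceiling); these are rough estimates from the post, not actual costs or fines.

        # Rough notification-cost estimate using the figures cited in this post.
        # Back-of-the-envelope numbers only -- not actual costs or fines.
        records_breached = 61_000      # patient records on the stolen laptop
        cost_per_record = 150          # approximate notification/follow-up cost per record (USD)
        max_hhs_fine = 1_500_000       # HHS fine ceiling mentioned above (USD)

        notification_cost = records_breached * cost_per_record
        print(f"Estimated notification outlay: ${notification_cost:,}")  # $9,150,000
        print(f"Plus a worst-case HHS fine: ${notification_cost + max_hhs_fine:,}")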

    Related Articles and Sites:
    http://news.cincinnati.com/includes/interstitial/ad.html
    http://www.phiprivacy.net/?p=2824

     
  • Disk Encryption: Towers Watson Information Breach The Next Colt Express?

    databreaches.net has noted that a third entity has reported a data breach related to two DVDs missing from Towers Watson, and is wondering whether this is the next Colt Express situation.  If you'll recall, Colt Express suffered a break-in, and computers with sensitive information--which were not protected with disk encryption software like AlertBoot--were stolen, a disastrous event for a benefits administration company.

    Eventually, Colt Express's breach went on to affect companies such as Google, CNet, CA (Computer Associates), and others.

    Towers Watson - 3 Affected So Far

    The entities affected so far by the Towers Watson breach (well, the ones that went public):

    • Lorillard Tobacco: at least 2,700 people, per my reasoning
    • General Agencies Welfare Benefits Program: 1,874 people
    • City of Charlotte, NC: 5,200 people

    Towers Watson was, at least in the cases reported so far, also in charge of benefits administration, so therein lies the parallel to Colt Express.  But that's also where the parallels end.

    TW is a company with a physical presence around the globe; Colt Express was going through bankruptcy proceedings at the time of the breach.  Colt Express suffered a break-in; TW lost two DVDs that were sent as part of a shipment.  Colt Express, it could be argued, didn't have control over the breach, whereas TW did.

    Accidents: Unintentional but Preventable

    TW sent those DVDs unencrypted, despite knowing that the information contained within was sensitive in nature.  Of course, the breach itself is an accident.  However, it behooves us to explore that word a bit.  Is this an accident in the sense that "it was not preventable" or in the sense that "we never meant for it to happen"?

    Clearly, it's the latter.  There was nothing unpreventable about the breach itself.  As a business that deals with sensitive information all the time, Towers Watson probably knew of the need for securing that data, and has no doubt used encryption software in the past for similar situations where DVDs had to be mailed (otherwise, I would have to point out that the breach was just waiting to happen).

    So, while I can appreciate the fact that this was an unintentional breach (very rare to find an intentional one, actually), I think that many would find it hard not to blame the company.  Or, more specifically, that one employee that either forgot or forwent the use of data encryption.


    Related Articles and Sites:
    http://www.databreaches.net/?p=11855
    http://www.charlotteobserver.com/2010/05/26/1459410/some-charlotte-workers-personal.html

     
  • iPhone Encryption Is For Naught Under Linux

    There are reports that an iPhone will reveal its contents when hooked up to the newest release of Ubuntu Linux.  This, despite the fact that the latest iPhone generation--the iPhone 3GS--comes with built-in hardware encryption.  Goes to show that "having disk encryption" and "having your data protected" are not always the same thing.

    Ubuntu 10.04 Lucid Lynx Compromises iPhone

    When an iPhone is hooked up to a computer with the latest version of Ubuntu Linux, all the security in place falls by the wayside.  Techie-buzz.com put it best:

    Apple has more than once, boasted about the hardware data encryption used on its flagship iPhone. The hardware encryption uses a 256-bit AES and is an in your face feature as it cannot be disabled by users even if they want to.

    An iPhone can be connected to a PC just like any other device though the connection requires the standard methods of authentication by a passcode and an initial pairing. Further, connecting a locked iPhone to a computer is also not possible.

    As security researchers Marienfeldt and Herbeck found out though, Lucid Lynx, the latest Ubuntu distro, makes a mockery of the iPhone's security:

    I uncovered a data protection vulnerability [9], which I could reproduce on 3 other non jail broken 3GS iPhones (MC 131B, MC132B) with different iPhone OS versions installed (3.1.3-7E18 modem firmware 05.12.01 and version 3.1.2-7D11, modem 05.11.07), all passcode (4 digits) protected which means the vulnerability bypasses authentication for various data where people most likely rely on data protection through encryption and do not expect that authentication is not in place.

    To clarify, the given file access is read and write !

    The unprotected iPhone 3GS mounting is “limited” to the DCIM folder under Ubuntu < 10.04 LTS, Apple Macintosh, Windows 2000 SP2 and Windows 7. The way Ubuntu Lucid Lynx handles the iPhone 3GS [6,7,8] allows to get more content. [Bernd Marienfeldt]

    So, Where's the Encryption, Then?

    Stuck in a bad implementation, apparently.  According to Chester Wisniewski at Sophos, here's what he noticed about iPhones:

    If you turn on an iPhone it boots all the way up and allows access from USB.

    If the device boots, it must be able to access the encryption key without a passphrase. In turn this means it is as good as unencrypted as soon as it is turned on. [Sophos]

    Contrast this with full disk encryption like AlertBoot on a laptop or desktop computer: the machine will not boot up until the correct password or passphrase is typed in.  This is called pre-boot authentication, and it is meant to provide better protection where complete disk encryption is used.
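
    To make the contrast concrete, here is a purely conceptual sketch in Python; the key handling is made up for illustration and is not how AlertBoot or Apple actually implement it.  With pre-boot authentication, the disk key can only be derived from a passphrase the user supplies; without it, the device can hand the key to the operating system on its own as soon as it boots.

        import hashlib, os

        # Conceptual sketch only -- not AlertBoot's or Apple's actual key handling.
        SALT = os.urandom(16)

        def disk_key_with_preboot_auth(passphrase: str) -> bytes:
            # The disk key is derived from the user's passphrase at boot time.
            # Without the passphrase, the key (and therefore the data) stays out
            # of reach, even if the powered-on device is plugged into another computer.
            return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), SALT, 200_000)

        DEVICE_STORED_KEY = os.urandom(32)  # stands in for a key the hardware can fetch by itself

        def disk_key_without_preboot_auth() -> bytes:
            # The device can read this key unaided, so the volume is effectively
            # decrypted as soon as the device finishes booting -- the situation
            # Wisniewski describes above.
            return DEVICE_STORED_KEY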

    Of course, the trials and tribulations of the iPhone's encryption are nothing new.  Within days of its debut last year, security experts were commenting on how the iPhone's encryption was broken.


    Related Articles and Sites:
    http://www.zdnet.com/blog/hardware/ubuntu-lucid-lynx-1004-can-read-your-iphones-secrets/8424

     
  • Data Encryption Software: AMEX Site Fails Encryption, Fixes After Public Outing

    You might have heard by now that American Express dropped the ball when it came to on-line encryption.  While it's not the same encryption that AlertBoot uses in its drive encryption--we use something far stronger--the case does highlight an important aspect of data encryption: following up and auditing.

    AMEX's Mistake - Solved Less than 4 Hours After Going Public

    It should be noted that the problem is already fixed, and it only affected people who received a particular e-mail (or had access to a particular html link, I take it).

    The problem was first brought to light by Joe Damato at timetobleed.com.  He had received a sign-up e-mail from the American Express Network, via their "Daily Wish" program.  Damato visited the site, via a link provided in the e-mail, and found that the sign-up form required a lot of personal information.

    He did some sleuthing, and initially found that secure HTTP (HTTPS) was not being used, at least not at first glance.  So, he decided to do a little more sleuthing to see whether the data would at least be sent via HTTPS.  It turned out it wasn't.  (HTTPS is an encrypted connection between your browser and the site, if you didn't know.)
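
    If you're curious how that kind of check can be reproduced, here is a minimal sketch along the same lines; the URL is a placeholder and this is not Damato's actual method.  The idea is simply to fetch the page and look at where its forms actually submit.

        from urllib.request import urlopen
        from urllib.parse import urljoin, urlparse
        from html.parser import HTMLParser

        # Minimal sketch: does a page's form submit over HTTPS?
        # PAGE_URL is a placeholder, not the actual Daily Wish sign-up link.
        PAGE_URL = "http://www.example.com/signup"

        class FormActionFinder(HTMLParser):
            def __init__(self):
                super().__init__()
                self.actions = []

            def handle_starttag(self, tag, attrs):
                if tag == "form":
                    # Resolve relative form actions against the page URL.
                    self.actions.append(urljoin(PAGE_URL, dict(attrs).get("action", "")))

        page = urlopen(PAGE_URL).read().decode("utf-8", errors="replace")
        finder = FormActionFinder()
        finder.feed(page)

        for action in finder.actions:
            encrypted = urlparse(action).scheme == "https"
            print(action, "->", "submits over HTTPS" if encrypted else "NOT encrypted")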

    Damato went public with the results by posting them on his blog (no mention of whether he gave Amex a heads-up).

    Following Up and Auditing

    The fix was up and running in less than four hours, which is great.  What's not so great is that American Express had the problem to begin with.  I mean, we're talking about a credit card company that has plenty of reasons and experience not to make such a mistake.

    While I'd like to be as incredulous as Damato regarding the situation, I've seen such instances before, more than often enough.  The best way to combat this?  Following up and routine audits, which are supposed to be part of your data security framework anyway.

    In the above case, someone should have gone through the actual site and made sure that everything worked as intended.  I'm willing to bet that someone did, but was more focused on making sure "things worked" than on security.

    When it comes to AlertBoot, following up and audits are especially important.  Because AlertBoot is a centrally managed encryption service, and doesn't require IT personnel to go around securing individual machines, it was necessary to find a method for ensuring that a corporate computer was securely encrypted.  Thus, an audit report was built in from the ground up when AlertBoot was designed.

    An added benefit to this is that the same encryption audit report can be generated at any given time to provide a real-time picture of the encryption landscape.  So, in the event that a machine is lost or stolen, one can prove that the data is secure.
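
    As a toy illustration of what such a roll-up can look like, here is a short sketch with entirely made-up machine data and field names; it is not AlertBoot's actual report format or API.

        from collections import Counter
        from datetime import datetime, timezone

        # Toy encryption-status roll-up. Machine data and field names are made up;
        # this is not AlertBoot's actual report format or API.
        machines = [
            {"host": "lap-001", "encryption_status": "encrypted"},
            {"host": "lap-002", "encryption_status": "failed"},
            {"host": "lap-003", "encryption_status": "not_started"},
        ]

        def audit_report(machines):
            summary = Counter(m["encryption_status"] for m in machines)
            return {
                "generated_at": datetime.now(timezone.utc).isoformat(),
                "summary": dict(summary),
                "needs_attention": [m["host"] for m in machines
                                    if m["encryption_status"] != "encrypted"],
            }

        print(audit_report(machines))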


    Related Articles and Sites:
    http://timetobleed.com/warning-american-express-fails-miserably-at-basic-security/#
    http://digg.com/security/American_Express_Might_Not_Be_Encrypting_Your_Transaction

     