
AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.
  • HIPAA Encryption: Medical Record Theft From Shed Affects 40,000 In Jersey City

    Tens of thousands of patients were affected by a medical data breach in New Jersey.  Patient health records, collected between 1982 and 2009, were stolen from the shed of a Jersey City doctor.  The story would be unremarkable except for the number of people involved and the fact that information was stored on paper.  This is one of those instances where HIPAA encryption software wouldn't have helped.

    But it does raise a question:  are medical professionals kidding themselves when they proclaim that a data "container" of any kind (be it a laptop, a USB drive, a tub full of discarded x-rays, or a box full of files) was stolen for its resale value, and not for the data inside?

    Terrible Security

    According to nj.com, the records were stolen from the grounds of a doctor's practice.  On those grounds was a storage shed.  In that shed were stored the medical records of patients.  The thing securing this PHI?  Two latches on the shed that "had been cut with an unknown cutting tool."  (I'd bet good money the tool was a bolt cutter; they're cheap and pretty portable, which is why bike thieves use them.)

    The breach involved SSNs, dates of birth, addresses, and medical histories.  Despite knowing what type of information was breached, the doctor was unable to name any of the patients when filing a police report.

    There is no easy, foolproof way of securing paper documents.  This is especially true for solo practitioners who cannot afford the professional services of storage experts like Iron Mountain (which, despite its name and line of business, has been at the center of a number of data breaches over the years).  What can an individual do?  Storing the records in a shed sounds like a solution, although not an ideal one.

    Had it been me, I'd probably have done something similar but used a storage facility with security personnel and closed-circuit cameras.  It's not perfect, but it's better than a random shed.  Plus, there is a certain degree of safety in anonymity: a shed owned by a doctor obviously stores the doctor's stuff, but which unit among tens or hundreds of identical others belongs to him?  A crime of opportunity is much harder to pull off in the latter case.

    Was It for the Data?

    While the contents of the shed were not listed, it stands to reason that it contained something easier to flip on the streets than a bunch of documents.  Yet, there's no mention of other stuff.  The patient files were all that was taken.

    And with medical data theft (and its illegal sale) growing at astronomical rates, the only conclusion to be drawn here is that the break-in is inextricably tied to data theft.

    Things become more nebulous, however, if theft involves something with tangible value, such as a laptop full of medical data.  Was the medical data the target or the laptop?  Many medical organizations will say it's the latter.  But there's no way to be sure.

    Furthermore, that's not the right question to ask.  The right question is, "seeing how a laptop full of PHI was stolen, what are the chances that the PHI will be breached?"  With incidents like the above, it wouldn't be unusual to find that thieves check the contents of a stolen laptop to see whether they've landed a bigger fish than it appears.

    Related Articles and Sites:
    http://www.phiprivacy.net/nj-medical-records-of-40000-patients-stolen-from-shed-behind-doctors-office/
    http://www.nj.com/hudson/index.ssf/2014/10/medical_records_of_40000_patients_stolen_from_jersey_city_doctors_office_police_say.html
    http://www.nj.com/hudson/index.ssf/2014/10/medical_records_of_40000_patients_stolen.html
     
  • HIPAA Laptop Encryption Used In Stolen South Carolina Dept. Of Mental Health Computer

    I don't think I've ever come across a story where a laptop stolen from a car turns out to be protected according to HIPAA standards.  Well, there's always a first time for everything: according to thestate.com, laptop encryption was used on a stolen computer that belonged to the South Carolina Department of Mental Health.  And while the department has earned its Teflon shield, the same cannot be said of the employee.

    Parked, Unlocked Car

    If there was ever a reason to use encryption software, the following story is it.  An employee with the above-mentioned department lost her state-issued laptop, cellular phone, keys, and a Wi-Fi hotspot device, among other things.  And while she's a victim of theft, she cannot avoid blame, either.

    For you see, the items were stolen from a car.  A car that was apparently parked unlocked.  Regardless of how safe the neighborhood, doing so is just asking for trouble.

    Of course, it's cases like this that make the benefits of encryption salient: despite knowing that leaving a car parked and unlocked is a bad idea, people do it all the time.  The fact that there are valuables inside apparently does not matter.

    In light of such actions, does it surprise anyone that employee education on data security procedures and best practices doesn't really work?

    Chicken and Egg

    There is room for debate here, though: couldn't it be that the employee acted a bit irresponsibly knowing that the laptop was encrypted?  You know, like how people started driving faster once seat belts became mandatory?  And then faster still once airbags were mandatory as well?

    My guess is that the answer is "no" in this particular circumstance, since it wasn't only the laptop that was stolen.  A bunch of other stuff, presumably personal in nature, was stolen as well – and my guess is that these were not protected as well as the laptop had been.

    Why is This News?

    You got me.  Under HIPAA, the conditions are met for safe harbor from the Breach Notification Rule.  Furthermore, South Carolina is one of those states where the use of encryption provides safe harbor from state breach notification laws.  (And let's not forget that encryption provides real protection.  It's not just some item you cross off a list to "prove" you're meeting an abstract requirement that's part of some bureaucracy.)

    My personal guess is that an intrepid reporter found a case filed with the police department, and got wind of the data breach that way.  Had it not been for the theft of personal items, chances are that the public would not be aware of such a happening.
     
    Related Articles and Sites:
    http://www.thestate.com/2014/10/17/3751592_sc-mental-health-workers-laptop.html?rh=1
    http://www.phiprivacy.net/sc-mental-health-workers-laptop-stolen-in-rock-hill/
    http://www.perkinscoie.com/en/news-insights/security-breach-notification-chart-south-carolina.html
     
  • Data Encryption: TD Bank Settles For $850,000 After 1.5-Yr Long Investigation Over Lost Data Tapes

    In 2012, TD Bank reported the loss of backup tapes.  With over 250,000 people affected, it was one of the top data breaches for that year.  The lack of data encryption on the tapes – plus the fact that this happened in Massachusetts, which is considered to have one of the most onerous data security and notification laws in the country – meant that the financial institution had to report the breach.

    The site databreaches.net is reporting that the bank has announced a multi-state settlement of $850,000.  This is on top of whatever monies were used to deal with the immediate aftermath of the data breach (notifying clients, setting up call centers for people demanding to know more, carrying out forensic investigations, etc).

    All in all, there's nothing new here when it comes to the details of the resolution.  You've got the promises to do better, to educate employees, to upgrade security, etc.  However, some things did catch my eye.

    9 AGs Involved, 1.5 Years

    Finding closure to the data breach took 1.5 years: the breach was reported in October 2012 (although the data breach itself took place 8 months earlier).  In addition, nine state attorneys general were involved – which presumably means residents in nine states were affected.

    Although there's no way to tell how often the bank had to accommodate demands for access and information, I imagine it couldn't have been easy.  It must have been costly, diverting manpower and man-hours to something that doesn't contribute to the bottom line at all (unlike data security solutions such as encryption software, which are also expenditures without a positive ROI, but which, like insurance, at least carry an upside should something go wrong).  To have done so for nine different AGs must have been something else.

    Backup Tapes will be Encrypted

    As part of the settlement, the bank has agreed that "no backup tapes will be transported unless they are encrypted and all security protocols are complied with."

    I'm surprised that this is even a thing…but then again, not so much.  Most reasonable people would of course ensure that similar incidents do not recur.  In TD's case, since it's impossible to guarantee that backup tapes won't go missing again (that's outside the bank's control), the best it can do is encrypt the sensitive data on the tapes (which is within its control).
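
    TD Bank's actual tooling isn't public, but the principle is simple enough to sketch.  Below is a minimal illustration (my own, assuming Python's third-party cryptography package) of encrypting data before it ever touches transportable media, so that a lost tape yields only ciphertext:

    # A minimal sketch of "encrypt before transport" (not TD Bank's
    # actual process). Assumes: pip install cryptography
    from cryptography.fernet import Fernet

    # The key never travels with the tape; it stays behind in a key
    # management system. Losing the tape then means losing ciphertext.
    key = Fernet.generate_key()

    def encrypt_for_transport(backup_blob: bytes, key: bytes) -> bytes:
        """Encrypt a backup blob before writing it to transportable media."""
        return Fernet(key).encrypt(backup_blob)

    def restore_from_tape(ciphertext: bytes, key: bytes) -> bytes:
        """Decrypt a recovered tape; raises InvalidToken if tampered with."""
        return Fernet(key).decrypt(ciphertext)

    tape_payload = encrypt_for_transport(b"customer account records...", key)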

    On the other hand, I've already witnessed instances where organizations repeatedly fall victim to one breach or another that happens to be similar in nature: unencrypted laptop stolen from car; unencrypted laptop stolen from van; unencrypted laptop stolen from home during burglary; etc.  Despite the superficial differences, it's really just one kind of data breach – and one that can be easily combated by the same action: encrypt your laptops.

    Related Articles and Sites:
    http://www.databreaches.net/nys-attorney-general-schneiderman-announces-multi-state-settlement-with-td-bank-over-lost-backup-tapes/

  • Encryption Law: EU Ministers Say Encryption Must Provide Safe Harbor From Publicizing Data Breach

    Details are beginning to emerge about the European Union's data protection laws.  As with many laws before it, the use of encryption software is being encouraged in order to safeguard the private information of people who fall under the EU's jurisdiction.  In addition, a number of other details are being considered as well, such as the use of anonymized data to satisfy data security requirements.

    Personal Data Breach Notification

    According to out-law.com, the EU committee has agreed that breach notifications must be made to regulators within 72 hours of discovery – assuming that it "'may result in physical, material or moral damage' to individuals."  Whether this language will constitute a loophole that undermines the breach notification laws remains to be seen. (For example, a person who's overseeing the breach may conclude that the risks are negligible – not because that happens to be the case but because going public with the breach would be bad business).

    A second similar problem that I see is the use of the term "undue delay":

    organisations would also face a new obligation to inform consumers "whose rights and freedoms could be severely affected" by a personal data breach of such an incident "without undue delay".

    What exactly is an undue delay?  Is it a day?  A week?  A month?  A year?  The use of such linguistic loopholes tends to create havoc for those the law intends to protect.  It's the reason why in the US, the set of federal rules governing patient information (aka HIPAA) requires that a notification be sent as soon as possible but no later than 60 calendar days after discovery.  Pass this mark and you're in breach of the law.
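
    The HIPAA deadline, by contrast, is mechanical enough to compute.  A trivial illustration of the 60-calendar-day clock (my own example, not taken from any regulation text):

    from datetime import date, timedelta

    # HIPAA's Breach Notification Rule: notify without unreasonable
    # delay, and no later than 60 calendar days after discovery.
    HIPAA_WINDOW = timedelta(days=60)

    def notification_deadline(discovered: date) -> date:
        """Last calendar day on which notification is still timely."""
        return discovered + HIPAA_WINDOW

    print(notification_deadline(date(2014, 10, 1)))  # 2014-11-30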

    Likewise, the final version of HIPAA struck out the "harm threshold" requirement, which read much like the "may result in physical, material or moral damage to individuals" language.  The reason?  It became quite obvious that such a clause invited abuse.

    Encryption Software Use Encouraged

    The committee also agreed that the use of data encryption would be grounds for safe harbor from the reporting requirements, provided organisations apply

    "appropriate technological protection measures" to protect the data that has been lost or stolen from being accessed by people not authorised to see it.

    "Such technological protection measures should include those that render the data unintelligible to any person who is not authorised to access it, in particular by encrypting the personal data,"

    Which is a good move, seeing how experts agree that the use of encryption provides more than adequate data security.  However, there is some room for improvement here, as not all encryption is created equal.  Perhaps the law should specify the use of strong encryption, or perhaps do what the HIPAA regulators have done and defer the definition of encryption to a body that oversees such matters (NIST, in HIPAA's case).
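
    To make "strong encryption" concrete: a NIST-approved primitive such as AES-256 in GCM mode is the kind of thing a law could point to.  Here's a bare-bones sketch using Python's third-party cryptography package (an illustration of the primitive, not of a full disk encryption product):

    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)  # 256-bit key
    aesgcm = AESGCM(key)

    nonce = os.urandom(12)  # GCM nonce: must be unique per message
    ciphertext = aesgcm.encrypt(nonce, b"personal data", None)

    # Decryption fails loudly (InvalidTag) if the ciphertext was
    # altered, which is what separates authenticated encryption
    # from weaker, unauthenticated schemes.
    plaintext = aesgcm.decrypt(nonce, ciphertext, None)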

    Likewise, the nod to pseudonymization as a data security measure is quite puzzling, as experts are increasingly finding that this form of data protection doesn't really work, especially when combined with Big Data initiatives.  Although laws tend to lag behind technology, there's no reason why they have to when the evidence is well within reach.

    There is, of course, still plenty of time left for arriving at a final set of rules.  One hopes the EU will play it smart and not repeat the mistakes others have made before them.

    Related Articles and Sites:
    http://www.databreaches.net/businesses-should-not-need-to-publicize-personal-data-breaches-if-data-is-encrypted-say-eu-ministers/
    http://www.out-law.com/en/articles/2014/october/businesses-should-not-need-to-publicise-personal-data-breaches-if-data-is-encrypted-say-eu-ministers/
     
  • HIPAA Disk Encryption: Laptop Theft Affects 3400 In Georgia

    If an organization announces a data breach, but does not reveal whether it used data security software – like AlertBoot's managed HIPAA disk encryption solution (http://www.alertboot.com/) – is there a way to tell whether it was indeed using it?  You can if you're affected by HIPAA.

    Georgia Department of Behavioral Health and Developmental Disabilities

    According to phiprivacy.net, the Georgia Department of Behavioral Health and Developmental Disabilities (DBHDD) has alerted nearly 3,400 people that their information was breached when an employee of the department lost his laptop at a conference.  Well, "lost" isn't the right word.  It was stolen.  From the employee's car.

    Directly as a result of the theft, DBHDD has sent letters to the affected patients and followed other steps mandated by HHS:

    Because the laptop contains information of more than 500 individuals, the Health Insurance Portability and Accountability Act (HIPAA) requires that DBHDD notify the media about the incident. We have followed the reporting procedures mandated by the U.S. Department of Health and Human Services. The media notice, letter and this website provide information on how to contact DBHDD and federally-approved companies that offer free credit reports and free fraud alerts on those credit reports.

    It's not mentioned whether encryption software (http://www.alertboot.com/disk_encryption/mobile_security_byod_mdm.aspx) was used, although the notice does acknowledge that "there are security measures in place on the laptop which will wipe the data and prevent access to the PHI if an unauthorized user attempts to access the internet."

    Now, this could refer either to (a) disk encryption software (http://www.alertboot.com/disk_encryption/disk_encryption_product_tour.aspx) whose key can be erased from a remote source, like AlertBoot's, or to (b) something that's not an encryption solution but manages to erase the information nonetheless.

    If I were a betting man – and I am – I would say that the notice is referring to the latter for the following reasons.
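
    Before getting to those reasons, for what it's worth, option (a) generally works along these lines.  The sketch below is hypothetical (the class and method names are mine, not AlertBoot's actual interface): the disk is always encrypted, and the console's only job is to order the device to destroy its own key.

    import os

    class EncryptedDisk:
        """Hypothetical model of remote key destruction, option (a)."""

        def __init__(self):
            # The data encryption key (DEK) lives only on the device;
            # everything written to disk is encrypted under it.
            self._dek = os.urandom(32)

        def check_in(self, console_orders: dict) -> None:
            """Runs when the device phones home (e.g., on internet access)."""
            if console_orders.get("wipe"):
                self.wipe_key()

        def wipe_key(self) -> None:
            """Erase the DEK; the ciphertext on disk becomes unrecoverable."""
            self._dek = None

    disk = EncryptedDisk()
    disk.check_in({"wipe": True})  # a stolen laptop touches the internet...
    assert disk._dek is None       # ...and its data is effectively gone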

    HIPAA, GA Data Breach Notification Law, and Encryption

    A strong indication that encryption was not used lies in the department's actions as detailed in the blurb quoted above.  Under HIPAA, the use of strong encryption provides safe harbor from all those things that HHS mandates.

    Which is not a bad deal, seeing how going public with a data breach that has a real possibility of identity theft and other crimes leads to lawsuits; to appropriating funds to rectify the mess, including contacting breach victims and setting up answering services; to a loss of face in the community at large as well as nationwide; and to an investigation by HHS, which could mean monetary penalties (up to $1.5 million per violation), annual security reports to HHS for up to 20 years, and a full-blown inquiry into the policies and practices of the breached organization.  The latter can take years to complete.

    In short, if you had used encryption, you wouldn't be going through the Breach Notification Rule's mandates.  If you are going through them, the odds are extremely good that you didn't use encryption.

    In addition, Georgia is one of those states where the state's own data breach notification law provides safe harbor for encrypted data.  If the DBHDD breach had taken place in New York, one of the few states where there is no such protection, the department might still have had to report the breach (I'm not sure whether the law provides an out for data governed by HIPAA), which would provide an alternate reason for announcing the theft of a laptop with PHI.

    Otherwise, there's no real reason to do so: legally, you're not required to, and the protection afforded by encryption is real (not just some theoretical concept on a professor's whiteboard), so you're not doing wrong by your patients.

    Related Articles and Sites:
    http://www.phiprivacy.net/georgia-department-of-behavioral-health-and-developmental-disabilities-notifies-almost-3400-of-breach/
     
  • Managing Smartphone Encryption: Rehashing Myths

    The folks over at vice.com have commented on the latest smartphone security debacle – namely, the turning on of smartphone disk encryption by default – and the complaints from law enforcement over this decision.  Befitting the nature of the site, vice.com notes why law enforcement is wrong to raise the alarm, and how some well-meaning people have bought into its arguments because they don't know better.

    Encryption Wars Redux

    What's probably most frustrating to people who are opposing the government's stance is that we've all been here before.  The encryption wars of the 1990s, where the government tried to rein in the use of cryptographic tools, covered the same arguments that are being made today, and led to the logical conclusion that backdoors should be anathema to everyone – including the government.

    The government's requirement that a backdoor be installed on security solutions for law enforcement is beyond the pale because it can't be guaranteed that only the government will be able to use it.

    Think about it.  Think about all the data breaches we've seen and heard of where hackers from Russia, or some Baltic state, or China, or wherever compromised the security of banking giants (who purportedly use the latest technology in security and hire the brightest), or the security of some government agency (including the military), or even the leading tech companies like Google.

    Granted, in these cases, it wasn't really a backdoor that was manipulated – there aren't any backdoors, as far as I know – but bugs, security holes, and other weaknesses.  From a technical standpoint, however, there is no difference between these weaknesses and a backdoor, although there is a difference in terms of policy or intent: a backdoor is put there on purpose.

    In other words, a backdoor is a weakness you plant on purpose.  That's it; nothing more, nothing less.  And while the government can promise to only use it in accordance with the law, what it cannot do is promise that everyone else who finds this backdoor will stick by that promise.

    Or, as the authors at vice.com put it more eloquently:
    So the next time a law enforcement official demands that Apple and Google put backdoors back into their products, remember what they're really demanding: that everyone's security be sacrificed in order to make their jobs marginally easier. Given that decreased security is only one of several problems raised by the prospect of cryptography regulation, you should ask yourself: Is that trade worth making?
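
    To put that trade in code: below is a toy, deliberately bad scheme (entirely hypothetical, using Python's third-party cryptography package) where a single escrowed master key plays the role of the backdoor.  Whoever holds that key reads everything, whether they came by it lawfully or not:

    from cryptography.fernet import Fernet

    # The backdoor: one master key escrowed "for law enforcement."
    MASTER_KEY = Fernet.generate_key()

    def backdoored_encrypt(message: bytes) -> bytes:
        # Every user's message is encrypted under the escrowed key,
        # so anyone holding MASTER_KEY can read it.
        return Fernet(MASTER_KEY).encrypt(message)

    # A leak, hack, or insider turns the "lawful access" key into just
    # another stolen credential; technically there is no difference.
    stolen_key = MASTER_KEY
    print(Fernet(stolen_key).decrypt(backdoored_encrypt(b"private chat")))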

    It's like that Refrigerator Joke

    This latest fight over encryption reminds me of that old observation: a person opens the fridge late at night, looking for something to munch on.  He (or she – but usually he) finds nothing to his liking and closes the refrigerator door.  He comes back 5 minutes later and opens it again, eyeing the contents, then closes the door; and then comes back again… despite the fact that nothing has changed.

    Likewise with encryption and the argument for a backdoor.  Nothing has fundamentally changed in terms of the argument against encryption (and hence the need for a backdoor), while the arguments for the use of encryption have increased dramatically.

    Related Articles and Sites:
    https://news.vice.com/article/what-default-phone-encryption-really-means-for-law-enforcement
     