AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size that want a scalable, easy-to-deploy solution. Centrally managed through a web-based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, and USB drive and hard disk encryption managed services.
  • Horizon BCBSNJ HIPAA Charge Over Two Laptops Settled For $1.1 Million

    Horizon Blue Cross Blue Shield of New Jersey (Horizon BCBSNJ) has settled over a data breach that affected approximately 690,000 New Jersey residents. This data breach was noted on this blog not too long ago: in January, the Third Circuit Appellate Court declared that a lawsuit against the insurer could proceed because the "improper disclosure" of personal data is a violation of FCRA.
    That is, if unauthorized people gain access to personal information via a data breach, it violates federal regulations concerning credit reports (which businesses use to vet people, be they employees, potential clients, etc.).
    As it turns out, in parallel with the above legal battle, Horizon BCBSNJ was being investigated by the Department of Health and Human Services. In the end, the company decided to settle for $1.1 million (that's $550,000 per stolen computer). And the report accompanying the settlement revealed more details about what happened.  

    Two Macs Stolen, Their Security Cables Cut

    The data breach occurred when two Macintosh laptops were stolen from the eighth floor of Horizon BCBSNJ's offices. These computers were password-protected but not encrypted. They were tied to their desks using security cables. There was no additional security.
    Which is odd, because every Mac since the early 2000s has shipped with FileVault, Apple's free disk encryption software. Why was this not used? After all, it's free and, once turned on, has no noticeable impact on a Mac's performance.  
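
    If you're curious whether a given Mac has FileVault turned on, the check can even be scripted. Below is a minimal sketch in Python that shells out to macOS's built-in fdesetup utility; treat it as an illustration of the idea, not a full auditing tool.

    ```python
    # Minimal sketch: report whether FileVault is enabled on this Mac.
    # Assumes macOS with the built-in `fdesetup` utility (OS X 10.8 and later).
    import subprocess

    def filevault_enabled() -> bool:
        """Return True if `fdesetup status` reports FileVault is on."""
        result = subprocess.run(
            ["fdesetup", "status"],
            capture_output=True, text=True, check=True,
        )
        # Output looks like "FileVault is On." or "FileVault is Off."
        return "FileVault is On" in result.stdout

    if __name__ == "__main__":
        print("Encrypted." if filevault_enabled() else "Not encrypted - flag this machine.")
    ```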

    Oops. Uh-oh. Oops, Again

    It turns out that the insurer's IT department didn't know these two computers were out there. In fact, these two and 100+ more laptops did not show up on the IT department's radar because they "had been obtained outside of the company's normal procurement process."
    This is understandable. Not excusable, mind you, but understandable. Keeping track of inventory is one of those impossible quests. Even the military fails at it, and they're trying to account for dangerous stuff. Like warheads. That laptops are not accurately inventoried should not come as a surprise.
    Of course, it's because of reasons like these that IT departments generally tend to secure a device before it's released into the wild. In the earlier half of this decade, people were still arguing that one should conduct an investigation into how a machine would be used, who would be using it, what kind of information would be stored in it, etc., and then decide what type of security to install on it, if any.
    Others pointed out that that approach is a pipe dream because nothing ever happens the way it should (a variation of Murphy's Law, if you will). Time has shown again and again that the cynical outlook is the correct one when dealing with the real world.
    Which brings us to Oops #2: it turns out that the laptops at the center of the data breach belonged to employees who were not supposed to be handling protected health information (PHI). And yet, these laptops contained PHI. Murphy strikes again.  

    Doing the Best You Can

    You've got to feel for Horizon BCBSNJ: they had implemented encryption across all machines after the company experienced a data breach involving a stolen laptop with sensitive information. They announced the completion of that particular security project in May 2008. They had taken the time to encrypt both laptops and desktop computers.
    And, five years later, the company fell victim to what is essentially the same problem: a data breach borne from laptop theft.
    But this is the wrong way to look at the situation. Without access to Horizon BCBSNJ's internal data, the following can only be speculation, but they've probably averted a great number of similar data breaches since 2008. After all, laptops are lost and stolen quite frequently, and the company has over 5,000 employees. And the bigger the organization, the greater the probability (some might say certainty) that you'll be missing a laptop each year. That the insurer did not have to report a data breach stemming from a missing computer until five years later is, in a weird way, something to be commended.
    As security professionals say, there is no perfect security. You can only minimize the chances of being affected by a data breach. Horizon BCBSNJ could have done better, obviously. But knowing what we now do about the incident, and considering what we've seen in the data security field over the past 10 years, it could be argued that the insurer did a pretty good job (with room for improvement).
     
    Related Articles and Sites:
    http://hackensack.dailyvoice.com/police-fire/horizon-blue-cross-blue-shield-pays-11m-for-customer-data-breach/700389/
    https://www.databreaches.net/horizon-blue-cross-blue-shield-pays-1-1m-for-customer-data-breach/
     
  • Australia Finally Gets A Data Breach Notification Law

    The Land Down Under is finally getting a data breach notification law. This may come as a surprise to many, since (a) most would have assumed that Australia already had one and (b) it's 2017 – unless you're a war-ravaged country, chances are you already have a breach notification law. Because that's how bad things are on the internet.

    And despite the country taking its time to formulate a notification law it can live with, one has to wonder if it has thought things through.  

    Applies to Entities Covered by the Privacy Act

    If you will, the new data breach notification law is an extension of Australia's Privacy Act, because the new legislation applies only to entities that are governed by it. That is, persons – natural or legal – that are NOT:
    • Doing less than AUD $3 million in sales p.a.,
    • A political party,
    • Part of the government.

    If not one of the above, the new law applies to you.
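
    To make the carve-outs concrete, the test boils down to a few boolean checks. Here is a minimal, hypothetical sketch (the entity attributes and the AUD $3 million threshold come from the list above; the names are made up for illustration):

    ```python
    # Hypothetical sketch of the Privacy Act carve-outs described above.
    # An entity is covered by the notification law only if no exemption applies.
    from dataclasses import dataclass

    AUD_TURNOVER_THRESHOLD = 3_000_000  # AUD $3 million in annual sales

    @dataclass
    class Entity:
        annual_turnover_aud: float
        is_political_party: bool = False
        is_government_body: bool = False

    def must_notify(entity: Entity) -> bool:
        """True if the entity falls under the new breach notification law."""
        exempt = (
            entity.annual_turnover_aud < AUD_TURNOVER_THRESHOLD
            or entity.is_political_party
            or entity.is_government_body
        )
        return not exempt

    # A startup with zero sales is exempt, no matter how many records it holds.
    print(must_notify(Entity(annual_turnover_aud=0)))  # False
    ```

    Note how the turnover test, and nothing else, decides the fate of the zero-revenue startup discussed below.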

    Now, a government excusing itself from the legal obligations it places on others is nothing new. Plenty of countries do it, although not all: in the UK, for example, the government also has to reveal its data security shortcomings, whether they involve the National Health Service, members of the judiciary, etc.

    The US does the same. The Department of Veterans Affairs is constantly embroiled in hacks and other breaches. Likewise, other US state and federal departments have gone public with data breaches.

    But then again, not all countries follow the same level of transparency. So, Australia can be excused if it feels like not leading by example. It will be in excellent company either way.  

    Turnover of $3 Million

    However, one has to take exception to the exclusion of small businesses that make less than $3 million a year. To be sure, hard-working, financially pressed mom-and-pop stores deserve a break; anyone knows that, when hacked, doing right by a data breach law can be expensive and time-consuming. Even Fortune 500 companies have problems with it, and they have money and personnel to spare for such things. (Well, not really – but they can readily find the money and personnel to take care of it.)

    But by excluding small businesses, there is a tacit implication that they couldn't be embroiled in a huge data breach, especially if they're not making much in the way of sales. What if you're a "successful" internet startup financing your venture on borrowed money? In that case, your sales figures would be $0. Your employee count could be less than twenty, which fits the definition of a small business. But your customer base could be a gazillion.

    A breach of this business's customer database would be tremendous. (For example, Instagram had 13 employees when it was acquired by Facebook and, if memory serves, had zero dollars in sales because it was still funding itself via venture capital. Monetization didn't come until later).

    Under the circumstances, the Privacy Act would not apply to would-be Australian Instagrams (Instagrammers? Instagrammies?). Shouldn't an exception be made for such a small business?

    It seems like a clause tying coverage to the number of people affected by a data breach should have been included (or kept) before the law was approved.  


    Related Articles and Sites:
    http://www.zdnet.com/article/groundhog-lazarus-twice-dead-data-breach-notification-laws-re-enter-parliament/
    https://www.itnews.com.au/news/australia-finally-has-mandatory-data-breach-nofitication-450923
    https://www.itnews.com.au/news/what-does-data-breach-notification-mean-for-you-451025

     
  • Children's Medical Center of Dallas Pays $3.2 Million To Settle HIPAA Violations

    The Children's Medical Center of Dallas (Children's) recently settled with the US Department of Health and Human Services (HHS) over multiple failures to encrypt sensitive data on mobile devices. The settlement – $3.2 million – is quite the figure, as is the timeline involved: it looks like an investigation could have started as soon as July 5, 2013, and a final resolution was not reached until February 1, 2017.  

    Multiple Failures Over the Years

    As the HHS complaint shows, Children's had a number of data security breaches over the years.
    • Nov 2009 – loss of a BlackBerry. 3,800 individuals affected.
    • Dec 2010 – loss of an iPod. 22 individuals affected.
    • Apr 2013 – loss of a laptop. 2,462 individuals affected.
    But it's not the number of data breaches that Children's has had over the years that HHS takes exception to. Rather, it's the fact that Children's knew it had a ticking time bomb on its hands and didn't do anything to rectify the situation… even as the bombs blew up time and again over the years. The need for better security was brought to Children's attention numerous times:
    • Strategic Management Systems Gap Analysis and Assessment, February 2007.
    • PwC Analysis, August 2008.
    • Office of the Inspector General, September 2012.
    You'd imagine that a major hospital that has repeatedly been advised to secure its devices (and, more specifically, the data on them) would do so as soon as possible. Instead, they waited until "at least April 9, 2013." Incidentally, that's shortly after the HHS's final Omnibus Rule became effective on March 26, 2013. As far as I can tell, Children's never had a problem after April 2013.  

    Interim Rules are Rules, Too

    Data security has always mattered under HIPAA. That almost no one really paid attention to it for nearly twenty years just goes to show how important HITECH was in forcing hospitals, clinics, and other medical practices to take it seriously.
    What really made people sit up and take notice was the 2011 fine of Massachusetts General Hospital. MGH paid $1 million to settle with the HHS over paperwork left on the subway; it affected fewer than 200 patients. And while all of this took place well before the Final Rule came into effect, monetary penalties had only recently made it into the Interim Rules. MGH served as a preview of things to come: the HHS meant business.
    And it worked. So many covered entities started looking into encryption and other data security technologies that it was like Christmas had come early for IT companies that specialized in the medical sector.
    I imagine that penalty was on the minds of Children's managers when they suddenly decided to start encrypting their data in 2013; the clock was ticking, and they didn't exactly have a stellar record when it came to not losing stuff. For their dallying, the hospital earned the fifth-largest monetary penalty since HHS started fining people.  

    Security Issues Still Going Strong

    If I were a betting man, I would say that Children's will have plenty of company going forward. Unencrypted electronic devices that store protected health information are still getting lost today. With so many options for safeguarding patient data, it boggles the mind that this is still an issue.  

     

    Related Articles and Sites:

    https://www.databreaches.net/childrens-medical-center-of-dallas-pays-3-2m-penalty-for-multiple-violations-of-security-rule/
    http://www.hipaasurvivalguide.com/hipaa-omnibus-rule.php
    https://www.hhs.gov/sites/default/files/childrens-notice-of-proposed-determination.pdf

     
  • Third Circuit Appellate Court Says “OK” To Data Breach Lawsuit

    Recently, the US Court of Appeals for the Third Circuit concluded that "the improper disclosure of one's personal data in violation of FCRA [Fair Credit Reporting Act] is a cognizable injury for Article III standing purposes."

    In other words, people can go to court over data breaches and data breaches alone; there is no need to show that you were adversely affected by events following a data breach (for example, by proving that your data was misused by hackers).

    Of course, this doesn't guarantee that an individual will win in court. However, it does mean that anyone whose personal information was stolen as part of a data breach can, at least, see the inside of a court. For the past ten years or so, most (if not all) judges ruled that plaintiffs in such lawsuits didn't have "standing" and their cases were "summarily dismissed" from court. That's a fancy way of saying that the courts booted the cases and moved on to other stuff.

    When it comes to lawsuits revolving around data breaches in which personal information was compromised, this won't be happening anymore in the Third Circuit – which covers Delaware, Pennsylvania, and New Jersey. Hopefully, other circuits will begin to see data breaches in the same light.  

    Theft of Unencrypted Laptops

    What led to this legal development? It started in 2013, with the theft of two laptops – Apple Macintoshes, to be precise. These computers contained personal and medical information. Encryption was not used, despite the fact that full disk encryption comes gratis on all Apple computers made since 2003. It is no exaggeration to say that a Mac's performance is unaffected by the use of said encryption. Plus, since these machines were already "password-protected," users wouldn't have had to jump through any additional "security hoops" to use them.

    Over 800,000 people were affected by the data breach. Oops.

    The owner of these laptops? Horizon Blue Cross Blue Shield of New Jersey, a company that's been involved in laptop-related data breaches before.

    Apparently, the only security was the cable lock that tied the laptops to their desks, and the computers' location on the eighth floor of Horizon's headquarters. Under HIPAA, this could have been perfectly adequate security.

    However, from the FCRA standpoint, it isn't. As the Third Circuit pointed out, FCRA is, at its core, a consumer privacy law. The fact that one's personal information has been transferred to persons unknown (that is, the data was easily accessible once the machines were stolen) means the company is potentially in violation of FCRA. The use of encryption, of course, could have laid this to rest three years ago, when the laptops were stolen. Instead, here we are.

    If things continue on this course, we could see a greater number of companies taking a careful look at their use of encryption, or lack thereof. Unlike federal laws and regulations like HIPAA that are limited in scope, or the patchwork of state laws that supposedly govern data security and privacy – which also fall prey to "standing" issues – FCRA affects many companies across many sectors.  

     

    Related Articles and Sites:

    http://law.justia.com/cases/federal/appellate-courts/ca3/15-2309/15-2309-2017-01-20.html
    http://www.nj.com/business/index.ssf/2013/12/horizon_bcbs_notifying_840000.html
    http://www.alertboot.com/blog/blogs/endpoint_security/archive/2013/12/11/hipaa-encryption-horizon-bcbs-of-new-jersey-data-breach-affects-840k-people.aspx
    https://www.databreaches.net/horizon-blue-cross-blue-shield-loses-round-in-data-breach-litigation/

     
  • UK Encryption: Royal & Sun Alliance Insurance Fined £150,000 For Stolen Hard Drive

    The UK's Information Commissioner's Office (ICO) has fined an insurance company, Royal & Sun Alliance (RSA), a total of £150,000 for the theft of an external storage device with information on nearly 60,000 clients (and credit card details for 20,000 people).  

    Stolen From a Locked Room

    Unlike your run-of-the-mill hard drive theft, RSA's data breach has a number of wrinkles. To begin with, the external storage device in this case was a NAS (network-attached storage) device.
    NASes are like external hard drives, but also so much more. One of their key differentiators, to the lay person, is their size: despite the modern emphasis on miniaturization, a NAS is still pretty big, all things considered. It's not unusual for one to be about as big as a Nintendo GameCube (or bigger). Due to its physical size, it's not possible to surreptitiously steal one of these babies; some thought and strategy, possibly pre-planning, is needed when stealing such a device.
    The other wrinkle is that the NAS was stored in a data server room which can only be accessed with "an access card and key," leading to the belief that staff or visiting contractors stole the NAS.
    In other words, it wouldn't have been easy to steal the device.
    And yet, as subsequent events have shown, it would not have been impossible, either. While NASes can offer file encryption, the stolen machine's data was not encrypted – either because this particular NAS didn't offer it or because someone in IT did not deem it worthwhile; excusable, some may think, since it was under lock and key.  

    Excusable?

    Well, it wasn't excusable. Far from it, as the six-figure fine shows. It's one thing for your average Joe to not encrypt his sizable storage device that he keeps locked up. A multinational insurance company, on the other hand, has responsibilities, and keeping the same data security practices as your average Joe is contemptible.
    Especially when you consider that up to 40 people were allowed unsupervised access to the room storing the NAS, or that nobody realized that the device had gone missing for over two months.
    This is exactly the type of situation where you want any sensitive data to be encrypted.  
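
    As an illustration of how little friction this adds – and this is a generic sketch, not RSA's setup or a substitute for a managed encryption product – encrypting a file before it ever lands on shared storage takes only a few lines. The example below uses the third-party Python cryptography package and made-up file names:

    ```python
    # Minimal sketch: encrypt a file before copying it onto shared storage.
    # Requires the third-party `cryptography` package (pip install cryptography).
    # Key management is the hard part in practice and is glossed over here.
    from cryptography.fernet import Fernet

    def encrypt_file(src_path: str, dst_path: str, key: bytes) -> None:
        """Write an encrypted copy of src_path to dst_path."""
        fernet = Fernet(key)
        with open(src_path, "rb") as src:
            ciphertext = fernet.encrypt(src.read())
        with open(dst_path, "wb") as dst:
            dst.write(ciphertext)

    key = Fernet.generate_key()  # store this somewhere safe - not on the NAS itself
    encrypt_file("clients.csv", "clients.csv.enc", key)  # hypothetical file names
    ```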

    Giving a Break Where They Shouldn't

    Only the ICO knows how the fine's final amount was calculated. However, they note under "mitigating features" that the "personal data held on the device was not easily accessible."
    There must be some confusion here, since the lack of encryption makes access to the data quite easy. It's true that you probably can't just access the information directly from another computer; however, a simple Google search will turn up more than enough helpful links for getting at the data – instructions your average middle-schooler could follow while half-asleep.
    Imagine what staff or contractors who were given access to a data server room – literally a room that techie types go into – could do with an internet connection and a few keystrokes.  

     

    Related Articles and Sites:
    https://ico.org.uk/action-weve-taken/enforcement/royal-sun-alliance-insurance-plc/
    https://ico.org.uk/media/action-weve-taken/mpns/1625635/mpn-royal-sun-alliance-20170110.pdf
    https://www.databreaches.net/uk-150000-fine-for-insurance-company-that-failed-to-keep-customers-information-safe/

     
  • Netherlands Officially Files 5,500 Breach Notifications In 2016

    The Personal Data Protection Authority of the Netherlands (Autoriteit Persoonsgegevens, "AP") revealed last week that they received nearly 5,500 data breach notifications in 2016, the first year of mandatory data breach notifications for the European country.
    This contrasts with the 980 data breaches in the same period for the US, compiled by the Identity Theft Resource Center (ITRC), which is not government-affiliated. When you consider that the US has somewhere around 320 million people vs. the Netherlands's 17 million, something feels very, very wrong here.
    I can think of two possible ways to interpret the situation:
    1. The Dutch are just terrible at data security. This seems unlikely. It is the US, after all, that holds various records when it comes to data breaches. Last year, for example, Yahoo was crowned with the largest data breach in recorded history.
    2. The US data is severely undercounted. Most probably the reason for the seeming anomaly.
    The latter is supported by the data breach reporting environment in the US.
    To begin with, the US does not have a central authority in charge of data protection. There is no federal law addressing it, although a number of federal agencies do dictate data security in their respective areas; e.g., medical entities and their contractors follow the Department of Health and Human Services requirements regarding data security and breach notifications.
    At the same time, states have their own laws governing data breach reports, including what is and isn't classified as a breach. And each body that oversees such reports has its own policies on whether a data breach should be made public. Some make it easy to find online; others, not so much.  

    Running Numbers

    The 5,500 reported breaches translate to one data breach per 3,090 Dutch citizens. For the US, the 980 translates to one per 326,000 people. That's a ratio of 105 to 1.
    Granted, this is not the best way to represent the figures since it's legal entities that have the duty to report data breaches. A search in Wolfram Alpha shows that the total number of registered businesses in the Netherlands and the USA were, respectively, 1.03 million and 5.156 million.
    This brings the numbers down to one data breach per 187 Dutch businesses, and one per 5,261 American businesses. The ratio is now 28 to 1 – a considerable reduction, but still very large. Some of the difference could be attributed to the stronger regulations governing data security in Europe: stricter laws, with a propensity to err on the side of caution (read: privacy), mean that the Dutch would see a data breach where Americans don't. It could also be that the Dutch are more forthcoming with such things because their legal environment is not as litigation-happy.
    No matter how you slice it, however, one thing is certain: 980 breaches reported in the US seems comically low. If we were to assume that the US is affected by data breaches at roughly the same rate as the Netherlands, with a similar rate of notification to the authorities, then one would expect around 27,500 data breaches in 2016.
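
    For the record, the back-of-the-envelope arithmetic behind these figures works out as follows (population and business counts are the approximate numbers cited above):

    ```python
    # Back-of-the-envelope check of the per-capita and per-business ratios above.
    nl_breaches, us_breaches = 5_500, 980
    nl_population, us_population = 17_000_000, 320_000_000
    nl_businesses, us_businesses = 1_030_000, 5_156_000

    print(nl_population / nl_breaches)   # ~3,090 people per reported breach (NL)
    print(us_population / us_breaches)   # ~326,000 people per reported breach (US)
    print((us_population / us_breaches) / (nl_population / nl_breaches))   # ~105x

    print(nl_businesses / nl_breaches)   # ~187 businesses per reported breach (NL)
    print(us_businesses / us_breaches)   # ~5,261 businesses per reported breach (US)
    print((us_businesses / us_breaches) / (nl_businesses / nl_breaches))   # ~28x

    # If US businesses reported at the Dutch per-business rate:
    print(us_businesses / (nl_businesses / nl_breaches))   # ~27,500 expected breaches
    ```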
    At the end of the day, all the signs point to this: in the US, we don't have a good idea of how big or bad the problem is. The best we're willing to do, apparently, is rig the system so that we lowball the number to the point that it's not realistic.
    That's a real problem, because who would feel the need to marshal resources when the problem appears to be so small?  

     

    Related Articles and Sites:

    http://blogs.dlapiper.com/privacymatters/the-netherlands-almost-5500-data-breaches-notified-in-2016-2/
    https://www.databreaches.net/the-netherlands-almost-5500-data-breaches-notified-in-2016/
    http://www.idtheftcenter.org/images/breach/ITRCBreachStatsReportSummary2016.pdf

     