

AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

October 2018 - Posts

  • Anthem, Yahoo To Shell Out Additional Money Over Data Breaches

    This week saw additional headaches for two US companies involved in major data breaches (we're talking top ten in US history to date). Yahoo, now a part of Verizon, has agreed to settle a lawsuit for $50 million. In addition, Anthem, Inc. – the Indiana-based BlueCross BlueShield insurance company – has agreed to settle HIPAA violations by paying a $16 million monetary penalty to the US Department of Health and Human Services (HHS).
    Earlier this year, Yahoo's "other arm" – now known as Altaba, a separate entity from Verizon – settled with the SEC for $35 million. Likewise, just a couple of months ago, Anthem settled a lawsuit for $115 million.
    The final tally so far for the Yahoo breach: $85 million in settlements, over $30 million in lawyer fees (for the plaintiffs), and a $350 million haircut when Verizon acquired the company. That's a total of $465+ million.
    For Anthem: a total of $165 million.
    And let's not forget that these figures do not include what each company paid for their own defense (the numbers certainly must be in the millions).
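    For the record, the Yahoo arithmetic above works out as follows. This is just an illustrative Python sketch summing the publicly reported figures already cited; the amounts are rounded, and the lawyer-fee figure is a lower bound ("over $30 million"):

    ```python
    # Publicly reported Yahoo breach costs cited above, in millions of USD
    yahoo_costs = {
        "consumer class-action settlement": 50,
        "Altaba settlement with the SEC": 35,
        "plaintiffs' lawyer fees (lower bound)": 30,
        "Verizon purchase-price reduction": 350,
    }

    # The two legal settlements alone
    settlements = (yahoo_costs["consumer class-action settlement"]
                   + yahoo_costs["Altaba settlement with the SEC"])

    # Everything together
    total = sum(yahoo_costs.values())

    print(f"Settlements alone: ${settlements} million")  # $85 million
    print(f"Running total: ${total}+ million")           # $465+ million
    ```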
    Conclusion: data breaches now suck for both breachees and breachers. It wasn't always like that.

    Historical Inflection Point?

    Ten years ago, a lawsuit centered on a data breach would have been tossed from court. Today, that hardly seems to be the case, although exceptions do exist, like Equifax. (Still, it's been only a little over a year since that particular data breach. Yahoo's and Anthem's travails took years to resolve, and with Equifax's breach ranking among the top five information security incidents of all time, it's still too early to tell whether the credit-reporting agency will join the two companies' dubious circle of honor.)
    It may, perhaps, be too early to declare that the days of conveniently ignoring data security, in the belief that there will be little to no blowback when it happens, are really over. Still, there are many signs that this is a watershed year, including:
    • People are leaving social media platforms or decreasing their use, mostly due to privacy and data security concerns.
    • Over the course of ten years, pretty much everyone has been affected by a data breach, and chances are that everyone knows someone who has been affected quite negatively – even judges who, in the past, couldn't see what the big deal was. Nothing like hitting close to home to understand what's what.
    • Greater and greater fines are being imposed for data breaches, a direct result of continuing and ever-expanding information security incidents.
    • This year, the EU passed some of the strongest privacy laws yet (the General Data Protection Regulation, or GDPR).



  • Google and Google+: Data Breach or Not?

    This week's revelation that Google covered up a data breach connected to Google+, the little-used Facebook competitor, has spilled a lot of digital ink. Unsurprisingly, most of it is unsympathetic to Google. One exception was an article noting that "the breach that killed Google+ wasn't a breach at all."
    And, on the face of it, that's true. As far as Google knows, the "data breach" (in reality, a bug that could have allowed a data breach) was never exploited. Its logs show nothing. And, to make use of this exploit, a person had to request permission for access to an API (Application Programming Interface), which only 432 people did. In the end, Google estimates that 500,000 people could have been affected… if there had been an actual data breach. We're talking theoretical potential here, not post facto possibility.
    And the most damning indication that this is not a data breach is the fact that the data that could have been exposed via the API bug wouldn't actually have triggered a breach notification. If you look through the list of data that could have been compromised, you'll see that it wouldn't qualify as sensitive or personal information under US data breach notification laws. Full names, email addresses, a profile pic? Unauthorized access to these does not merit a notification under any of the 50 US state laws dealing with data breaches.
    Again, on the face of it, it looks like no big deal. If you dig into the details, however, you'll see some problems. First off, Google can't really know what happened, because it keeps only two weeks' worth of logs; the bug, on the other hand, went unfixed for over two years. Who knows what happened prior to the patch, outside of that two-week window?
    Second, the fact that fewer than 450 people applied for the API is little comfort when you realize that the Facebook Cambridge Analytica situation required only one renegade API user. (Perhaps we could take some comfort in "only" 500,000 people potentially being affected, but we don't really know where that figure came from. Is it based on the severely curtailed log data? On the total connections that the API requesters currently have? What if they dropped connections over the years, thus depressing the figure?)
    Still, despite the above, it looks like this data breach is not really a data breach. Facebook said the same self-serving thing about the Cambridge Analytica situation… but, in that case, data actually was exploited in an unauthorized manner.

    EU is not US

    As noted above, the data accessible via the bug is not covered under data breach laws, at least not in the US. The US does not have an all-encompassing federal law; it's all done at the state level. And it was only earlier this year that the final, 50th state succumbed to the times and passed a data breach notification law (thank you, Alabama). Under these laws, what counts as a "reportable" data breach is strictly defined. When you look at the data at the center of this incident, it obviously doesn't pass the "personal information" test: without more "substantial" information like SSNs, driver's license numbers, financial data, etc., Google is in the clear.
    The EU, in comparison, has stronger privacy laws, but the bug was found before those laws were strengthened this year. Still, a case could be made that the potential breach required public notification even within the framework of the older European rules. For example, the UK (before Brexit) gave this example of what constitutes "personal data" under the EU's Data Protection Directive:
    Information may be recorded about the operation of a piece of machinery (say, a biscuit-making machine). If the information is recorded to monitor the efficiency of the machine, it is unlikely to be personal data…. However, if the information is recorded to monitor the productivity of the employee who operates the machine (and his annual bonus depends on achieving a certain level of productivity), the information about the operation of the machine will be personal data about the individual employee who operates it. [section 7.2, personal_data_flowchart_v1_with_preface001.pdf]
    As you can see, within the EU there is a gray area as to what personal data is. It could be that Google is not out of the woods yet, legally speaking.

    Bugs Are Identified and Fixed All the Time

    As the article notes,
    There is a real case against disclosing this kind of bug, although it’s not quite as convincing in retrospect. All systems have vulnerabilities, so the only good security strategy is to be constantly finding and fixing them. As a result, the most secure software will be the one that’s discovering and patching the most bugs, even if that might seem counterintuitive from the outside. Requiring companies to publicly report each bug could be a perverse incentive, punishing the products that do the most to protect their users.
    Quite an accurate point. In addition, it should be noted that this literally is a computer bug and nothing more because it was discovered in-house. If a third party had found the security oversight and reported it to Google, it would have been a data breach: that person, as an unauthorized party, would have had to illegally access the data to identify the bug as such.
    In this particular case… well, you tasked someone with finding bugs, and that person found one. That's not a data breach. That's a company doing things right.

    Hush, Hush. Sub rosa. Mum's the Word

    But then, why the secrecy surrounding it? Supposedly, there was an internal debate over whether to go public with the bug, a debate that included references to Cambridge Analytica. If Google had been in the clear, the discussion would have been unnecessary.
    Or would it? With Facebook's Cambridge Analytica fiasco dominating the headlines at the time, Google couldn't have relished the idea of announcing an incident that, in theory, closely mirrored Facebook's – but led to a different data security outcome. Thus, it is unsurprising that the bug, and the debate surrounding it, was kept quiet. (It was a cover-up, some say. But again, Google didn't technically have a data breach; it truly was the company's prerogative whether to go public.)
    Now that the world knows of it, though, the result has been the same: a global scandal; governments in the EU and US looking into the situation; another tech giant's propriety and priorities questioned (arguably, a tech giant that was held in higher esteem than Facebook); and the alienation and angering of its user base.
    Going public with the bug could be seen as a "damned if you do, damned if you don't" sort of situation. But, when you consider what Google's been up to lately, you've got to wonder what is really driving the company in Mountain View.