AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.


AlertBoot Endpoint Security

  • Florida Government Hard Drives Stolen For Games

    Many, if not most, data security professionals will tell you that you should run a risk assessment and accordingly develop your plans for securing information, sensitive or otherwise. Then there are others who will counsel that one should secure as much as possible: obviously protect what represents a high risk situation, but never discount the possibilities of what seems like a low or no-risk situation blowing up beyond expectations.
    The idea is that the legal, financial, and public relations fallout of a data breach is comparable regardless of the initial risk classification.
    For example, one might recommend that disk encryption be deployed to all computers – not just laptops – because it's never guaranteed that a desktop computer won't be leaving a business's premises in an unapproved manner. There is history to back this up: burglaries; theft or loss of computers that have fallen into disuse and put into "temporary" storage; computers where information was inadequately scrubbed (or not at all) before being retired; etc., have been reported in the media over the years.
    The motives behind such data breaches are as varied as the data breaches themselves. Some people may be after the data. Others may want to replace their aging computer back home. Yet others may be looking to flip the hardware on craigslist. And, of course, there is the time-honored, ever-surging, never-can-kill-it "oops" situation.
    And then you have the guy who really, really wants to play Xbox.  

    Custodian Swipes Hard Drives

    According to news reports, a Florida man was arrested for stealing hard drives from the Florida Department of Revenue. The hard drives contained taxpayer information, and their disappearance, needless to say, triggered a data breach.
    A swift investigation led to one Andru Reed, a 21-year-old who eventually admitted to the theft. Reed confessed that he had stolen four hard drives so that he could connect them to his Xbox and download video games. Law enforcement is still conducting data forensics to sniff out any conflicts in Reed's story, but they're pretty certain that the taxpayer data was never accessed.
    How was Reed able to steal these hard drives? Pretty easily. He was a custodian who was working the premises. So, he didn't have to "Mission Impossible" himself into the offices. He just walked in. Furthermore, the four hard drives in question were external hard drives. All he had to do was pick them up when no one was watching, which he apparently bungled: when the police began investigating, employees in the office mentioned having seen Reed acting suspiciously.

    Encrypted or Not?

    As noted before, there are competing schools of thought when it comes to data security. The loss or theft of external hard drives can be deemed a low data risk situation if they never leave a secure area.
    That's a big if, though. Security breaches where employees or outside contractors purposefully steal sensitive data, usually to sell to legal and illegal data brokers, are not unusual. So, did the Florida Department of Revenue (FDR) encrypt these hard drives or not?
    We don't know. The March 27 statement from the FDR is pretty nebulous:
    At this time, we are taking all necessary precautions to review the established physical and digital internal security procedures to ensure uniform implementation across the Department. If after the full investigation it is found that any employee did not take the proper steps to protect taxpayer information they will be held accountable.
    And then on April 17:
    Through the details presented, we are confident that the information on the drives was not accessed. As a result of the Department of Revenue’s thorough processes and procedures to monitor and maintain equipment, we were able to rapidly identify and report the property missing.
    Florida, like most US states, has a data breach notification law. It states that notifications to individuals must be made no later than 30 days after the breach has been identified. If encryption was not used on any of the four storage devices, it will be known before the month is over. (That the drives were recovered does not negate that a breach took place).
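    For a sense of the timeline, the 30-day clock can be sketched in a few lines of Python. Treating the FDR's March 27 statement as the date the breach was identified is an assumption; the statute's actual trigger date may be earlier.

```python
from datetime import date, timedelta

# Assumption: the breach was "identified" on the date of the FDR's
# March 27 statement; the statutory trigger date may differ.
identified = date(2018, 3, 27)

# Florida's breach notification law: notify affected individuals no
# later than 30 days after the breach is identified.
deadline = identified + timedelta(days=30)
print(deadline)  # 2018-04-26
```

    In other words, if the drives were unencrypted, notification letters would have to go out by late April.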
    For the time being, and purely as speculation, all signs point toward encryption not being used: the public announcement, which is required by law; the weeks-long digital forensics (which shouldn't take that long if encryption were in place); and the absence of the word "encryption" from any materials covering the case (it's usually mentioned when present).
    On the other hand, the words "data breach" are not linked to this situation in any form whatsoever. The fact that the theft and the recovery have been dealt with by the media without alluding to a data breach is unusual, and reason enough to wonder whether the external hard drives were secured correctly after all.  
  • Panera Data Breach: Further Proof That People Need Strong Data Security Laws

    Panera Bread has a public relations fiasco on its hands. It has embroiled itself in one of the most tragicomic data breaches the world has seen in a while, a breach that could have been easily avoided.

    Dylan Houlihan, the finder and eventual whistleblower of the security issue, has created a post providing the authoritative breakdown of what happened and when. But, the story can be summarized thus:

    • Dylan finds a security issue – the leaking of customers' personal information – at Panera's website and contacts the company.
    • Panera eventually acknowledges the issue and promises a fix.
    • Eight months later, with no fix, Dylan reaches out to someone who can effect change via public pressure.
    • Panera fixes the problem within hours of being contacted by Brian Krebs, the security blogger behind KrebsOnSecurity.
    • Further poking around shows that Panera didn't really fix anything. Furthermore, the poking around shows that the same problem exists in various places across Panera's online presence.
    • This finding blows holes into Panera's public announcement that they "take data security issues seriously."
    • Panera takes down their entire online presence, which is still down 48 hours after the entire fiasco first made news.

    Of course, one of the bigger questions is, if Panera was really able to fix the thing in two hours, what was it doing dragging its butt for eight months? In hindsight, it's obvious that they couldn't and didn't. And, seeing how Panera's homepage is showing essentially a 404 page 48 hours after the story broke, we can strongly presume that they still don't have a handle on the problem.

    Which further leads one to believe that they didn't spend the past eight months trying to fix the problem at all.  

    What Does the Panera Bread Fiasco Show Us?

    Panera's actions are, unfortunately, not an exception to the rule. Certainly, there are plenty of companies that have tried to do right by their customers, either because they feel it's their duty or because it's the law, or some combination of the two.

    Then we have the companies like Equifax, Yahoo, Facebook, and others that offer some canned words about taking data security seriously…but an investigation shows otherwise. Panera looks like it might be joining this disgraced group. (While Facebook is promising change – and by the looks of recent events, they may mean it – it's still fair to lump them in this category because the internet giant has a history).

    The fact that some of the biggest, most powerful companies in the US (possibly in the world) are acting in this manner proves that the US needs strong data privacy laws. Now, some may point out that we only get to hear of the companies that failed in securing their data; thus, it makes it "seem" as if most companies are not doing anything to secure data, but that's far from the case.

    However, that the companies with the money to do something are caught being cavalier about data security issues can only give weight to the thought that those with less money are probably doing even less security-wise… or, at least, not the most they could be doing. And even if this is not the case, it wouldn't be wrong to assume that current data security laws had a strong hand in ensuring, shall we say, not-so-reprehensible responses.

    Like fixing obvious data security problems, going public with the breach, offering credit monitoring – all things that are codified in state laws (although not all states have identical laws).

    Don't hold your breath on those stronger data security laws, though.

    Recently, thirty-two state Attorneys General sent a letter to Congress noting that the proposed federal "Data Acquisition and Technology Accountability and Security Act" would preempt stronger state-level privacy laws. And, as they point out, this would essentially give companies like Equifax a slap on the wrist if they experience data breaches.




  • Australia's Notifiable Data Breaches Law Nets 31 Reports In 3 Weeks

    A new Australian law appears to be succeeding in finally unveiling the current state of data breaches in the Land Down Under. According to a release by the country's information commissioner's office (the OAIC), thirty-one data breaches were reported to the government since the law took effect on February 22, 2018.

    Notifiable Data Breach

    Australia's Notifiable Data Breach (NDB) scheme, which makes mandatory the reporting of data breaches to the government, was a long time coming. While Australia already had data breach laws before the NDB, going public with a breach was a voluntary act. Obviously, that was never going to work. And the numbers prove it: in 2010, only 56 data breaches were reported. That roughly doubled by 2014, when 104 data breaches were reported.
    Obviously, these are comically tiny numbers when you consider that some 24 million people live in Australia, and that more than 2 million companies were registered with the government in 2014 – 2015. The numbers would suggest that either (a) Australia's businesses are unusually top-notch when it comes to data security or (b) data theft and loss incidents were seriously underreported.
    Even today, with the latest revelation, a person tracking such incidents may feel that the numbers are a little low, possibly due to the public not being aware of their responsibilities under the new law, or because the OAIC has yet to show that it's willing to extend serious repercussions to non-abiders of the NDB.
    The USA, with its disparate set of laws and regulations regarding data breach notifications, has shown the effects of voluntary vs. mandatory, enforced vs. unenforced notifications. Such laws began to surface with California 15 years ago, and other states have passed their own versions, unwittingly leading to an experiment on what is effective. The conclusion: even when the law mandates going public with a data breach, many companies will not do so if repercussions for not doing it fail to really materialize.
    In addition, HIPAA / HITECH regulations covering the US medical sector showed that the fastest and surest way of ensuring that companies take notice of privacy and data security laws is to penalize companies, in monetary form, and publicize it.

    Turnover of $3 Million, A Couple of Conditions

    According to reports, the new law applies to companies that have an annual turnover (aka total sales or total revenue) of $3 million or more, with certain exceptions, like APP entities that trade in personal information. In addition, the data breach to be reported must involve:
    • Unauthorised access to or disclosure of personal information that could be used to harm an individual; and
    • Risk of unauthorised access or disclosure, in which case the information has been lost and is in danger of being used to harm an individual
    The same article quotes an expert who says that the new law may not really affect the behavior of businesses, seeing how:
    the Australian laws are still "less stringent and the penalties less severe than similar regimes in other jurisdictions".
    Considering how the law is worded, it's hard not to agree. For example, what does it mean that the data breach could "harm an individual"? There's too much room for interpretation there, even if the OAIC notes that "objective assessment… from the viewpoint of a reasonable person" should be used in making the determination.
    Thankfully, at least it's pointed out that the use of strong encryption provides safe harbor from the NDB, as encrypted data is safe from unauthorized access. Indeed, multiple examples are provided where the NDB exception is directly tied to encryption, underscoring the importance of encryption in safekeeping personal and private data.
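    The mechanics behind that safe harbor are easy to demonstrate. The toy sketch below is for illustration only; real full disk encryption uses vetted ciphers such as AES-XTS, never a hand-rolled SHA-256 keystream. It shows why an encrypted record is useless to a thief who holds the drive but not the key.

```python
import hashlib
from itertools import count

def keystream(key: bytes, nbytes: int) -> bytes:
    # Toy counter-mode keystream: SHA-256(key || counter). Illustration
    # only; real disk encryption uses vetted ciphers such as AES-XTS.
    out = b""
    for i in count():
        if len(out) >= nbytes:
            break
        out += hashlib.sha256(key + i.to_bytes(8, "big")).digest()
    return out[:nbytes]

def xor_bytes(data: bytes, ks: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, ks))

record = b"SSN: 123-45-6789"              # hypothetical taxpayer record
key = b"a-secret-key-kept-off-the-drive"  # hypothetical key

ciphertext = xor_bytes(record, keystream(key, len(record)))
assert ciphertext != record               # the thief sees opaque bytes
plaintext = xor_bytes(ciphertext, keystream(key, len(ciphertext)))
assert plaintext == record                # the keyholder recovers the data
```

    Without the key, the attacker holds bytes indistinguishable from noise, which is precisely the condition the NDB's encryption exception turns on.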
  • Fresno State Hard Drive Stolen, 15,000 Affected

    At least 15,000 California State University, Fresno "student athletes, sports-camp attendees, and Athletic Corporation employees" were affected by a data breach earlier in the year, according to news sites. A hard drive, 18 laptops, and other items were reported missing on January 12 from the university's North Gym building. On the face of it, it seems that the device was not specifically targeted in a theft which, based on the university's 2017 – 2018 academic calendar, appears to have been timed to coincide with Fresno's winter break.  

    Data From 2003 Onwards

    In Fresno State's public data breach notification, the university notes that only 300 of the affected are "currently affiliated with the University," implying that most of the breached data involves former students, laypeople, and employees.
    The breached information includes:
    some personal information, including names, addresses, phone numbers, dates of birth, full or last four digits of Social Security numbers, credit card numbers, driver’s license numbers, passport numbers, user names and passwords, health-insurance numbers and personal health information.
    Considering the type of information that was being held – and how far back it went: 15 years – it's hard to understand why this external drive, which was used as a backup device, was not protected with encryption. Why wasn't it?
    Possibly, the (roundabout) answer lies in the 18 laptops that are not mentioned in Fresno's notification. Why are the laptops not mentioned if they were also stolen at the same time as the external drive? One possibility is that none of them held any personal, sensitive data.
    The more probable explanation is that these laptops were encrypted, obviating their inclusion in the breach notification. Maintaining this train of thought, it's probable that Fresno is dealing with an employee's wayward data security practices. Of course, it could also be that the university's IT department fumbled: if you've got hundreds of devices to secure, an odd hard drive or two could very well slip through the cracks and remain unprotected.  

    Cold Comfort

    Fresno, like many entities that report on data breaches, noted that they had:
    not received any reports that any of the stolen information has been accessed or misused in any way, and there is no reason to believe that the hard drive was stolen for the information it contained.
    Lawyers should stop their clients from adding the above language to breach notifications. It's embarrassing. The problem with it (aside from the fact that it's about as believable as "we're sending this notification out of an abundance of caution"; everyone knows you're sending it because it would be literally illegal not to) is that it is meaningless.
    In this day and age, people know that the data contained in devices can be more valuable than the hardware itself, and you can bet that people who steal computers are even more likely to be aware of this fact. So, not getting any signals that the stolen information was accessed... means squat.
    In addition, there's this implication that the information was not and will not be accessed because the hard drive wasn't stolen for the information. How faulty is that logic? Let us assume that some guy boosts a car because he's going to sell it to a chop shop. Are you telling me that he's not going to take a peek in the glove compartment or the trunk because he stole the car for its hardware, and not its contents? Possibly lift the armrest to check the center console? Steal the quarters in the ashtray?
    Having your personal details stolen is terrible. Receiving a breach notification letter is terrible. Ham-fisted attempts at PR are vexing and insulting. It wouldn't be surprising to find that such language backfires on its intent: rather than comforting people, it encourages them to file lawsuits.
  • HIPAA Breach Results In Lawsuit And Countersuit Between Aetna and KCC

    Reuters reported earlier this month that Aetna, the health insurance company, and Kurtzman Carson Consultants (KCC), an administrative-support services provider, have sued each other over a mishandled class action settlement notification.
    Last year, Aetna settled a number of lawsuits regarding the fulfillment of HIV medication prescriptions. With the legal issues finalized, it was up to KCC to mail the settlement notifications and finally close the book on the situation. Unfortunately, the notifications were sent using envelopes with transparent windows, the ones where names and addresses show through. But in this case, there was a little more:
    Most of the first sentence of the notification was also displayed - including the words "when filling prescriptions for HIV medications."
    That's as private as private information can get. Naturally, Aetna was sued for breach of patient privacy, which the company quickly settled. In turn, the company sued KCC "to indemnify…the entire cost of the notification disaster," or nearly $20 million. Aetna claims that they didn't know that envelopes with transparent windows would be used, that private information would be showing, etc.
    Basically, it wasn't Aetna's fault.
    KCC, however, has countersued, stating that "Aetna and Gibson Dunn [the insurance company's legal representation] knew what the notifications would look like" and, allegedly, approved it prior to mailing out the settlement notifications.
    Obviously, someone has to be lying. The calamities don't end there, however.  

    No Encryption in this Day and Age?

    KCC has also averred in their suit that,
    When Aetna’s lawyers passed along the list of health plan members to be notified about the HIV prescription policy, there was no protective order in place. Nor did Gibson Dunn encrypt all of the data it sent to [KCC].
    In fact, KCC states that the private health information (PHI) was neither encrypted nor password-protected (not that password protection would do any good; it's certainly not a HIPAA-compliant PHI security measure). And, they further claim that "more data than was necessary to perform the noticed function" was sent to them… which is not necessarily forbidden under HIPAA but is definitely frowned upon. In fact, it might be one of those red flags that spark an investigation by the Department of Health and Human Services (HHS).
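    The gap between password protection and encryption is worth spelling out. In the hypothetical sketch below (names and parameters are illustrative, not drawn from the case), a password gate leaves the stored bytes readable to anyone who bypasses the application, whereas encryption ties readability to key material, here derived from a passphrase with PBKDF2.

```python
import hashlib
import os

# "Password protection" is usually an application-level gate: the bytes
# on disk stay plaintext, so reading the drive directly bypasses it.
stored_record = b"Patient: J. Doe, HIV prescription"  # hypothetical PHI

def app_gate(password: str) -> bytes:
    # A thief who images the disk never has to call this function.
    if password != "letmein":
        raise PermissionError("wrong password")
    return stored_record

# Encryption, by contrast, makes the stored bytes useless without key
# material. PBKDF2 stretches a passphrase into a key (parameters are
# illustrative, not a compliance recommendation).
salt = os.urandom(16)
key = hashlib.pbkdf2_hmac("sha256", b"a long passphrase", salt, 200_000)
assert len(key) == 32  # a 256-bit key for use with, e.g., AES
```

    This is why HIPAA treats encryption, not password protection, as the measure that matters when a device goes missing.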
    On the other hand, passing sensitive patient data around without encryption? We all know how the HHS feels about that one. The Reuters article summed up what's at stake for Aetna and KCC in this manner:
    For both Aetna and KCC, as you can see from the dueling complaints, responsibility for the botched settlement notifications is really an existential question. As a health insurer, Aetna has a moral and legal obligation to protect patient privacy. As a claims administrator, KCC is supposed to know – of all things! – how to mail out a settlement notification without violating recipients' privacy.
    The above is insightful and yet misses a number of observations.
    It should be noted that KCC received the data from Aetna's lawyers. So, if KCC's allegations are true, then Aetna has another business associate that's not paying attention to HIPAA/HITECH requirements. And, what's true for KCC – that they should know how to properly mail out notifications because it's their job – can also be said of lawyers sharing sensitive data that belongs to a HIPAA covered-entity. After all, the law has specifics on how PHI should be handled by business associates of HIPAA covered-entities. Business associates such as lawyers, who, by virtue of their profession and their client, should know not to pass around PHI unencrypted.
    Also, the allegations open up another can of worms for Aetna, seeing how it now has two business associates that have contravened HIPAA/HITECH data security rules in less than one year. It can take very little for the HHS to open an investigation into data security violations. Having three HIPAA incidents in a one-year period must certainly attract attention, and KCC's allegations give the HHS a reason to dig deeper into Aetna's adherence to HIPAA privacy and security rules.



  • HIPAA Security Trickle-down? Notifications State Sensitive Information Not Contained In Stolen Devices

    According to news reports, two medical entities recently alerted patients of a data breach: Eastern Maine Medical Center (EMMC) and Nevro Corporation.
    In the case of EMMC, an external hard drive went missing. For Nevro, a number of laptops were stolen during a break-in. Information contained in these devices was not protected with data encryption in either case, but then again, "sensitive information" was not stored on any of the devices involved.
    While the lack of encryption seems reasonable at first glance, the truth is that a number of HIPAA / HITECH regulations were probably broken.  

    Eastern Maine Medical Center

    In the case of EMMC, the data breach was triggered when a third-party vendor's hard disk drive disappeared. Bangor Daily News reports that the "missing hard drive contains information on 660 of the patients who underwent cardiac ablation between Jan. 3, 2011 and Dec. 11, 2017."
    The missing drive was last seen on December 19. Reportedly, the storage device contained:
    Patients' names, dates of birth, dates of their care, medical record numbers, one-word descriptions of their medical condition and images of their ablation… [but NOT] Social Security numbers, addresses and financial information.
    On the face of it, it looks like the data breach could be classified by most people as "small potatoes."  

    Nevro Corporation

    Unlike EMMC, Nevro was responsible for its data breach. And yet, the company cannot be strongly faulted for the data mishap: it's not as if the laptops were in an unsafe location (like an employee's car). The laptops were at the company's headquarters, which one assumes was reasonably secure against break-ins.
    Per Nevro's breach notification letter, "nearby business were also targeted" and laptops were stolen from them as well, so chances are that Nevro had comparable security in place. (Either that or most businesses in the area decided to dispense with security, a dubious assumption).
    The company noted that all of the stolen laptops were password-protected "although not all were encrypted." Yet, the silver-lining is that "limited categories of information" were stored on these devices and that none of them "contained, sensitive identifying information such as Social Security or other government-issued identification numbers or credit card or financial institution information."
    The "limited information" pertains to names, addresses, and other information similar to that listed by EMMC. Indeed, Nevro seemingly implies that it's only notifying affected patients because
    applicable state law considers this type of information [limited information about your treatment relationship with Nevro] sufficient to warrant a notification.
    Again, most people would look at this as small potatoes (especially when you take into consideration what Equifax admitted to last September. That was definitely not small potatoes; heck, it went well beyond the tuber family).
    As pointed out in previous posts, such "not sensitive" information can still be used to carry out fraud and scams. Tech support scams, for example, are successful even though there is very little personal data involved. Can you imagine how much more convincing a phone scam would be if someone called a person about his or her cardiac ablation?
    That being said, there is a remote possibility of it happening. In contrast, the malicious use of SSNs and other information generally considered to be "sensitive" is more than possible. So, the lack of what most people would deem "sensitive personal information" should come as something of a relief to patients.  

    Could Still Be a HIPAA Breach

    It may not be, however, a relief for the two organizations. A cursory search on the internet seems to indicate that both fall under the purview of the Health Insurance Portability and Accountability Act (HIPAA). HIPAA has very strict definitions of what is and is not PHI (protected health information).
    As this link shows, names, physical addresses, telephone and fax numbers, email addresses, etc. are considered to be PHI if combined with certain information, such as what medical treatment one was receiving. So, technically, it looks like the two organizations have full-blown medical data breaches on their hands.
    It goes without saying that the use of full disk encryption would have paid off wonderful dividends in both cases because HIPAA provides safe harbor if data is encrypted when lost or stolen. That not being the case, what will be the fallout?
    HIPAA / HITECH data security compliance is administered and overseen by the Office of Civil Rights (OCR) of the Department of Health and Human Services. The OCR has not been shy in dispensing monetary penalties, sometimes in the millions of dollars.
    And, as befitting such large sums, it often takes years to reach a decision on how to deal with HIPAA covered-entities that have suffered a data breach.