

AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

April 2013 - Posts

  • Data Backup Encryption: Kmart (Inadvertently) Suffers Data Breach At Gun Point

    Do you back up your data?  Excellent!  Do you use encryption software to protect its contents?  If you don't, you've joined the "Data Breach Club," where a data breach is not a matter of "if" but "when."  Take Kmart as an example: it had a data breach because a thief robbed one of its stores at gunpoint.

    Nobody Expects their Data Backup to be Stolen

    When I first heard that Kmart had to publicize a data breach because of HIPAA regulations, it hit me like a bag of surrealistic bricks (Kmart and HIPAA/HITECH?).  But then I remembered that many Kmart locations also include a pharmacy.  The story, as reported, goes as follows:

    On March 17, an armed robbery took place at a Little Rock, Arkansas Kmart.  The assault happened about an hour after closing time; the perpetrator pointed a gun at the assistant store manager and forced him to open the store safe.  The thief wiped it clean, taking $6,000 in cash and a backup disk.

    The backup disk contained "full names, addresses, dates of birth, prescription numbers, prescribers, insurance cardholder IDs and drug names for some 788 customers" and, in certain cases, SSNs as well (more than a few cases, in fact: the spokesperson put it at a "few hundred customers").

    It was expressly pointed out that disk encryption was not used, nor its enfeebled cousin, password-protection.

    Aside from the obvious mistakes, the spokesperson made two additional observations: (1) that the chance of anyone accessing the customers' information "is slim to none, because you would need to know what software package" was used, and (2) that they were quick in contacting customers, doing so in about a month as opposed to the 60 days they're given.

    Data Breach Possibility, Slim to None: Only If You Used Encryption

    The claim that the chance of accessing customers' information is slim to none is debatable at best.  It is slim to none because, odds are, the thief is not going to look.  Generally, when a laptop gets stolen, it's wiped and reformatted for sale (at least, that's the reigning consensus).  One assumes the same holds for disk drives used as backups.

    Then again, we must remember that this disk drive was inside a safe.  That already suggests that something valuable is stored in it.  Under the circumstances, what are the chances that the thief will ignore the suggestion that it's worth his while to see what's in it?

    And, if he does, then the odds of a data breach are not really slim to none: freely available software from the internet can be used to scan a disk's contents for particular information, like Social Security numbers (either as a pattern of 000-00-0000 or as a string of 9 digits).
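    The kind of scan described above is trivial to reproduce.  Here's a minimal sketch (my own illustration with made-up sample data, not any particular tool) of how off-the-shelf software hunts for SSN-shaped strings:

```python
import re

# Illustration of the kind of scan described above: hunt for
# SSN-shaped strings (000-00-0000 or a bare run of 9 digits).
# The sample text is made up.
SSN_PATTERN = re.compile(r"\b(?:\d{3}-\d{2}-\d{4}|\d{9})\b")

def find_ssn_candidates(text):
    """Return every substring that looks like a Social Security number."""
    return SSN_PATTERN.findall(text)

sample = "Customer: J. Doe, DOB 01/02/1960, SSN 123-45-6789, ID 987654321"
print(find_ssn_candidates(sample))  # ['123-45-6789', '987654321']
```

    Real scanning tools run essentially this check against every file (and deleted-file remnant) on a drive, which is why "you'd need to know the software package" is cold comfort.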

    Only in the event that encryption is used can one confidently declare that a particular breach is nearly riskless.

    HIPAA Data Breaches and Unreasonable Delays: You (Don't Really) Have 60 Days to Report It

    One of the more misinformed statements I've read is the following:
    Asked why the delay [a little over one month], Sears spokesperson Shannelle Armstrong-Fowler pointed out that the chain moved much more quickly than the law requires. "Under HIPAA guidelines, 60 days are available for a health care entity to investigate and report on a potential breach. We completed our investigation and notified customers in approximately thirty days," she said.
    This is entirely correct as well as partially true (what, you say?  That sounds like a contradiction?  Read on).  As the Department of Health and Human Services (HHS) has pointed out in various publications, a breached entity must contact affected patients within 60 calendar days.  However, it has also noted that the HIPAA-covered entity must contact patients as soon as possible.  In a previous post (Does HIPAA / HITECH Really Give You 60 Days For Patient Notification?), I wrote the following:
    It behooves administrators for a HIPAA-covered entity to take a good look at the HHS's opinions on the matter of data breaches and notifications.  The 60-day limit is an "upper limit" and covered entities are expected to contact patients ASAP.
    and supported the argument by noting the following passages from the Federal Register:
        "...if a covered entity learns of an impermissible use or disclosure but unreasonably allows the investigation to lag for 30 days, this would constitute an unreasonable delay."

        "...if a covered entity has compiled the information necessary to provide notification to individuals on day 10 but waits until day 60 to send the notifications, it would constitute an unreasonable delay despite the fact that the covered entity has provided notification within 60 days."
    If the HHS Office for Civil Rights (OCR) were to conduct an audit and find that Kmart had unnecessarily delayed contacting patients, it could mean severe legal repercussions for the retailer.  Under HIPAA, 60 days is not really 60 days.

    I'm no PR expert, but it seems to me that the spokeswoman should have focused on stating that the company had to conduct an investigation, couldn't finish it any sooner, and notified its customers as soon as possible.

    Of course, when you consider that the stolen disk affected 788 Kmart customers, one wonders whether they couldn't have been notified any sooner, and whether 30 days was really necessary.  I've certainly seen situations where even more people were affected and notification letters were sent in a couple of weeks.

    On the other hand, I've seen the inverse as well.  The trick, it seems, is to design your systems under the assumption that a data breach will occur.  Doing so forces you to put processes in place for a quick recovery.

    For example, the reporting engine in AlertBoot Mobile Security allows one to easily generate mobile security audit and incident reports.  It's used by many of our clients to prove compliance with laws and regulations in the event a mobile device (like a smartphone or a tablet) or a laptop computer is lost or stolen.



  • US Fifth Amendment Rights: Suspect Cannot Be Compelled To Surrender Encryption Password

    The United States' Fifth Amendment and encryption software like AlertBoot have a complicated relationship.  The question is: can the government force you to reveal your encrypted data?  The answer: it's complicated and depends on the situation.

    However, it looks like things are beginning to converge towards certain key ideas.  While nothing will be definitive until the issue is addressed by the highest courts in the nation (and not for lack of trying), a handful of cases are making it clearer when forcing a suspect to give up a password, or to provide decrypted data, would violate a person's Fifth Amendment rights (and when it wouldn't).

    In Re The Decryption of a Seized Data Storage System, 13-M-449 (E.D. Wis. 2013)

    In re The Decryption of a Seized Data Storage System, the latest "encryption vs. the Fifth Amendment" case I've come across, a man is accused of storing child pornography on several encrypted computer hard drives.  The FBI, after unsuccessfully trying to gain access to these disks for four months, attempted to coerce the suspect into decrypting them.  The suspect refused, pointing out that doing so would violate his Fifth Amendment rights.

    So far, this is no different from the handful of past cases that dealt with the same issue: the John Doe case from 2012; the Fricosu case from 2011; and the Boucher case from 2009.  (Incidentally, including Seized Data Storage System, three out of the four cases involve child pornography; however, this area of law is important for all sorts of reasons.)

    Seized Data Storage System is different from the others because the judge in charge ruled that coercing the suspect into decrypting the data is a violation of his rights.  Of the three previous cases cited above, the John Doe case resulted in a win for the suspect, whereas the Boucher and Fricosu cases resulted in the courts ordering the suspects to provide decrypted data.

    Despite the different outcomes, it's now clear that those three cases were following established procedures and legal precedents.  This fourth case is just another data point that shows what's what.

    Data Encryption Software, US Fifth Amendment, Foregone Conclusion, and Act of Production

    While going over the Fricosu case a couple of years ago, I happened upon some material that explained how the government could legally coerce a defendant to produce evidence against himself without trampling on his rights.

    Now, I'm not a lawyer, but basically, it revolves around the doctrines of "foregone conclusion" and the "act of production."  For example, forcing a suspect to produce evidence does not violate his rights if the government already knows about the evidence.  The suspect may still refuse, but then he's in contempt of the court that issued the order.

    In the case of Seized Data Storage System, the judge concluded that the act of providing a password, either directly or indirectly, would work against the suspect.  It would give the government information whose existence it has no other way of confirming.  Hence, the Fifth Amendment protections kick in.

    (More specifically, the judge called it a "close call."  If you go over the Fricosu case, you might get an idea why: as far as I know, the defendants never admitted that they held data on their laptop.  A tape recording, however, revealed that the defendants had some kind of data that they wanted to keep away from the government's lawyers.)

    All of this is still in line with the courts' rulings over the past five years.  Unless something dramatic happens, it looks like the courts are basically in cruise control mode.

  • Canada Data Breaches: 3,000+ Cases Over 10 Years, Affects 725K

    Organizations around the world, both in the private and public sectors, are leveraging the use of technology to their advantage.  Take BYOD as an example: "bring your own device" initiatives are meant to reduce costs while increasing job satisfaction and worker efficiency.  There is a darker side to BYOD, however: losing sensitive and private data, which doesn't sound like a big whoop until something goes terribly wrong.  Because of the potential for data breaches, BYOD data security solutions and services like AlertBoot Mobile Security are not only a good idea, but can be a compliance requirement.

    The key word there is "can," though.  When you consider the value of personal data in the black market, or even to legitimate data brokers, one can only wonder why there aren't stricter laws addressing the issue of personal data security.  It's a complex situation and a simple answer isn't readily available.  However, a significant part of the answer could be that people have no idea how bad the situation is because it doesn't get reported.  Take into consideration the Canadian government's recent revelation.

    Over 725,000 Affected Over the Past 10 Years

    According to a document that was presented in Canada's Parliament, there were more than 3,000 data breaches in the past 10 years.  More than 725,000 Canadians were affected.

    However, less than 13% of data breaches were reported (the implication, I guess, is that they were supposed to be reported to the Canadian Privacy Commissioner).  Furthermore, there is a good chance that the 13% figure is inflated.  According to the same report, the government's list cannot possibly include all data breaches.  Hence, the 13% figure would actually be lower:
    For instance, the Canada Revenue Agency didn’t provide any numbers, saying that a search of the hard copy records of breaches would be too cumbersome to be completed.
    And those are instances of "known unknowns."  Imagine what the picture would look like if the veil of "unknown unknowns" were lifted as well.

    GIGO: Garbage In, Garbage Out

    If you were in charge of coming up with a policy and found that there were only 300 or so breaches over the past 10 years (as opposed to 3,000), would it affect how you approached the project?  Would it affect your conclusions on what needs to be done?  Would your calculations show that the use of certain information security solutions was not "cost effective"?

    My guess is that the answers to all of the above would be in the affirmative.

    The last question is especially interesting.  In this day and age, the bottom line tends to be the arbiter of whether something gets implemented.  Hence, many IT departments have attempted to calculate an ROI (return on investment) for data security products and services, including mobile device management and security services for securing devices used in BYOD programs.

    I should mention that such a calculation is an exercise in foolishness: information security is not an investment in the financial sense.  It will not produce money or any other type of financial asset; and, of course, just because it doesn't generate income doesn't mean it isn't worthwhile.

    For example, what's the ROI of a toilet?  None (unless you're a company that sells porcelain bowls).  Would your company be better off without toilets in the workplace?  Probably not.  While there isn't a return on investment, there certainly is a return in some kind of value.

    All of this being said, if one is going to do some calculations, it still behooves one to use data that is as accurate and as precise as possible.  A BYOD security program that costs the company $10,000 looks very different depending on whether it's meant to help prevent 300 data breaches or 3,000 of them.
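    To make the point concrete, here's the back-of-the-envelope arithmetic (the $10,000 figure is from the example above; the two breach counts are the report's figure and the scenario where 90% go uncounted):

```python
# Back-of-the-envelope math for the point above: the same $10,000
# program looks very different depending on which breach count the
# statistics lead you to expect. Figures are from this post's example.
PROGRAM_COST = 10_000  # annual cost of the hypothetical security program

def cost_per_breach_prevented(expected_breaches):
    """Dollars of program cost per breach you hope to prevent."""
    return PROGRAM_COST / expected_breaches

print(round(cost_per_breach_prevented(3_000), 2))  # full figure: 3.33
print(round(cost_per_breach_prevented(300), 2))    # undercounted: 33.33
```

    Same program, same threats, but the undercounted statistics make prevention look ten times less cost effective: garbage in, garbage out.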

    The report to Canada's Parliament could very well explain why there isn't more being done to protect sensitive data at the federal level, and why Canada's been experiencing increasingly bigger data breaches.

  • Personal Data Breach: Consumer Churn Rate Directly Tied To Infosec Events Is Significant

    A global study has revealed that personal data breaches lead sizable numbers of customers to turn their backs on companies.  This might not be news, but perhaps the figures are: 23% of the respondents affirmed that they have stopped doing business with companies that failed to properly safeguard their data.  All the more reason why a company should up the security ante by using some kind of data protection solution like AlertBoot (especially in this age of BYOD).

    We Will vs. We Have

    As the author covering the study noted, there is a tremendous difference between what people claim they will do vs. what they actually end up doing.  To account for this discrepancy, the authors of the study, by the Economist Intelligence Unit, asked the following (my own paraphrase):
    • Would you stop doing business with an organization that breached your data?
    • Have you actually suffered from a data breach, and if so, did you stop doing business with the company that experienced the data breach?

    To the former, 32% of the respondents answered in the affirmative.  To the latter, 38% answered in the affirmative.

    This is a very curious outcome.  Generally speaking, the latter tends to be lower than the former.  That is, there are always more people that say they will do something, in contrast to those who actually do something.  Hark back to New Year resolutions, for example: you'll always have more people who promise to lose weight, or to read more, or to procrastinate less; how many keep that promise, though?

    What does this unexpected finding mean?  Off the top of my head, it seems to indicate that it's only after they've become victims of a data breach that people realize the severity of the situation.

    Spillover Effect

    Not only that, it turns out that there are further ramifications:
    the EIU research also found that 46% of respondents that had suffered a data breach had advised friends and family to be careful of sharing data with the organization.
    Many companies look to get their products to "go viral" or make it spread via word of mouth, knowing that recommendations from friends, family, and acquaintances carry more weight than any marketing campaign some guys in an office can create.

    Imagine, then, the disastrous effects the above could have on a company.

    Nip It in the Bud because It's a Drop in the Bucket

    An ounce of prevention is worth a pound of cure; so goes the old saying.  Nowadays, I'm under the impression that the value of the cure is much, much higher.

    Consider all the things that could go wrong by not employing, say, a BYOD security solution like AlertBoot Mobile Security.  Assume that you can get the service for $100 per year, per device (it's actually much more cost effective, but I like easy numbers to work with).

    Also, assume you've got 100 employees who opt to bring in their smartphones and tablets to use at work.  This means you'd be spending $10,000 per year on what appears to be a bottomless pit.  After all, it's not as if security threats are going away any time soon.  Ten large sounds like a big number.

    But what about the flipside of the coin?
    • There's the approximately one-third of your customers who will not be doing business with you in the foreseeable future.  What does that translate to in lost revenue?
    • Your marketing will see a drop in ROI as you work harder to bring in new clients to replace the ones you've lost.  That's money you wouldn't have needed to spend if you'd had proper security, spent on an activity whose efficiency is debatable.
    • Depending on which sector your business is in (e.g., finance, healthcare), you might have to incur the costs of an audit, internal as well as external (by the government, such as a HIPAA audit by OCR).  These easily run into the five figures, at least.
    • Reaching out to "breachees."  Most state and federal laws that govern personal data require that first-class mail (or equivalent) be used.  If the breach involves 200,000 people and each letter costs $0.25 to mail, that's $50,000 you're spending to shoot yourself in the foot.  That cost doesn't include the lost productivity as your employees work to help you shoot yourself in the foot.
    • Why do I keep writing that "you're shooting yourself in the foot"?  Because around 33% of the people you're reaching out to will probably turn their backs on you, per the survey.
    • Lawsuits.  'Nough said.

    No doubt there is more to the flipside of the coin; I've just run out of time to list them all.  What would all of this cost?  Depends on the size of the breach, but it could very well be in the millions of dollars.
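    As a rough sketch, here's what such a tally might look like.  Only the mailing figure comes from the numbers above (200,000 letters at $0.25 each); the other line items are placeholder assumptions for illustration:

```python
# Rough tally of the "flipside" costs above. Only the mailing line
# comes from this post (200,000 letters at $0.25 each); the other
# entries are placeholder assumptions for illustration.
breach_costs = {
    "notification_mailing": 200_000 * 0.25,  # $50,000, per the post
    "external_audit": 50_000,                # assumed: "easily five figures"
    "credit_monitoring": 150_000,            # assumed, varies widely
}

total = sum(breach_costs.values())
print(f"Rough breach-response total: ${total:,.0f}")
```

    Even with conservative placeholders, the total dwarfs the $10,000-per-year prevention figure from the earlier example, and it ignores lost customers and lawsuits entirely.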

    For example, BCBS of Tennessee saw its data breach costs soar to $7 million when 220,000 patients were affected by a data breach.  By the end of the whole ordeal, they had spent nearly $10 million on "contacting members affected, investigating the theft, and offering free credit protection".

    And this is before the fine that OCR levied on them for breaching HIPAA (technically, BCBS settled for $1.5 million, which is the maximum penalty that OCR can assess), or the reputational damage they took.

    Or the security solutions they ended up adding into their risk prevention portfolio.

  • HHS Laptop Encryption: Arizona Counseling and Treatment Services Announces Data Breach

    Dissent brings us news that Arizona Counseling and Treatment Services, located in the city of Yuma, has announced the theft of an employee's laptop computer containing personal patient information.  Although it's not spelled out, it appears pretty evident that laptop disk encryption for protecting PHI, like AlertBoot, was not used.

    When it comes to PHI security, encryption software is not required per HIPAA/HITECH (that is, its use is not mandated).  However, more often than not, it is recommended for a number of reasons, which we'll explore further below.

    Arizona Counseling and Treatment Services: Was Encryption Used?

    The general counsel and spokeswoman for Arizona Counseling and Treatment Services (ACTS) made the following facts available:
    • An employee's laptop and external hard drive, both containing patient data, were swiped during a home burglary between March 18 and March 25.
    • The laptop was loaded with tracking software.  (The hard drive was not, but that's to be expected: an external hard drive lacks the components to do things like report its location.)
    • Neither device has been recovered to date.
    • Breached data includes names, dates of birth, treatment plans but no Social Security numbers or financial information.
    • Patients will be notified and offered help with credit monitoring.
    • Over 500 patients were served between 2011 and 2013.
    • A public notice will be made "because of the size of the breach".

    The last two bullets are especially interesting, in my opinion.

    Notice how it's not revealed how many people are affected, only that over 500 patients were involved.  The significance of that number lies in the HIPAA/HITECH requirements.  If more than 500 individuals are affected in the course of a HIPAA data breach, the medical "covered entity" must notify the Department of Health and Human Services (HHS) within 60 calendar days.

    Affected patients must also be individually notified within the same period; however, in the event that the covered entity doesn't know how to reach all of them, a public notice must be made (also known as a substitute notice).  If over 500 people are affected, the public notice becomes a requirement.

    These two points also imply that encryption was not used, since the use of encryption voids the above requirements.
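    The notification logic described above can be summarized as a simple decision function.  This is my own sketch of the rules as laid out in this post, simplified from the actual regulation and certainly not legal advice:

```python
# Sketch (not legal advice) of the breach-notification logic described
# in this post, simplified from the actual HIPAA/HITECH regulation.
def hipaa_notifications(affected_count, encrypted, all_reachable=True):
    """Return the notices a covered entity owes after losing PHI."""
    if encrypted:
        return []  # safe harbor: encrypted PHI is not a reportable breach
    notices = ["individual notice, ASAP and within 60 calendar days"]
    if not all_reachable:
        notices.append("substitute (public) notice for unreachable individuals")
    if affected_count > 500:
        notices.append("notify HHS within 60 calendar days")
        notices.append("public notice")
    return notices

# Over 500 affected, no encryption: individual, HHS, and public notices.
print(hipaa_notifications(501, encrypted=False))
```

    Note how the `encrypted` branch short-circuits everything else; that's the safe harbor discussed in the next section.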

    Why Use Encryption for PHI Data?

    The reasons for using medical data encryption software are myriad.  Chief among them: it will protect your patients' data.  It really will.  Encryption that has been FIPS-validated is secure enough to protect the government's sensitive information; it certainly will go a long way when it comes to medical information.

    There are a number of other benefits for those who fall under the HIPAA/HITECH umbrella, however.  First, the use of encryption provides safe harbor from the Breach Notification Rule.  If the device holding PHI was lost or stolen, nobody has to be informed of the event because the information is secure.

    This "get out of jail" card extends to the HHS, which has its hands full with actual data breaches (my emphasis):
    In a conversation with a spokesperson from HHS this week, I learned that despite HHS’s previous statements to me that it investigates all breach reports, it turns out that the decision to investigate is made by regional directors. Although HHS’s original intention was to investigate all breaches, the sheer number of breach reports and the lack of adequate resources resulted in a change in their policy.
    Basically, if it's encrypted, it's not a data breach.

    Second, it could make your risk assessments a piece of cake.  A security risk assessment happens to be an integral part of being in compliance with HIPAA rules.  There are many companies out there that perform these assessments and decide that encryption is not in the cards.  For example, perhaps their computer is (a) in a room that is not easily accessible by the public, (b) chained to an immobile object, like a very heavy desk, and (c) protected with unique IDs and password-protection (not encryption) for each user that accesses the computer.

    The problem with the above is that there isn't a guarantee that the computer won't be stolen (burglary).  Now, if encryption is used, there is no argument: this is not a HIPAA breach.  If encryption is not used, then you have to go through an investigation to determine that, indeed, it's not a HIPAA breach.  Or, maybe, a different conclusion will be reached based on actuality or a technicality.

    It's not for nothing that, not too long ago, the OCR director made this remark:
    "We love encryption, and those who use encryption love it, too," Office for Civil Rights Director Leon Rodriguez said. "In the event of a breach, using encryption assures that that information is unreadable, unusable or undecipherable, which, basically, would qualify that entity for the safe harbors under our breach notification rule."


  • Financial Data BYOD: Investment Industry Regulatory Organization of Canada Loses Info On 50K

    According to Canadian media, the Investment Industry Regulatory Organization of Canada (IIROC) has lost a "portable device" that contained information on over 50,000 people.  The IIROC has not been very forthcoming with the details, including whether the device was protected with mobile data management software like AlertBoot.  However, we know this much: they're "very sorry."

    52,000 Clients of 32 Brokerage Firms Affected

    According to media reports, the IIROC has blamed itself for the "unfortunate but isolated incident" and has promised to strengthen its internal controls so that the situation does not recur.

    The regulator's spokeswoman noted that the IIROC does not want to make public details about the case (and make things worse):
    "We are concerned that disclosing details of the incident may put clients' information at greater risk of being targeted for unauthorized use," she said. "We have communicated with all affected firms and are notifying their clients whose information was on the device."
    Maybe it's just me, but these do not sound like the words of a confident organization that knows its data is secure, despite not exactly knowing the data's current whereabouts.  Could this be indicative of a situation where the lost device was not encrypted?

    This would not be the first (or last) time that something like this has happened.  The loss of USB drives and external hard drives has accounted for hundreds of public data breaches around the world.  You can bet that many more go unreported.

    The combination of "extremely portable" and "high capacity," compounded with people's inability to delete data – it's always easier to keep it around if you've got lots of storage space left, which is why my web-browser bookmarks point to YouTube clips that don't exist anymore – creates a potent and poisonous mix that will lead to a data breach, sooner or later.

    Our Recommendation: Control and Encrypt

    The best way to ensure that a portable device doesn't turn into a data breach is to not use one.  You might think this is easier said than done, but it isn't, in a way: there are companies that prevent the use of USB flash drives and the like by gluing a penny over each USB port (my guess is that they're big into Bluetooth keyboards and mice).

    Most companies, however, benefit from the use of their USB ports.  But keeping them open and accessible also means that an employee could use his own USB stick to copy data.  What to do?  At AlertBoot, we recommend controlling where a USB device can be used, and making sure that it's encrypted.

    First, the use of encryption software will ensure that there is no unauthorized access when and if the device goes missing.  Second, you can control where and how the device can be used by ensuring it doesn't work on unauthorized computers.  Under the AlertBoot solution for full disk encryption, a USB storage device can only be shared with computers that are part of a trusted group.

    So, for example, a USB device will work on the computers lined up at the front of a room, but not on those at the back of the same room (there, the device would show up as unformatted, thanks to encryption).  It's just a matter of how you group the computers: by department, by team, by floor, etc.
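    Conceptually, the grouping boils down to a membership check before a drive's decryption key is released.  The following is a toy illustration of the idea only (not AlertBoot's actual mechanism; the group and machine names are made up):

```python
# Toy illustration of the trusted-group idea (not AlertBoot's actual
# mechanism): a machine can unlock a drive only if it belongs to the
# same group the drive was assigned to. All names are made up.
TRUSTED_GROUPS = {
    "front-row": {"pc-01", "pc-02"},
    "back-row": {"pc-11", "pc-12"},
}

def can_unlock(drive_group, machine_id):
    """True if the machine is enrolled in the drive's trusted group."""
    return machine_id in TRUSTED_GROUPS.get(drive_group, set())

print(can_unlock("front-row", "pc-01"))  # True: same group
print(can_unlock("front-row", "pc-11"))  # False: sees "unformatted" disk
```

    In a real deployment the check would be enforced cryptographically (the key simply isn't available outside the group), but the grouping logic is the same.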
