AlertBoot offers a cloud-based data and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

AlertBoot Endpoint Security

  • HIPAA Business Associate: Not Having A Written Agreement Is Grounds For Reporting A Data Breach

    When it comes to preventing HIPAA data breaches, one of the best safeguards is PHI encryption software.  However, the HIPAA Security Rule has so many aspects that things sometimes get confusing.  For example, what happens if you violate one HIPAA rule while you have encryption in place?  Under most scenarios, you should be protected by the safe harbor clause.

    But the Berea College Health Services (BCHS) case shows that it may not be so simple.

    The Non Data Breach

    A relatively interesting data security violation has come to light.  Berea College in Kentucky has notified patients of BCHS that they were involved in a HIPAA breach.  Apparently, a billing contractor had gotten hold of and used BCHS patient information, exactly as intended.  It nevertheless counted as a data breach, because there wasn't a written business associate agreement between the two:
    Although this contractor had access to medical records, including names, addresses, dates of births, insurance numbers, social security numbers, and diagnosis and treatment information, BCHS has no reason to believe that any patient information has been misused or disclosed inappropriately. We did not have a written agreement in place because BCHS failed to request it. The contractor has advised us that patient health information was used and disclosed only for BCHS billing and for no other purpose, and we have been assured that the contractor has returned to BCHS or destroyed any patient information that she might have accessed. Nevertheless, we are obligated to notify you of this issue.
    There is no reason to believe that there was any foul play involving PHI.  Indeed, if the notification letter is to be believed, the only transgression is the lack of a formal agreement.  I also noticed that the failure to encrypt PHI data went unmentioned, leading me to believe that everything was taken care of in that area.

    Lack of Agreement Trumps Safe Harbor?

    The HHS Office for Civil Rights has made it clear over the years: encrypt your data and you're protected.  There are caveats, though.  For example, the encryption used must be something that NIST has approved, or would be likely to approve...and since that last one is never a sure thing, the former is the only sure-fire option.

    Does the situation with BCHS mean that data encryption does not provide as much safe harbor as people are led to believe?  Or perhaps BCHS was being a little too cautious?  After all, there's nothing forbidding a covered entity from issuing a letter of apology even if they don't have to.

    My own conclusion is this: at the most fundamental level, BCHS has run into one of those caveats regarding encryption and safe harbor.

    You see, even if the data was sent to the business associate in encrypted form, and was stored in an encrypted format while she was working with the data, she accessed it.  She had to if she was going to work with the information.  But without a formal agreement, she was technically an unauthorized third party and shouldn't have had access to the information.

    In other words, the encryption was effectively breached.  Encryption safe harbor becomes a moot point the moment an unauthorized party can actually read the data.  While BCHS is not dealing with a hacker, the lack of a formal agreement made the contractor, technically, just such a party, putting the college in a similar situation.

    The moral of this story?  Make sure all your t's are crossed and i's are dotted, literally as well as figuratively.

  • Data Breaches: UK ICO Declines To Investigate Supposed Santander Email Breach

    The Information Commissioner's Office in the United Kingdom has declined to investigate Santander, the Spanish banking group, for a purported data breach.  According to reports, people who've set up email addresses that are used strictly for correspondence with Santander are being spammed with junk mail, lending credence to the theory that the bank's database was breached.

    The ICO, however, notes that there isn't enough evidence of a data breach.  Santander, for its part, has only stated that it is conducting an investigation into the allegations but has not uncovered a data breach to date.  That statement, however, was made back in December 2013.

    The Evidence

    It wasn't only Santander, either.  According to the same reports, email addresses registered with the UK Government Gateway and NatWest FastPay were also affected.

    Some of the spam emails include the last name of the recipients, information that is not present in the email address itself, indicating that a database tying email addresses to personal information must have been breached.  (The other unsubstantiated accusation is that the information was sold by the bank to third parties.)

    The Counterargument

    The problem, of course, is that none of this is smoking-gun evidence.  It's not unusual for people to set up a free email address intending to use it for one thing but end up using it for something else.  I do it myself: I don't appreciate the spam that comes from legitimate businesses I sign up with, and I'd rather keep my personal inbox uncluttered without having to set up filters and whatnot.

    Then, there is the possibility that a third party was breached.  Santander may not have sold the information, but EULAs usually contain a clause allowing information to be shared with partners.  What if a partner was breached?  Of course, under most legal statutes Santander would still be on the hook, but it wouldn't really be the one breached.  If you're looking for a remedy, you won't find it by quizzing and prodding Santander.

    Last but not least, there is always the chance that the email provider was hacked.  Of course, this scenario is less likely under these specific circumstances seeing how all the complaints have one thing in common: Santander.

    Is the ICO Capitulating?

    Has the Commissioner decided to give Santander a break...or worse, bowed to pressure?  I don't think so.  The evidence, a unique email address combined with a last name, is quite tenuous.  If that were enough to identify the "breachee," then an argument could be made that an IP address is enough to identify an internet user; we all know the latter is not quite right.  Neither is the former.



  • HIPAA Desktop Encryption: Sutherland Healthcare Solutions Breach Affects 340,000, Reward Offered

    Sutherland Healthcare Solutions (SHS), a billing contractor for Los Angeles County, has offered a reward of $25,000 for the return of computers stolen from its offices.  The data breach was initially reported as affecting approximately 170,000 people; the number has been revised to 338,700.  All of this because HIPAA desktop encryption was not used to properly protect PHI.

    Eight Desktop Computers Stolen.  What About HIPAA?

    Previous reports on the SHS breach were vague on the details.  Further reporting two months down the line shows that the computers stolen from SHS offices are "computer towers," more specifically HP Pro 3400s.  According to the specs, the dimensions of this particular computer are 368 x 165 x 389 mm (or 14.5 x 6.5 x 15.3 inches), and it weighs a little under 16 pounds.  In other words, it's the size of a big encyclopedia volume.

    Installing HIPAA data encryption software is a cinch.  And, the use of data encryption provides safe harbor from HIPAA's Breach Notification Rule.  So, why were these computers not protected?

    The argument is often made that desktop computers do not need encryption because (a) HIPAA technically doesn't require the use of encryption and (b) desktop computers are not easily stolen.  Furthermore, the argument goes, such a theft would be incredibly easy to spot, so it could be stopped as it happened.

    Except that that is not how it usually unfolds.  The article covering the breach shows one man who's suspected of stealing the computers.  In the individual frames of the surveillance footage that were made available, he's holding a black bag that was undoubtedly used to move the desktops out, one by one.

    He probably made eight trips, at least – earlier reports noted that computer monitors were also stolen – meaning that there were at least eight individual instances where, in theory, he could have been stopped.  Anecdote may not be proof, but instances where desktop computers are stolen from offices are so common that the myth of "desktop computers cannot be easily stolen" should die a fiery death.

    Is Encryption Really "Not Required"?

    Now that we've covered aspect (b) of the argument, let's turn our eyes to aspect (a) of the "desktop computers do not need encryption" argument.

    Is encryption really not required under HIPAA rules?  Technically, no.  Under the HIPAA Security Rule, the use of encryption is an "addressable" implementation specification, not a required one.  However, "addressable" under HIPAA differs from the layperson's definition: it really means "required unless you can prove that something else will work just as well."

    Consider this official HHS answer on whether encryption is mandatory under the Security Rule:
    No. The final Security Rule made the use of encryption an addressable implementation specification...and must therefore be implemented if, after a risk assessment, the entity has determined that the specification is a reasonable and appropriate safeguard in its risk management of the confidentiality, integrity and availability of e-PHI. If the entity decides that the addressable implementation specification is not reasonable and appropriate, it must document that determination and implement an equivalent alternative measure, presuming that the alternative is reasonable and appropriate. If the standard can otherwise be met, the covered entity may choose to not implement the implementation specification or any equivalent alternative measure and document the rationale for this decision.
    As you can see from the above, encryption is not required...but then you must implement an "equivalent alternative measure" to secure the data.  People are conflating "encryption is not mandatory" with "data security is not mandatory."  The latter is always required, the former not...but, then again, encryption is effectively required if one wants the safe harbor under the Breach Notification Rule, since only encryption and data destruction qualify.
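
    The decision flow in the HHS guidance quoted above can be sketched as a short function.  This is my own illustrative model of the quoted text, not anything from the regulation itself; the function and parameter names are invented.

```python
def addressable_spec_action(reasonable_and_appropriate: bool,
                            alternative_is_reasonable: bool,
                            standard_otherwise_met: bool) -> str:
    """Toy model of HIPAA's 'addressable' decision flow for encryption."""
    if reasonable_and_appropriate:
        # The risk assessment says the safeguard fits: it must be implemented.
        return "implement encryption"
    if alternative_is_reasonable:
        # Document why encryption was rejected and deploy the equivalent.
        return "document determination; implement equivalent alternative"
    if standard_otherwise_met:
        # Neither measure is needed, but the rationale must be documented.
        return "document rationale; implement neither"
    return "non-compliant"

# A covered entity whose risk assessment flags encryption as appropriate:
print(addressable_spec_action(True, False, False))  # implement encryption
```

    Note that every branch except the last ends in either "implement" or "document": under this reading, doing nothing silently is never an option.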

  • Kentucky Data Breach Law Signed

    The number of US states without a data breach notification law has dropped to three.  According to reports, Kentucky is the latest state to sign a bill aimed at protecting the personal data of Kentuckians.  As in many similar state laws, the use of data encryption provides safe harbor from reporting data breaches to consumers.

    Safe Harbor, Personal Information Defined

    Like many state laws concerning data security and data privacy, the law makes exceptions for information protected with encryption software.  First, a "breach of the security system" is defined as:
    unauthorized acquisition of unencrypted and unredacted computerized data that compromises the security, confidentiality, or integrity of personally identifiable information maintained by the information holder as part of a database regarding multiple individuals that actually causes, or leads the information holder to reasonably believe has caused or will cause, identity theft or fraud against any resident of the Commonwealth of Kentucky
    The one twist I can immediately make out is that the law requires the breach to be directly linked to ID theft, or to lead the "information holder to reasonably believe" it will happen.  I can understand the need to set limits (after all, most data breaches fizzle out with nothing happening), but the latter requirement puts the fox in charge of the hen house.  Wouldn't it be in most information holders' interest to believe that ID theft is not in the cards when data is lost or stolen?

    Second, the law clearly states that the breach of unencrypted data will be followed with a notification "in the most expedient time possible and without unreasonable delay."  The logical conclusion is that information that is encrypted does not require a data breach notification (which is only natural, seeing how the breach of a security system has been defined).
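
    Taken together, the two quoted provisions reduce to a simple predicate: notification is owed only for unauthorized acquisition of data that is both unencrypted and unredacted, and only when the holder reasonably believes ID theft or fraud will follow.  A minimal sketch of that logic (names are mine, not the statute's):

```python
def kentucky_notification_required(encrypted: bool,
                                   redacted: bool,
                                   unauthorized_acquisition: bool,
                                   reasonable_belief_of_id_theft: bool) -> bool:
    """Toy model of the quoted 'breach of the security system' test."""
    if not unauthorized_acquisition:
        return False  # no acquisition, no breach
    if encrypted or redacted:
        return False  # safe harbor: outside the statutory definition
    # The remaining hinge is the holder's own belief -- the "fox" problem.
    return reasonable_belief_of_id_theft

# A stolen laptop holding encrypted data triggers no consumer notification:
print(kentucky_notification_required(True, False, True, True))  # False
```

    Writing it out this way makes the fox-and-henhouse worry plain: with unencrypted data, the final return value is controlled entirely by what the information holder claims to believe.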

    Student Data Also Protected

    Being at the tail-end of the breach legislation game has its own rewards.  The Kentucky legislature has made it a point to ensure that student data is protected.  Among other things, it is now illegal to "process student data for any purpose other than providing, improving, developing, or maintaining the integrity of its cloud computing service."

    This is no doubt directed at certain services that have acknowledged data-mining student information for profit, financial or otherwise.

    No Breach Law, More Expensive Insurance Policies

    An interesting factoid that I learned while reading about Kentucky's data breach law:
    insurance companies were charging Kentuckians more for cyber-security policies in the absence of any state laws requiring such notification after incidents such as the Target and Neiman Marcus data breaches.
    I cannot even begin to fathom why this would be so, but apparently it's a thing.  Assuming this has a causal link with legislation, I guess this is another reason why the US should have a federal data breach law.

  • Canada Digital Privacy Act: $100,000 Fine For Not Reporting Data Breaches

    Canada is set to introduce a new bill that would make it illegal not to report data breaches to the people affected by them, or to fail to report said breaches to the Privacy Commissioner.  The new law, called the Digital Privacy Act, would be a much-needed amendment to PIPEDA (the Personal Information Protection and Electronic Documents Act), which already provides guidelines on things like the use of data encryption for protecting personal data.

    The big element in this news is that fines of up to $100,000 could be handed out by the Privacy Commissioner's Office.  Where there's a stick, however, you should look for the carrot.

    Encryption Software and Other Security Tools Get Boost

    Data encryption has been pointed out as a means of protecting personal information from data breaches under the PIPEDA.

    Among other things, the Digital Privacy Act appears to reinforce the need to use such tools. (Does this come as a surprise, though?  We are living in the digital age, after all).  For example, the following definition is being added to PIPEDA:
    "breach of security safeguards" means the loss of, unauthorized access to or unauthorized disclosure of personal information resulting from a breach of an organization's security safeguards that are referred to in clause 4.7 of Schedule 1 or from a failure to establish those safeguards.
    Why would this be a thumbs-up for full disk encryption, mobile device management, and other forms of data security?  Because these tools are expressly designed to prevent the loss, unauthorized access, or unauthorized disclosure of personal information.  Encryption is a security safeguard (which the original PIPEDA legislation makes abundantly clear).  And because an encrypted computer stays encrypted, the safeguard remains in place even if the laptop is lost.  Result: not a breach of security safeguards.

    This ties in to the following requirement introduced by the new bill:
    An organization shall report to the Commissioner any breach of security safeguards involving personal information under its control if it is reasonable in the circumstances to believe that the breach creates a real risk of significant harm to an individual.
    With "significant harm" defined as:
    For the purpose of this section, "significant harm" includes bodily harm, humiliation, damage to reputation or relationships, loss of employment, business or professional opportunities, financial loss, identity theft, negative effects on the credit record and damage to or loss of property.
    The one avant-garde aspect of this Act is that it classifies identity theft and effects on credit records as forms of "harm".  As far as I know, this is the only instance in the world so far where this is so.  In the US, for example, the effect of a data breach on a person's credit record is not exactly viewed as a cognizable harm.

    Questionable Loophole

    It's not all milk and honey when it comes to the Act, though.  Many people who have experience with "harm threshold" clauses are probably not going to be too crazy about the "creates a real risk of significant harm" portion.  After all, who determines what a "real risk" is?  However, the Canadians have thought of everything:
    The factors that are relevant to determining whether a breach of security safeguards creates a real risk of significant harm to the individual include

    (a) the sensitivity of the personal information involved in the breach;
    (b) the probability that the personal information has been, is being or will be misused; and
    (c) any other prescribed factor.
    The use of encryption software virtually eliminates the real risk of significant harm; failing that, I guess a company could argue its case to the Commissioner.
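
    The statutory factors could be modeled as a toy scoring function.  To be clear, the numeric scales and threshold below are entirely my own invention; the point is only to show how encryption drives the probability of misuse, and hence the "real risk", toward zero.

```python
def real_risk_of_significant_harm(sensitivity: float,
                                  misuse_probability: float,
                                  threshold: float = 0.25) -> bool:
    """Toy model of the quoted 'real risk' factors (a) and (b).
    Both inputs are on an invented 0.0-1.0 scale; the threshold is arbitrary."""
    return sensitivity * misuse_probability >= threshold

# Highly sensitive health data on an unencrypted flash drive:
print(real_risk_of_significant_harm(0.9, 0.8))   # True: report to the Commissioner

# The same data on a properly encrypted laptop: misuse is improbable.
print(real_risk_of_significant_harm(0.9, 0.01))  # False
```

    Factor (c), "any other prescribed factor," is deliberately open-ended, which is exactly why a company that skipped encryption could still try to argue its case.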

  • HIPAA Security: Michigan Department of Community Health Data Breach Affects 2,595

    One of the most unfortunate types of data breach cases I come across is the kind where PHI encryption for laptops was used but a HIPAA data breach resulted anyway, like the following one at the Michigan Department of Community Health.

    According to the breach notice, an employee of the State Long Term Care Ombudsman’s Office experienced a burglary.  The thief took a laptop computer and a flash drive.  The former was protected with encryption software, as many covered entities have done in light of the final Omnibus Rule.

    The flash drive, however, was not encrypted.  A total of 2,595 people, living and deceased, were affected by this latest PHI breach.

    Personal Data Stolen, Laptop and Drive Not Recovered

    The stolen information included people’s names, addresses, dates of birth, SSNs, and Medicaid IDs (though not every individual had all of these exposed).  The burglary occurred around January 30, with MDCH learning about it on February 3.  As of the press release announcing the data breach, dated April 3, the devices had not been recovered.

    And chances are that they won’t be.  For the flash drive, the odds of recovery are close to nil.  For the laptop, assuming some other type of laptop security software was employed in addition to the encryption -- such as a tracker -- the odds of recovery are higher but can still be pretty low.

    For example, the use of Absolute Software trackers could lead to a 75% recovery rate, if you believe the manufacturer’s claims.  The caveats here are that (a) we have absolutely no idea how long it takes to recover a device (is it a day, a week, a month?); (b) it’s not foolproof: a Faraday cage-like enclosure, or simply going into a basement, may be enough to defeat the technology; and (c) recovery falls outside of the safe harbor requirements under HIPAA, which only the use of encryption (or data destruction) can trigger.

    Still, the technology is much more impressive than conventional tracking software for laptops, which generally “tracks” only the last known IP address (these devices don’t come with a GPS module) and is far less accurate at pinpointing a device’s location.

    Full Disk Encryption Has Blind Spots

    With HHS (and its OCR branch) heavily promoting the use of HIPAA encryption, it’s kind of hard for the layperson to understand what went wrong here.  Encryption appears to have been used, but a data breach happened anyway.  Yes, the flash drive was at fault, but...wasn’t the data encrypted when it was transferred from the computer?

    In order to make sense of what’s going on here, one needs to understand the basic concepts of the underlying technology.  There are many different types of encryption.  There is full disk encryption, which was probably used to protect the laptop’s contents.  As the name implies, full disk encryption (usually abbreviated to FDE) encrypts the entire contents of a hard drive.  To be more specific, it encrypts the hard drive itself; because the data is stored on the drive, it ends up protected, too.

    This distinction is very important, as it explains why the flash drive’s contents were not protected.  Since it’s the hard drive that’s encrypted and not the data itself, when you copy data to another storage device, such as a flash drive, that information is no longer encrypted.

    Is there a way to encrypt the data itself?  Absolutely.  Technologies such as file encryption do exactly that.  File encryption is generally a less optimal way of protecting an entire device’s data, so the two tend to work in tandem: FDE for the laptop, file encryption for any files making it off of the laptop.  This way, one of the top weaknesses of FDE is shored up.
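
    To make the FDE blind spot concrete, here is a toy file-encryption sketch.  The hash-based XOR keystream below is for illustration only and is emphatically not production-grade cryptography (real file encryption products use vetted ciphers such as AES).  The point is simply that file-level encryption travels with the data, while FDE protection stops at the disk.

```python
import hashlib

def _keystream(key: bytes, length: int) -> bytes:
    """Toy keystream: SHA-256 over key||counter blocks (illustration only)."""
    blocks, counter = [], 0
    while sum(len(b) for b in blocks) < length:
        blocks.append(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return b"".join(blocks)[:length]

def xor_crypt(key: bytes, data: bytes) -> bytes:
    """XOR with the keystream; encrypting and decrypting are the same operation."""
    return bytes(a ^ b for a, b in zip(data, _keystream(key, len(data))))

# A PHI record copied off an FDE-protected laptop arrives as plaintext...
record = b"name: J. Doe, dob: 1970-01-01, medicaid-id: 12345"
# ...unless it is file-encrypted before it leaves the machine:
ciphertext = xor_crypt(b"per-file key", record)
print(ciphertext != record)                              # True: unreadable on the flash drive
print(xor_crypt(b"per-file key", ciphertext) == record)  # True: the key holder recovers it
```

    Had the MDCH flash drive carried only ciphertext like this, the theft would have fallen under HIPAA’s encryption safe harbor instead of triggering 2,595 notifications.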
