AlertBoot offers a cloud-based data and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

AlertBoot Endpoint Security

  • Smartphone Security: Thefts Doubled In 2013 And Other Stats

    According to Consumer Reports, 3.1 million smartphones were stolen in 2013, a significant increase from the 1.6 million smartphone thefts the previous year.  Despite the tremendous growth in purloined smart devices, the same report found that most people are not following even the most basic steps to protect the data on their mobile devices, such as turning on smartphone encryption.

    There are a number of things that smartphone users can do to ensure that nothing more goes awry once they're the victims of a theft.

    Number of Missing Smart Phones Waaaay Up

    In addition to the 3.1 million smartphones that were stolen, an additional 1.4 million were lost, for a total of 4.5 million missing devices.  Some may attribute this growth to an increase in the number of people using smartphones...and they would be right.

    According to a February 2014 post, approximately 66% of US consumers own smartphones, up from 44% in 2011.  Some rough calculations, assuming a linear rate of growth, indicate that the growth in smartphone users from 2013 to 2014 was around 12%.

    The increase in lost phones from 1.2 million to 1.4 million is about 16%, roughly in line with the growth in smartphone users.  The growth rate for stolen phones, however, is nearly 94%.  Obvious conclusion: thieves are out to get ya.
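    The back-of-the-envelope math above can be made explicit (a minimal sketch; the 2013 ownership figure is a linear interpolation between the cited 2011 and 2014 numbers, everything else comes straight from the figures quoted above):

```python
# Smartphone ownership: 44% of US consumers in 2011, 66% in early 2014.
own_2011, own_2014 = 44.0, 66.0
points_per_year = (own_2014 - own_2011) / 3   # ~7.3 percentage points/year, assuming linear growth
own_2013 = own_2014 - points_per_year         # ~58.7%

# Year-over-year growth rates, as percentages.
user_growth = (own_2014 - own_2013) / own_2013 * 100   # ~12.5%, i.e. "around 12%"
lost_growth = (1.4e6 - 1.2e6) / 1.2e6 * 100            # ~16.7%, in line with user growth
stolen_growth = (3.1e6 - 1.6e6) / 1.6e6 * 100          # ~93.8%, i.e. "nearly 94%"

print(f"users +{user_growth:.1f}%, lost +{lost_growth:.1f}%, stolen +{stolen_growth:.1f}%")
```

    Lost-phone growth tracks user growth almost exactly; the near-doubling of thefts does not, which is the point.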

    No Security Measures: 34%

    In addition to the above stats, the Consumer Report findings show that:
    • 34% did not use any security measures whatsoever
    • 36% used a 4-digit PIN
    • 11% used a PIN that's "longer than 4 digits, a password, or unlock pattern"
    • 7% used some other security feature, like encryption

    Seeing how mobile device encryption is useless without a password, I take it that the 7% figure also includes some form of password protection.  In other words, a total of 54% of smartphone users are securing their devices with at least the most basic data protection for mobile devices.

    Unfortunately, only the 7% of users who enabled encryption are fully protected: if encryption is not enabled on a smartphone (or a tablet, for that matter), reading its contents requires nothing more than connecting it to a computer.

    Still, the numbers show an improvement.  When I read a similar survey a year or so ago, I'm pretty sure the stats showed that only about 25% were using passwords on their devices.  Now the number has doubled, meaning that people are definitely more aware of the potential problems of not securing their mobile devices.

    As many smart phone TV ads note, these devices aren't only part of your life, they hold your life.  It's in every person's interest to be part of that group that's using encryption to protect their device's content.

  • Data Breach Cost: South Carolina Earmarking $27 Million For 2-Year-Old Hacking Incident

    South Carolina is in the news again for a data breach that occurred in 2012.  If you'll recall, that was the year when SC admitted that its tax collection department had suffered a data breach affecting 6.4 million people.  Two years later, the effects of that breach are still being felt.

    $20.5 Million Earmarked for Next Fiscal Year

    The one silver lining on government data breaches, if you can call it that, is that a lot of information is made public, giving us a better understanding of what an organization has to go through when it experiences a data mishap.  For example, figuring out the total cost of a data breach has always been like peering through frosted glass: some figures are revealed, but the whole picture remains hidden, either because it's really none of our business (how much a company is investing in security software) or because no one's interested anymore (what costs did a company incur in the third year after the data breach?).

    A lot of articles covering the South Carolina budget for next year are noting that at least $27 million is being earmarked for information security and credit monitoring.  Of this figure, about $6.5 million will be used for a third year of credit monitoring services.

    Of the remaining $20.5 million, $5.7 million is for a 12-person information security division and a 3-person privacy office; $6.1 million is for maintaining and expanding the division's services; and $8.7 million is for upgrades to current data security.  The bulk of the money, as you can see, is being spent on security operations that the state should have had before the 2012 data debacle.  Under the circumstances, it's impossible to say that, three years in, the ongoing cost of the data breach is $27 million.
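    For what it's worth, the reported line items do add up (a quick sanity check; figures in millions of dollars, taken straight from the earmarks above):

```python
# South Carolina's reported earmarks, in millions of dollars.
credit_monitoring = 6.5   # third year of credit monitoring services
infosec_division  = 5.7   # 12-person infosec division + 3-person privacy office
division_services = 6.1   # maintaining and expanding the division's services
security_upgrades = 8.7   # upgrades to current data security

total = credit_monitoring + infosec_division + division_services + security_upgrades
remainder = total - credit_monitoring

print(f"total: ${total:.1f}M, after credit monitoring: ${remainder:.1f}M")
# total: $27.0M, after credit monitoring: $20.5M
```

    The four items sum to exactly the "at least $27 million" headline figure, leaving $20.5 million once credit monitoring is set aside.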

    The only ongoing cost appears to be the roughly $6.5 million for credit monitoring, which is not exactly small change (and a dubious expense at that: Brian Krebs has described his first-hand experience with such services, and his conclusion was that they're not really worth it unless you get them for free).

    Other Goodies

    In addition to the extra money being spent to ensure that proper data security is in place, the South Carolina legislature is also making it mandatory for state agencies to implement information technology policies, and it is shoring up other weaknesses, such as not knowing whether IT security is actually being implemented.

    Of course, such things don't necessarily require state-level legislation.  It's a matter of getting the right tools for the job.  For example, the AlertBoot managed encryption solution uses a cloud-based console, making it easy to check up on endpoint deployments and installations from any web browser with an internet connection.



  • HIPAA Business Associate: Not Having A Written Agreement Is Grounds For Reporting A Data Breach

    When it comes to preventing HIPAA data breaches, one of the best ways of doing so is via the use of PHI encryption software.  However, there are so many aspects to the HIPAA Security Rule that sometimes it gets confusing.  For example, what happens if you violate one HIPAA rule while you have encryption in place?  Under most scenarios, you should be protected under the safe harbor clause.

    But the Berea College Health Services (BCHS) case shows that it may not be so simple.

    The Non Data Breach

    This is a relatively interesting data security violation.  Berea College in Kentucky has notified patients of BCHS that they were involved in a HIPAA breach.  Apparently, a billing contractor had gotten hold of and used BCHS patient information, exactly as intended.  It nevertheless counts as a data breach because there wasn't a written business associate agreement between the two:
    Although this contractor had access to medical records, including names, addresses, dates of births, insurance numbers, social security numbers, and diagnosis and treatment information, BCHS has no reason to believe that any patient information has been misused or disclosed inappropriately. We did not have a written agreement in place because BCHS failed to request it. The contractor has advised us that patient health information was used and disclosed only for BCHS billing and for no other purpose, and we have been assured that the contractor has returned to BCHS or destroyed any patient information that she might have accessed. Nevertheless, we are obligated to notify you of this issue.
    There is no reason to believe that there was any foul play involving PHI.  Indeed, if the notification letter is to be believed, the only transgression is the lack of a formal agreement.  I also noticed that no failure to encrypt PHI was mentioned, leading me to believe that everything was taken care of in that area.

    Lack of Agreement Trumps Safe Harbor?

    The HHS Office for Civil Rights has made it clear over the years: encrypt your data and you're protected (although there are certain caveats; for example, the encryption that was used must be something that NIST has approved or is likely to approve...although that last one is never a sure thing, making the former the only sure-fire option).

    Does the situation with BCHS mean that data encryption does not provide as much safe harbor as people are led to believe?  Or perhaps BCHS was being a little too cautious?  After all, there's nothing forbidding a covered entity from issuing a letter of apology even if they don't have to.

    My own conclusion is this: at the most fundamental level, BCHS has run into one of those caveats regarding encryption and safe harbor.

    You see, even if the data was sent to the business associate in encrypted form, and was stored in an encrypted format while she was working with the data, she accessed it.  She had to if she was going to work with the information.  But without a formal agreement, she was technically an unauthorized third party and shouldn't have had access to the information.

    In other words, encryption was breached.  Encryption safe harbor is a moot point if a hacker were to somehow gain access to encrypted data.  While BCHS is not dealing with a hacker, the lack of a formal agreement means that they were operating under a similar situation.

    The moral of this story?  Make sure all your t's are crossed and i's are dotted, literally as well as figuratively.



  • Data Breaches: UK ICO Declines To Investigate Supposed Santander Email Breach

    The Information Commissioner's Office in the United Kingdom has declined to investigate Santander, the Spanish banking group, over a purported data breach.  According to complaints, people who've set up email addresses that are strictly used for correspondence with Santander are being spammed with junk mail, lending credence to the theory that the bank's database was breached.

    The ICO, however, notes that there isn't enough evidence of a data breach.  Santander, for their part, have only stated that they are conducting an investigation into the allegations but have not uncovered a data breach to date.  That statement, however, was made back in December 2013.

    The Evidence

    It wasn't only Santander that was affected.  According to the same complaints, email addresses registered with the UK Government Gateway and NatWest FastPay were also affected.

    Some of the spam emails include the last name of the recipient, information that is not present in the email address itself, indicating that a database that ties an email address to personal information must have been breached.  (The other, unsubstantiated accusation is that the information was sold by the bank to third parties.)

    The Counterargument

    The problem, of course, is that none of this is smoking-gun evidence.  It's not unusual for people to set up a free email address with the intention of using it for one thing but to end up using it for something else.  (I do it myself; I, for one, don't appreciate the spam that comes from legitimate businesses I sign up with, and I'd rather keep my personal inbox uncluttered without having to set up filters and whatnot.)

    Then there is the possibility that a third party was breached.  Santander may not have sold the information to anyone, but EULAs tend to contain a clause allowing information to be shared with partners.  What if a partner was breached?  Of course, under most legal statutes, Santander would be on the hook, but still...Santander is not really the one that was breached.  If you're looking for a remedy, you won't find it by quizzing and prodding Santander.

    Last but not least, there is always the chance that the email provider was hacked.  Of course, this scenario is less likely under these specific circumstances seeing how all the complaints have one thing in common: Santander.

    Is the ICO Capitulating?

    Has the Commissioner decided to give Santander a break...or, worse, bowed to pressure?  I don't think so.  The evidence – a unique email address combined with a last name – is quite tenuous.  If that were enough to identify the "breachee," then an argument could be made that an IP address is enough to identify an internet user; we all know the latter is not quite right.  Neither is the former.



  • HIPAA Desktop Encryption: Sutherland Healthcare Solutions Breach Affects 340K, Reward Offered

    Sutherland Healthcare Solutions (SHS), a billing contractor for Los Angeles County, has offered a $25,000 reward for the return of computers stolen from its offices.  The data breach was initially reported as affecting approximately 170,000 people; that number has since been revised to 338,700.  All of this because HIPAA desktop encryption was not used to properly protect PHI.

    Eight Desktop Computers Stolen.  What About HIPAA?

    Previous reports on the SHS breach were vague on the details.  Further reporting two months down the line shows that the computers stolen from SHS offices are "computer towers," more specifically HP Pro 3400s.  According to the specs, the dimensions of this particular computer are 368 x 165 x 389 mm (14.5 x 6.5 x 15.3 inches), and it weighs a little under 16 pounds.  In other words, it's the size of a big encyclopedia volume.

    Installing HIPAA data encryption software is a cinch.  And, the use of data encryption provides safe harbor from HIPAA's Breach Notification Rule.  So, why were these computers not protected?

    The argument is often made that desktop computers do not need encryption because (a) HIPAA technically doesn't require the use of encryption and (b) desktop computers are not easily stolen.  Furthermore, the argument goes, such a theft would be incredibly easy to spot, allowing the breach to be stopped as it happened.

    Except that that is not how it usually unfolds.  The article covering the breach shows one man who's suspected of stealing the computers.  In the individual frames of the surveillance footage that were made available, he's holding a black bag that was undoubtedly used to ferry the desktops to and fro, one by one.

    He probably made eight trips, at least – earlier reports noted that computer monitors were also stolen – meaning that there were at least eight individual instances where, in theory, he could have been stopped.  Anecdote may not be proof, but instances where desktop computers are stolen from offices are so common that the myth of "desktop computers cannot be easily stolen" should die a fiery death.

    Is Encryption Really "Not Required"?

    Now that we've covered aspect (b) of the argument, let's turn our eyes to aspect (a) of the "desktop computers do not need encryption" argument.

    Is encryption really not required under HIPAA rules?  The technical answer is no.  Under the HIPAA Security Rule, the use of encryption is an "addressable" implementation specification, not a required one.  However, "addressable" here differs from the layperson's understanding of the word: under HIPAA it really means "it is required unless you can prove that something else will work just as well."

    Consider this answer from official guidance on whether encryption is mandatory under the Security Rule:
    No. The final Security Rule made the use of encryption an addressable implementation specification...and must therefore be implemented if, after a risk assessment, the entity has determined that the specification is a reasonable and appropriate safeguard in its risk management of the confidentiality, integrity and availability of e-PHI. If the entity decides that the addressable implementation specification is not reasonable and appropriate, it must document that determination and implement an equivalent alternative measure, presuming that the alternative is reasonable and appropriate. If the standard can otherwise be met, the covered entity may choose to not implement the implementation specification or any equivalent alternative measure and document the rationale for this decision.
    As you can see from the above, encryption is not required...but then you need to implement an "equivalent alternative measure" to secure the data.  People are confusing "encryption is not mandatory" with "data security is not mandatory."  The latter is required, the former is not...but, then again, encryption is required if one wants to take advantage of the safe harbor under the Breach Notification Rule, since only encryption and data destruction qualify.
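    The decision process in the quoted answer can be sketched as a small decision function (the function name and return strings are mine; this merely encodes the steps in the quoted guidance and is not an official HHS tool):

```python
def addressable_spec_decision(reasonable_and_appropriate: bool,
                              standard_met_otherwise: bool) -> str:
    """Encode the 'addressable' logic for a spec like encryption.

    If a risk assessment finds the specification reasonable and appropriate,
    it must be implemented.  Otherwise, the entity documents why and either
    relies on the standard being met some other way or implements an
    equivalent alternative measure.
    """
    if reasonable_and_appropriate:
        return "implement the specification (e.g. encryption)"
    if standard_met_otherwise:
        return "document the rationale; no specification or alternative needed"
    return "document the determination and implement an equivalent alternative measure"

print(addressable_spec_decision(True, False))
# implement the specification (e.g. encryption)
```

    Note that "not required" is only the first branch of the logic; every other branch still demands documentation plus some form of data security.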

  • Kentucky Data Breach Law Signed

    The number of US states that haven't enacted a data protection law has dropped to three.  Kentucky is the latest state to sign into law a bill aimed at protecting the personal data of Kentuckians.  Like many similar state laws, it grants safe harbor from reporting data breaches to consumers when data encryption is used.

    Safe Harbor, Personal Information Defined

    Like many state laws concerning data security and data privacy, the law makes exceptions for information protected with encryption software.  First, a "breach of the security system" is defined as:
    unauthorized acquisition of unencrypted and unredacted computerized data that compromises the security, confidentiality, or integrity of personally identifiable information maintained by the information holder as part of a database regarding multiple individuals that actually causes, or leads the information holder to reasonably believe has caused or will cause, identity theft or fraud against any resident of the Commonwealth of Kentucky
    The one twist I can immediately make out is that the law requires the breach to be directly linked to ID theft, or to lead the "information holder to reasonably believe" it will happen.  I can understand the need to set limits – after all, most data breaches fizzle out with nothing happening – but the latter requirement literally puts the fox in charge of the hen house.  Wouldn't it be in most information holders' interest to believe that ID theft is not in the cards when data is lost or stolen?

    Second, the law clearly states that the breach of unencrypted data will be followed with a notification "in the most expedient time possible and without unreasonable delay."  The logical conclusion is that information that is encrypted does not require a data breach notification (which is only natural, seeing how the breach of a security system has been defined).

    Student Data Also Protected

    Being at the tail-end of the breach legislation game has its own rewards.  The Kentucky legislature has made it a point to ensure that student data is protected.  Among other things, it is now illegal to "process student data for any purpose other than providing, improving, developing, or maintaining the integrity of its cloud computing service."

    This is no doubt directed at certain services that acknowledge data-mining student information for profit, financial or otherwise.

    No Breach Law, More Expensive Insurance Policies

    An interesting factoid that I learned while reading about Kentucky's data breach law:
    insurance companies were charging Kentuckians more for cyber-security policies in the absence of any state laws requiring such notification after incidents such as the Target and Neiman Marcus data breaches.
    I cannot even begin to fathom why this would be so, but apparently it's a thing.  Assuming this has a causal link with legislation, I guess this is another reason why the US should have a federal data breach law.
