AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

May 2012 - Posts

  • Smartphones and Medicine: Should Doctors Stop People From Taking Pictures?

    The proliferation of smartphones, tablets, and other ultraportable digital devices is, from a data security perspective, a bad thing.  It will inevitably lead to leaks and breaches of data, assuming proper BYOD security is not used -- and, in some cases, even if it is used.

    One question that the American Medical News is asking is "should doctors stop patients from taking smartphone pictures?"  This does not imply, by the way, that taking pictures with something other than a smartphone is OK (tablets have cameras, too, as do plain phones and, of course, actual cameras).

    Not a HIPAA Violation: Patients Breaching Privacy

    As the article points out, a ban on patients taking pictures is a tricky thing:

    If picture-taking is left unfettered, patients could feel violated and sense that a practice doesn’t take patient privacy seriously. On the other hand, if patients want to break out the smartphone for a few shots, is a practice just picking a fight by instituting a no-pictures policy?

    Plus, as the article points out, one patient violating another patient's privacy is...well, it's not a HIPAA violation.  Nor is it a violation of any state or federal law; at least, none that I know of.  Yet, "ultimately, practices are duty-bound to do all they can to create an environment that respects patients and their privacy," so some kind of arrangement must be reached.

    Some highlights of the article:

    • There is a real risk of pictures of patients being distributed.
    • The key to creating a "no photo zone" is allowing no exceptions.
    • There is a risk of a HIPAA violation: pictures could be distributed that show PHI.
    • Don't give patients access to your Wi-Fi.  If you're providing it, make sure it's separate from the network you and your staff use.

    Smartphones in the Workplace: Policies Required

    When people speak of protecting data in the age of smartphones and tablets, the discussion generally tends to veer towards technological solutions, like mobile antivirus, data compartmentalization, phone tracking, etc.

    However, an important component of data security still lies in creating usage policies.  This rather quaint and antiquated exercise is imperative because, among other things, it is (or at least, it should be) an analysis of where you need to secure data and why, and how you will achieve it.

    Related Articles and Sites:

  • Data Encryption: Two Of The Six Lasting Legacies Of 2006 VA Data Breach

    According to reports, the Veterans Affairs Department has gone from being an icon of incompetence to a "model for how to effectively integrate tough safeguards into its daily operations."  Those reports list six positive "lasting effects" of the 2006 VA data breach, which affected 26.5 million veterans.  Two of the six slots are reserved for the use of encryption, specifically including laptop encryption software like AlertBoot.

    Best Practices at VA

    In an interview, Roger Baker, the VA's CIO, listed the following as "among the best practices...[that the] VA has established to shore up its information security protections" (my emphases):

    1. VA has an independent privacy breach analysis team made up of legal, technology, business and privacy officers who examine each incident that is reported to Congress, how it was handled and what else can be done to prevent it in the future;

    2. VA encourages reporting of near-misses, a technique learned from NASA, without repercussions unless it was egregious or violated laws in order to fix problems before they become bigger;

    3. Transparency on data breaches helps to drive employee training because they have read about it in the press, and they don’t do it anymore;

    4. All VA laptops are encrypted;

    5. Personal data does not flow outside the VA unless it’s encrypted according to the latest federal information processing standard from the National Institute of Standards and Technology (NIST);

    6. VA CIO reports daily to the VA secretary about any information protection incidents.

    You'll notice that points 4 and 5 involve encryption.  Seeing as the 2006 data breach was triggered by the loss of a laptop computer and an external hard disk, it shouldn't come as a surprise that encryption features prominently as a security measure.  But the use of encryption software is not mere window-dressing for placating critics.  As long as laptop computers are being used, and as long as employees are authorized to take those same laptops home, disk encryption will be the solution that prevents a sizable chunk of potential data breaches.

    You should also notice that the rest of the points actually concern best practices in safeguarding data, such as running an analysis of weak points and ensuring that employees are trained and updated on security issues.  I especially like point #2.

    The VA's Come A Long Way

    The Veterans Affairs Department should be congratulated.  It did take a while, but it finally got there.  Along the way, I learned quite a bit covering their progress.

    For example, it took the VA approximately five years to encrypt all of its laptops.  Things were complicated by the fact that the VA is not one organization situated in one building (there were geographic boundaries to cover), plus the usual set of complications, like incompatible computer hardware specs -- something a solution like AlertBoot managed encryption software would fix in no time, since we deploy the encryption software via the web using a centralized cloud-based console, and the solution automatically checks for incompatibilities before attempting the installation.
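    As an illustration of that last point, here is a minimal sketch of what a pre-install compatibility check might look like.  To be clear, AlertBoot's actual checks are not public; the OS list, disk-space threshold, and function names below are all made up for the example.

```python
import shutil
import platform

# Hypothetical pre-flight check before pushing disk encryption to an
# endpoint: verify the OS and free disk space, and skip incompatible
# machines instead of failing mid-install.  (Illustrative only; not
# AlertBoot's actual logic, which is not public.)

MIN_FREE_BYTES = 2 * 1024**3          # assume the installer needs ~2 GB free
SUPPORTED_SYSTEMS = {"Windows", "Linux", "Darwin"}

def preflight_check(path="/"):
    """Return (ok, reasons); ok is False if any incompatibility is found."""
    reasons = []
    system = platform.system()
    if system not in SUPPORTED_SYSTEMS:
        reasons.append(f"unsupported OS: {system}")
    free = shutil.disk_usage(path).free
    if free < MIN_FREE_BYTES:
        reasons.append(f"only {free // 1024**2} MB free, need 2048 MB")
    return (not reasons, reasons)

ok, reasons = preflight_check()
print("proceed with encryption install" if ok else f"skip: {reasons}")
```

    Running a check like this fleet-wide, before any installer touches a disk, is what turns "some laptops are incompatible" from a five-year slog into a report.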

    On the whole, it didn't look like it ought to have taken half a decade.  But it dragged out for five years because certain laptops used with medical applications were incompatible with disk encryption.  Plus, there was the unusual situation where contractors to the VA refused to use encryption (not just a handful of them, but 578).

    The VA breach was also one of the first cases I know of where a lawsuit was filed  (for $20 million) and settled.

    The Lesson

    Coulda, woulda, shoulda: don't get caught with your pants down when a data breach hits you.  Learn from the mistakes of others.  Prepare for a data breach, not only by having a battle plan -- who does the contacting when it happens, who gets contacted, etc. -- but by putting up the proper defenses.

    Following the six best practices listed above is a pretty good way to get started.

    Related Articles and Sites:

  • Data Protection: What Does Happen To Customer Data When Startups Fail?

    I came across an article titled "Dismantling A Dream: What Happens When Startups Fail."  Basically, the writer wonders "what happens to all my data when a startup fails?" and finds out.  The answer turns out to be: it depends.  Thankfully, there are some whose heads are screwed on right.  There are others whose aren't.  I particularly feel sorry for one entrepreneur who apparently uses "password protection" instead of data encryption software to safeguard data (or rather, I feel sorry for his customers).
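    For readers unsure why "password protection" earns the scare quotes, here is a toy sketch of the difference, using nothing but Python's standard library.  The XOR keystream below is NOT production cryptography (real products use vetted ciphers like AES); it only illustrates that encryption transforms the stored bytes, while a password gate leaves them readable.  All names and parameters are made up for the example.

```python
import hashlib

secret = b"alice@example.com, bob@example.com"

# "Password protection": the data itself is untouched; an application
# merely refuses to show it unless the password matches.  Anyone who
# reads the raw bytes (say, off a lost hard drive) sees everything.
def password_gate(data, password, attempt):
    return data if attempt == password else None

# Encryption: the stored bytes are transformed with a key derived from
# the password, so the raw bytes are gibberish without it.  (Toy sketch:
# fixed salt, XOR keystream -- do not use for real data.)
def toy_encrypt(data, password):
    key = hashlib.pbkdf2_hmac("sha256", password, b"salt", 100_000, len(data))
    return bytes(d ^ k for d, k in zip(data, key))

stored_plain = secret                       # what "password protection" stores
stored_encrypted = toy_encrypt(secret, b"hunter2")

print(secret in stored_plain)               # the raw bytes expose the data
print(secret in stored_encrypted)           # gibberish without the key
```

    Running toy_encrypt a second time with the same password undoes the XOR and recovers the data; without the password, the stored bytes are useless -- which is exactly what you want on a lost laptop.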

    Three Interviews, Two Responses

    I'd recommend reading the article but essentially, the three interviewees had two different ways of taking care of customers' (self-submitted) information:

    • Interviewee #1: Keep it.  He stores all of his customers' email addresses and login information on his laptop, secured with password protection.  And by "password protection," I think he really does mean password protection, as opposed to mistakenly calling encryption software "password protection."

      He has also lost a hard drive with customers' data.  He just doesn't know where it is.  It's not mentioned whether the data was protected at all.

    • Interviewee #2: Keep it.  He plans on keeping his customers' data, just like interviewee #1.  His rationale?  "Why not"?  One hopes that he'll use adequate protection.

    • Interviewee #3: Destroy it.  "I think it's a violation of user privacy to continue storing email addresses if you tell users your service is shutting down...I sign up for a service, and if you're no longer providing that service, why should you keep my information?"

    And the Right Attitude Is....

    We may live in a world of gray, but some things are surprisingly black and white.  Interviewee #3 has it right.  You have to destroy that data.  It's the only ethical thing to do.  Customers didn't sign up with you, the entrepreneur.  They may have cheered you on, but it's the service they were really interested in.  When your project fizzles and pops, that's it.  Finished.  Kaput. Finito.  The tit for tat is over because there is no "tat."

    Plus, I don't know about others, but I for one would not appreciate it if an unknown service's introductory email was addressed to me with personal details or whatever.  What's the first thing I'm going to think?  That I got spammed...or that someone's database got hacked.  Hardly a salubrious way to commence a relationship.

    Furthermore, there is the issue of legality.  In the US, what interviewees #1 and #2 are doing is not illegal (at least, I don't think it is).  In other parts of the world, like Europe, it is illegal: data can only be collected with one's consent, and only for the stated reasons.  If an organization decides to share the information, it has to gain consent upfront or get it later on.  Here's an interesting question: how's a defunct company going to gain consent?  It doesn't exist anymore.

    Of course, you can get around these legal troubles by pursuing projects that are limited to only those countries where going bust doesn't mean having to destroy personal data.  The count of such countries is becoming smaller with each passing year, though.

    And you certainly don't want to "deal with it later."  That's essentially what Google did with their Street View for Google Maps -- vacuuming up data and deciding to deal with it later...allegedly -- and look where it's gotten them.

    Encrypt that Stuff

    If you're an entrepreneur whose projects involve the collection of people's data -- be it sensitive or not -- and you do end up having to fold and move on to your next megabucks project, do yourself and your past clients a favor: encrypt any data you've collected, assuming you're not going to destroy it.

    Related Articles and Sites:

  • Laptop Encryption Software: Senator Franken Wants It To Protect Medical Data

    Minnesota Senator Al Franken (an SNL alumnus) is considering legislation, at the state or federal level, that would require the encryption of laptops containing private medical information.  In other words, a solution like full disk encryption from AlertBoot.

    Consequence from Accretive Health Data Breach

    Various sources are reporting that Sen. Franken has expressed his interest in pursuing "legislation or federal regulations requiring encryption of all laptops containing private medical information" after he questioned executives from Accretive Health and Fairview Health Services.

    I've pointed out many times in the past that current legislation does not mandate the use of encryption software when it comes to securing sensitive medical data.  Even HIPAA, as amended by HITECH, only strongly recommends its use.

    In reality, HIPAA / HITECH mandates the use of encryption in all but name.  You'd think this would prompt everyone to use encryption, but no: give people some wiggle room and some will always try to squeeze through it.  Which is why the Department of Health and Human Services -- charged with enforcing HIPAA -- should just come out and make it mandatory.  I mean, why not take the ultimate logical step?

    Well, honestly, I can see how cost would be an issue, especially for the smaller organizations and private practitioners.  But, then, it's not the Department of Health, Human, and Hospital Finance Services, is it?

    Will It Help, Though?

    The problem with requiring the use of laptop encryption on all portable computers?  It's not a silver bullet:

    Sen. Franken asked numerous questions about the stolen laptop and other missing laptops reported by Accretive. All but one laptop was encrypted, Accretive replied, and that was due to the oversight of a single employee in its IT organization who has since been fired. Accretive has put into place new policies and procedures to insure redundancy to make certain all laptops are encrypted. [my emphasis]

    Of course, one has to wonder whether Accretive is telling the truth.  After all, honest companies don't get roasted by a Senator and earn the wrath of the state Attorney General.  On the other hand, verifying the veracity of the statement wouldn't be hard (at least, not with a solution like AlertBoot, where you get real-time laptop encryption status reports), so I can't imagine Accretive being less than forthright on this matter.

    On the other other hand, what are the chances that the one laptop that was not encrypted happened to be the one that got stolen?  (As my stats professor used to say: probably low, but not entirely impossible.)
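    To put rough numbers on that aside: if a fleet has n laptops, exactly one of them unencrypted, and a thief takes one laptop uniformly at random, the chance it's the unencrypted one is 1/n.  The fleet sizes below are hypothetical; Accretive's actual laptop count isn't given here.

```python
# Back-of-the-envelope odds for the aside above: one unencrypted laptop
# in a fleet of n, `stolen` laptops taken uniformly at random.  The fleet
# sizes are made up for illustration.
def odds_unencrypted_stolen(fleet_size, stolen=1):
    # P(the single unencrypted laptop is among the `stolen` taken)
    return stolen / fleet_size

for n in (100, 1_000, 10_000):
    print(f"fleet of {n:>6}: {odds_unencrypted_stolen(n):.4%}")
    # fleet of    100: 1.0000%
    # fleet of   1000: 0.1000%
    # fleet of  10000: 0.0100%
```

    Low odds per theft, as the professor said -- but multiply by enough thefts across enough organizations, and "the one unencrypted laptop got stolen" stops being surprising.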

    But, that's not the point.  The point is, a mandate that all medical laptops be protected with whole device encryption does not guarantee that data will be protected.  You can have mistakes like the one above or companies that outright ignore the law.

    And, yet, it's the only logical step to take.  Encryption is a de facto requirement under HIPAA.  And, while not a perfect weapon against data loss, the use of encryption does reduce data breaches: it's almost 100% effective when it comes to stolen or missing laptops, which account for over half of all data breaches reported to HHS that involve more than 500 people.

    P.S. - As an aside, does the Washington Examiner think this is a joke?  Why would their article on Sen. Franken's desire for mandatory laptop encryption pop up under "entertainment"?

    Related Articles and Sites:

  • Data Encryption Software: South Shore Hospital Pays $750K In MA, HIPAA Settlement

    Numerous sources are reporting that South Shore Hospital, based in Weymouth, MA, has settled a lawsuit brought by the Massachusetts Attorney General.  The settlement is being quoted as $750,000 but this figure is not quite correct: $275,000 of the total figure is a credit for security measures South Shore has already taken.

    The breach has mobilized South Shore to do something it should have done a long time ago: use data encryption like AlertBoot to secure all of its sensitive data.  I guess paying nearly $500,000 will prompt you to do that.

    A Little History

    In the summer of 2010, South Shore Hospital went public with the knowledge that they had suffered a data breach.  While details were not as forthcoming then, we can now summarize the events as follows.  Previous posts on the South Shore breach can be found here, here, and here.

    South Shore contracted Archive Data Solutions to erase and sell 473 backup tapes.  The hospital failed to mention what was on those tapes; had the nature of the data been brought up, it could possibly have prompted the contractor to ask that the contents be encrypted, although this is merely speculation on my part.

    Archive Data Solutions in turn subcontracted the work to a firm in Texas.  However, the subcontractor never received the full shipment -- only one box out of three was received.  The courier company that was charged with delivering the boxes suggests that the missing two boxes of tapes were buried in a landfill, as per the courier's disposal policies.

    The hospital's inability to obtain certificates of destruction eventually led Archive Data Solutions to admit to the breach.

    The information on the tapes pertained to patients as well as employees, physicians, volunteers, donors, vendors, and other business partners -- 800,000 people in total.

    "Little to No Risk that Information...Could be Acquired"

    These are the words that South Shore used when filing the breach with the Massachusetts AG.  In light of the settlement, it's a fun little bag of mixed messages.  After all, the hospital did just agree to settle a lawsuit for $750,000 -- not chump change.

    The $750K breaks down as a $250,000 civil penalty; $225,000 destined for a fund promoting education in the protection of personal information and protected health information; and a consent agreement crediting $275,000 for security measures South Shore has taken since the breach.

    You might be asking: what security measures?  Well, for one:

    Since the breach, "we've actually put in a great deal of new measures to protect personal information," said hospital spokeswoman Sarah Darcy. "Everything — everything — is encrypted now."

    It's sad how people prioritize things: had South Shore put in the effort to encrypt everything, EVERYTHING, from the beginning, it wouldn't be suffering the effects of the data breach now.

    Related Articles and Sites:

  • Smartphone Scam: Fake Angry Birds Result in £50,000 Fine

    The majority of malware attacks on smartphones come in the form of spyware and SMS trojans.  A UK company was fined £50,000 (approximately $78,000) for dabbling in the latter.  One key aspect of smartphone data protection is to download apps from reputable sources only.

    Secretly Spoofed Apps Send Premium SMS

    According to reports, a company uploaded trojans to app markets.  These trojans took the form of popular apps like Angry Birds and Cut the Rope.  Unlike the legitimate apps, once downloaded to a smartphone, they secretly ran code in the background.

    Each victim's phone sent three premium SMS messages at £5 each (about $7).  1,391 UK citizens were affected (a reported £27,850, or about $43,500, in charges), but it is estimated that there were over 14,000 downloads worldwide.

    The fraud was put to a stop by PhonepayPlus, the UK's premium-rate phone service regulator.

    SMS Trojan and Spyware Account for Majority of Attacks

    According to the Juniper Networks 2011 Mobile Threats Report, the majority of mobile threats come as spyware and SMS trojans.  There are other threats, too, such as fake installers, browser-based threats (such as a drive-by infection, where all a user has to do is visit the site), connectivity hacks, and the traditional data breach source: loss or theft of a mobile device.

    But spyware and trojans account for half of known attacks, and of the two, SMS trojans are a sure-fire way of getting paid for one's illicit activities (spyware requires a second step, where the hacker takes the stolen data and tries to monetize it).

    How does one protect oneself from such threats?  Seeing as the fake apps look no different from their legitimate siblings, it's not easy.  The use of mobile antivirus is one way.  Another is to forgo downloading apps from third-party sites.

    If the goal is to minimize the risk of a data breach and achieve a high level of data security, though, you also need to pay attention to other threats, including the potential for your smartphone (or tablet) to be stolen.  Ensure that encryption is activated and that your password is strong.  And don't leave your smartphone lying around unsecured, such as in an unlocked gym locker.
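    "Strong password" can be made slightly less hand-wavy with a back-of-the-envelope entropy estimate: for a randomly chosen password, bits of entropy ≈ length × log2(size of the character pool).  The 60-bit cutoff below is illustrative, not an official standard, and the estimate badly overstates the strength of human-chosen passwords.

```python
import math
import string

# Rough entropy estimate for a *randomly chosen* password:
# bits = length * log2(size of the character pool).  Human-chosen
# passwords are far weaker than this estimate suggests.  The 60-bit
# threshold is an illustrative cutoff, not an official standard.
def estimate_entropy_bits(password):
    pool = 0
    if any(c in string.ascii_lowercase for c in password): pool += 26
    if any(c in string.ascii_uppercase for c in password): pool += 26
    if any(c in string.digits for c in password):          pool += 10
    if any(c in string.punctuation for c in password):     pool += 32
    return len(password) * math.log2(pool) if pool else 0.0

for pw in ("1234", "Tr0ub4dor&3"):
    bits = estimate_entropy_bits(pw)
    print(f"{pw!r}: ~{bits:.0f} bits {'(weak)' if bits < 60 else '(ok)'}")
```

    A short PIN gives a thief very little to brute-force through; a long, mixed-character password, paired with device encryption, is what actually keeps the data on a stolen phone out of reach.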

    Related Articles and Sites:
