
AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

May 2011 - Posts

  • Laptop Encryption Software: Spartanburg Regional Hospital Announces Laptop Theft

    Spartanburg Regional Hospital has begun notifying clients that the theft of a laptop from an employee's car has triggered a data breach.  The laptop was not secured with disk encryption software like AlertBoot.

    Laptop Encryption not Used?

    According to a statement by an executive vice-president at the hospital,

    "On March 29, 2011, we were informed that a computer was stolen from an employee's car the previous night," he wrote. "The employee was authorized to have possession of the computer. We have reported this to the proper authorities and an investigation is ongoing." [, my emphasis]

    It was further noted that the computer was password-protected but there is no news on whether encryption software was used to protect the contents of the laptop.  However, the fact that:

    1) the hospital has begun notifying their patients; and,
    2) it has done so close to the 60-day mark since it found out about the data breach

    leads me to believe that they may not have used encryption.  That's terrible, because the laptop contained "names, addresses, dates of birth, medical billing codes, and Social Security numbers."  This is not the type of data you want to authorize your employees to carry around on an unprotected laptop computer.

    60-Day Limit per HIPAA / HITECH

    This is what was reported regarding the breach:

    Its [sic] important to note, the hospital says they have no reason to believe that any information has been misused. They sent out a notice however, to be proactive.

    To be proactive?  That's debatable.  Such words imply that the hospital sent out the notifications...shall we say, out of concern for the patients; however, they are required to do so under the latest HIPAA and HITECH rules whether they want to be proactive or not (again, assuming encryption was not used; if it was used, then Spartanburg can justify the claim).

    According to the "Breach Notification Interim Final Regulation" of August 2009, HIPAA-covered entities have up to 60 calendar days beginning from the discovery of the breach to notify patients of a data incident.  Based on this yardstick, one could claim that they've been anything but proactive: how else would you classify waiting until the last possible moment, before the regulating authorities start getting involved?
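    For what it's worth, the 60-day arithmetic is easy to check.  Here's a quick sketch (the discovery date is from the article; everything else is just calendar math):

```python
from datetime import date, timedelta

# The Breach Notification Interim Final Regulation (45 CFR 164.404) gives
# HIPAA-covered entities up to 60 calendar days from discovery of a breach
# to notify affected individuals.
discovery = date(2011, 3, 29)              # the hospital learns of the theft
deadline = discovery + timedelta(days=60)  # last day to send notifications

print(deadline)  # 2011-05-28
```

    Notifications going out in late May, then, really are right up against the wire.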

    Taking a Softer Stance

    Perhaps Spartanburg is not to blame.  At least, not for the lack of encryption software on the laptop.  The reality is that HIPAA / HITECH doesn't mandate the use of encryption, as I already noted earlier this month.  In other words, the use of encryption is not required, just really strongly encouraged.

    As I mused two weeks ago, it's probably because the term "protected health information" is overly broad: it includes not only truly sensitive information like billing data (Medicare numbers, SSNs, credit card numbers, insurance account numbers, etc.) and medical information (prescriptions, mental evaluations, STD test results, etc.), but also much less sensitive material.

    A broad term means hospitals have to use their judgment in deciding what should be encrypted and what shouldn't.  Of course, I'm not sure how you could come to the conclusion that SSNs weren't important enough to encrypt...but, hey, why can't you have the rules basically state: if you're storing SSNs on a portable device, you must use encryption?  It's not as if there are less sensitive SSNs or non-sensitive SSNs, and the rules are already heavily tipped towards the use of encryption!

    If the rules took a stronger stance, there would also be less room for interpretation, which (in theory) means you wouldn't have ridiculous breaches like the above.

    Related Articles and Sites:

  • Notebook Computer Encryption Software: Exclusive Footwear Laptop Stolen...During Business Hours

    A shoe boutique in the UK has made an appeal to thieves, asking for the return of an aged notebook computer that was used as a customer database.  The main concern in this case doesn't appear to be data security -- which could have been easily resolved by the judicious use of laptop encryption software like AlertBoot -- but the fact that the data was not backed up.

    Computer Stolen from Busy Store

    A 17-inch Toshiba laptop was stolen from Exclusive Footwear, a shoe store based out of Gillygate, York (UK).  The laptop computer contained client details for the store's on-line business.  It's not mentioned what type of details were included, although one assumes that the usual information for client relationship management would be present: names, addresses, phone numbers, email addresses, and possibly dates of birth.

    The device had not been backed up for the past three months, meaning that any customer information that had been collected during that time is now lost.  And, my understanding of boutiques is that they specialize in particular items and styles, so the loss of three months' worth of potential business is quite significant (unlike, say, a supermarket, where chances are that the customers will come back again, soon).

    The owner has offered a reward for the safe return of the device -- or, at least, of the hard drive with the client data; although, truth be told, the data could come back in any form: CD, DVD, email attachment, etc. -- which is old and distinctively cracked.  And heavy, although that's to be expected from a 17-incher (makes one wonder why anyone would steal it to begin with).

    Gone in 15 Minutes

    According to the owner, she left the shop in the charge of a colleague during a busy afternoon.  Fifteen minutes later, the colleague noticed that the device had disappeared.

    Such thefts are not uncommon.  What's unusual about the case, though, is that someone stole it while there were other people present, and that the device was heavy.  I myself own a 17-inch Toshiba, and I can tell you that there is nothing "portable" about this particular notebook computer.  It's about as portable as one of those leather-bound unabridged dictionaries from yore.

    Regardless, it just highlights the fact that people are willing to steal anything that's not fixed to the ground.  The case also offers lessons in the importance of backing up data (always recommended -- a cloud-based solution might be the ticket here) and using encryption software to ensure that sensitive information is not divulged to the wrong people.

    The information in this case is not acutely sensitive or personal.  However, if the wrong criminals are involved, it could mean more than lost data to Exclusive Footwear.

    For example, how can they be sure that the thieves don't have hacker friends who will use the email addresses in the client records (admittedly, an assumption on my part that these exist on the stolen laptop) for spamming purposes?  Not just regular spam, but socially-engineered spam that leads to a Trojan for stealing further data?  After all, the thieves know that everyone in the database is, or has shown interest in being, a client of the boutique shop, and they can use that to their advantage.

    Sometimes, it's not just sensitive data that needs to be secured, but secondary data that could act as vectors.

    Related Articles and Sites:

  • Data Encryption: Skype Encryption Cracked? Kinda

    Today I found an entry with the headline "Chapel Hill Computational Linguists Crack Skype Calls."  An introduction to the story claims that computational linguistics has been "used to crack Skype encryption."  Is this true?  Well, your mileage may vary, but after reading all I could about the situation, I'd have to say "yes, it is true."

    At the same time, it's also true that the encryption used in Skype communications -- AES-256, also used to power AlertBoot's laptop encryption software -- remains intact and unbroken.

    At this point, you might be scratching your head and saying, "huh?"  Some of you might even be reminded of Schrodinger's cat, he of dual life-states until the box is opened.  The situation is quite easy to explain.  It's a matter of terminology.

    Skype Cracked, AES-256 Unaffected

    Skype makes use of AES-256 to encrypt its calls.  Skype encryption has been cracked; and yet, I claim that AES-256 remains unbroken.  What gives?

    Well, it's the way that Skype's encryption was broken.  What the computational linguists have done is best summed up thusly:

    The simple description is: By looking at the size of the encrypted data packets you can guess what phonemes were spoken. Yes, that's all there is to it. They are just looking at how much data is sent and guessing what might be said that reasonably fits in that size. [Anonymous Coward]

    That's really the gist of it.  Phonemes are the building blocks of speech, if you weren't aware (I wasn't).

    To put it in another way, although Skype's transmissions are secured with encryption (in this case, AES-256), it's a moot point because the size of each encrypted data packet gives enough clues to figure out what's in that encrypted packet.
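    As a toy illustration of the principle, here's a sketch in Python.  The size-to-phoneme table is entirely invented (the real attack builds statistical models over the output of Skype's variable-bitrate codec), but the core idea is the same: encryption hides the bytes, not how many of them there are.

```python
# Toy length side channel. A variable-bitrate voice codec emits different
# frame sizes for different sounds, and encryption preserves those sizes,
# so an eavesdropper seeing only ciphertext lengths can rank likely phonemes.
# NOTE: this table is invented for illustration; it is not Skype's codec.
SIZE_TO_PHONEMES = {
    41: ["s", "f"],       # quiet fricatives compress into small frames
    53: ["a", "e", "o"],  # voiced vowels need more bits
    60: ["ay", "ow"],     # diphthongs, larger still
}

def guess_phonemes(ciphertext_sizes):
    """Guess one phoneme per packet from ciphertext lengths alone."""
    return [SIZE_TO_PHONEMES.get(n, ["?"])[0] for n in ciphertext_sizes]

print(guess_phonemes([41, 53, 60]))  # ['s', 'a', 'ay'] -- no key required
```

    Padding every packet to a fixed size would close the leak, at the cost of bandwidth -- presumably why variable-bitrate codecs don't do it by default.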

    So Encryption is Not Broken?

    Erm...not quite.  It's complicated, as this heated argument shows (possibly NSFW language).  It's a matter of how you want to define "encryption is broken."

    In Skype's case, it's not incorrect to say that the encryption is broken (and broken because, in hindsight, it was badly implemented), since the protected message's contents can be figured out from the scrambled message itself.  I mean, if you can consistently figure out the actual message from the encrypted message itself, that's the definition of busted encryption.

    At the same time, the integrity of AES-256 is unaffected: another VoIP provider using the same cipher might not suffer from this issue at all.  So, the weakness lies with Skype (although, in its defense, most if not all VoIP services apparently suffer from the same problem).  Hence, the "dual state" where Skype encryption is broken and yet not broken (perhaps the explanation would be easier to comprehend if AES-256 had been broken; the world would be a sadder place for it, though).

    Of course, it's also false to say that Skype's encryption is broken when you think about it: the researchers found that their method is quite effective...when it works.  Consistency seems to be the key obstacle, but we all know how technology progresses, right?

    For the time being, I wouldn't worry about the privacy of my Skype conversations, although Skype has been given a very pressing reason to go back and check their software design.

    Related Articles and Sites:

  • Drive Encryption Software: Loyola University Medical Center Has Data Breach

    Loyola University Medical Center has begun notifying patients about the theft of a flashdrive from an employee's car.  The USB device did not make use of drive encryption, potentially exposing protected health information.

    Car Break-In

    The USB flashdrive was stolen from an employee's car, along with a number of other items.  The incident sounds like a smash-and-grab, which implies that the flashdrive was not the target of the break-in.  However, this does not mean that the contents of the flashdrive are safe from prying eyes.  One can argue, for example, that laptops are wiped clean of their data and sold as quickly as possible because thieves don't want to be caught with a hot item; but that's only true because laptops are visible to the naked eye.

    Who's going to be as concerned about a small item such as a flashdrive?  One might decide to keep it around in a drawer and wait to see what's on the thing.  And, when that happens, bonanza!  Because the missing flash device contained names, addresses, phone numbers, dates of birth, and Social Security numbers.

    It's not known how many patients were affected; the hospital declined to give out any particulars other than that fewer than 100 patients were involved.

    Legitimate Use?

    One of the recurring questions that I hear when medical information is breached is "what was my information doing on a thumbdrive?"  Well, the medical center has an answer for that:

    Loyola says employees need access to transplant patient information at all times. In a statement, a spokesman says: "We are reviewing our portable electronic device policy and re-educating employees about securing information. We also are assisting the affected patients to protect against any possible unauthorized use of their information." []

    I can see the argument there.  If you suddenly need to do a transplant, having the information with you always beats waiting for it to download, or to be sent from some central repository, etc.  On the other hand, why's an SSN necessary for a transplant?  It might be necessary for billing purposes, but for the actual transplant?

    That doesn't make sense at all.  I'd be grateful if anyone out there could shed light on why SSNs need to be present for transplant surgery.  For example, do doctors use SSNs as another identifier to ensure that they're operating on the correct patient?

    HIPAA / HITECH Breach Even Before It Was Stolen

    Another thing that doesn't make sense: even without the breach incident, carrying around protected health information (PHI) on a flash drive that was not protected with encryption software is a clear violation of HIPAA.

    I mean, how can you justify compliance with the Security Rule under such circumstances?  You've got a violation of the Physical Safeguards section (access control); a violation of the Technical Safeguards section (access control, person authentication, data security); and possibly a violation of the Administrative Safeguards section.

    I guess you could argue that the USB memory device belonged to someone, so that takes care of the access control and authentication (only the owner will access it); and that the person is an ex-Navy SEAL, so that takes care of the data security...but would the argument stick when audited?

    If flashdrive encryption had been used, though, there would be no HIPAA violation.  For starters, data security is ensured because only those with the correct password would be able to access the information, which takes care of access control as well.  So, despite not having proper "physical security" (it's a USB drive, after all), a hospital would be in compliance with HIPAA.  I imagine the Administrative Safeguards requirement is also met, because I'm assuming the hospital would have given the password to the right people only.
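    To make the access-control point concrete, here is a minimal sketch of the key-derivation step behind password-based encryption (illustrative only -- real managed encryption products do far more than this): the AES-256 key is derived from the passphrase, so whoever lacks the passphrase simply has no key.

```python
import hashlib, hmac, os

# The salt is stored in the clear next to the ciphertext; it isn't secret.
salt = os.urandom(16)

def derive_key(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA256, 200k iterations -> 32 bytes, i.e. an AES-256 key.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 200_000)

good = derive_key("correct horse battery staple", salt)
bad = derive_key("password123", salt)

print(len(good))                       # 32 bytes = 256 bits
print(hmac.compare_digest(good, bad))  # False: wrong passphrase, wrong key
```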

    I just don't see what there is to "review" on Loyola's part; things should have been in place since day one.  If it's necessary for transplant information to be on an employee's body at all times -- say, as a hospital policy -- then you also have to come up with ways to ensure ePHI is secure in those instances as well (hint: encryption).  You can't just shove the responsibility onto employees by educating them and calling it a day.  Given enough time, things are bound to get lost.  Or stolen.

    Related Articles and Sites:,0,6973521.story

  • Data Encryption Software: Nevada Non-Profits Not Required To Comply With NRS 603A? Of Course They Are Required To!

    I heard a very disturbing assertion yesterday.  Apparently, some time ago, an offer was made by yours truly's company, AlertBoot: a non-profit in Nevada was presented with free deployment of our disk encryption program.  The offer was turned down, and we were told that they didn't need it because non-profits don't need to comply with NRS 603A -- i.e., what's called by many the Nevada Data Breach Notification Law.

    When I heard of this, I responded with a "uhh....that doesn't sound right."  After all, if a bank has my SSN and a non-profit also has my SSN...well, it's the same SSN.  How is the breach of my info from one organization less risky than from the other one?  If anything, I'd say that a breach from a non-profit is riskier because, if someone stole something from a non-profit, I have to assume this person has even less moral fiber than your average thief.

    Anyhow, I did some searching on-line, and I can see why people would make that mistake regarding non-profits.  But things clear up quite nicely once you read the actual law.  That notwithstanding, in the course of my research I ran across this disclaimer, and thought it might be a good idea to throw it in here:

    Disclaimer: The below codes may not be the most recent version. Nevada may have more current or accurate information. We make no warranties or guarantees about the accuracy, completeness, or adequacy of the information contained on this site or the information linked to on the state site. Please check official sources.

    Nevada Businesses Need to Use Encryption!  Erm, Not Quite

    A couple of things must be cleared up before we go on.  Many articles on-line point out how NRS 603A (and its flawed, now-replaced predecessor NRS 597.970) requires encryption of all customer data.  There are many things wrong with that statement, and it doesn't take a lawyer to figure them out.  For example, I'm not a lawyer, and I can see them (this also means that the following is not legal advice, etc.).

    First off, reading NRS 603A clearly shows that the use of encryption software is only required for:

    • Credit card payment information (i.e., adherence to PCI DSS).
    • Personal Information that is sent electronically via a non-voice medium, except for faxes (in other words, if you rattle off someone's SSN over the phone to the wrong person or send it via fax, technically that's not a breach per NRS 603A).
    • Personal Information in electronic format that is moved "beyond the logical or physical controls of the data collector or its data storage contractor."  That's an actual quote from the law books.  In other words, if you store personal data on a laptop that's glued to your store's counter, you don't need to use encryption.

    Notice, by the way, the use of "data collector."  We'll come back to that later.

    My point here is: a business does not necessarily need to use encryption.  For example, if all of your customer info is written down on a notebook -- and hence it's not computerized / electronic data -- there's no need to encrypt.

    Also, the implication seems to be that you don't need to encrypt electronic data on your computer as long as it's never taken out of your business venue in any way whatsoever.  I guess the assumption is that it will always be safe within your business (a lame assumption: insider data breaches are the fastest-growing type of information theft in the US).

    (As a non-lawyer, I don't know what happens if under bullet #3, an unencrypted laptop -- never meant to go beyond the data controller's (i.e., your) control -- gets stolen by a thief.  I'd assume, based on the purpose of the law, that it would be labeled a data breach.)
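    My (non-lawyer) reading of the three bullets above can be sketched as a simple decision rule.  The function and its parameter names are my own invention, not anything from the statute, and this is not legal advice:

```python
# The three NRS 603A encryption triggers, as I read them (not legal advice).

def encryption_required(card_payment_data: bool,
                        transmitted_electronically: bool,
                        leaves_collectors_control: bool) -> bool:
    """True if any of the three triggers applies to the data in question."""
    return (card_payment_data
            or transmitted_electronically
            or leaves_collectors_control)

# A laptop glued to the counter, data never transmitted: no trigger applies.
print(encryption_required(False, False, False))  # False
# The same records copied to a flash drive an employee takes home:
print(encryption_required(False, False, True))   # True
```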

    A second point of confusion: people think the law is about encrypting customer data.  It's not.  It's about personal information, which is defined as follows:

    NRS 603A.040  "Personal information" defined.  "Personal information" means a natural person’s first name or first initial and last name in combination with any one or more of the following data elements, when the name and data elements are not encrypted:

    1. Social security number.
    2. Driver’s license number or identification card number.
    3. Account number, credit card number or debit card number, in combination with any required security code, access code or password that would permit access to the person’s financial account.

    The term does not include the last four digits of a social security number or publicly available information that is lawfully made available to the general public.
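    The definition above can be sketched as a predicate -- my own illustrative encoding, with invented parameter names, and again not legal advice:

```python
# NRS 603A.040's "personal information," encoded as a predicate.
# Illustrative only; parameter names are my own.

def is_personal_information(has_name: bool, encrypted: bool,
                            full_ssn: bool = False,
                            drivers_license: bool = False,
                            account_plus_access_code: bool = False) -> bool:
    # Encrypted name/data elements fall outside the definition entirely.
    if encrypted or not has_name:
        return False
    return full_ssn or drivers_license or account_plus_access_code

print(is_personal_information(True, False, full_ssn=True))  # True
print(is_personal_information(True, True, full_ssn=True))   # False (encrypted)
print(is_personal_information(True, False))                 # False (name alone)
```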

    Whether that information belongs to your customer, your employees, your contractors and other outside parties...none of that matters.  Did you lose a flash drive containing the unencrypted names and SSNs of the janitors that clean your business's bathrooms?  Well, those janitors are not customers, but you're in breach of NRS 603A.

    Another point of confusion: only businesses need to encrypt, which brings us back to whether non-profits -- which are not classified as businesses in Nevada (NRS 76.020 Definition of Business in the State of Nevada) -- need to comply with NRS 603A.

    Well, it's a matter of following the trail:

    NRS 603A.020  "Breach of the security of the system data" defined.  "Breach of the security of the system data" means unauthorized acquisition of computerized data that materially compromises the security, confidentiality or integrity of personal information maintained by the data collector.

    Data collector!  There it is again!  What is a data collector?

    NRS 603A.030  "Data collector" defined.  "Data collector" means any governmental agency, institution of higher education, corporation, financial institution or retail operator or any other type of business entity or association that, for any purpose, whether by automated collection or otherwise, handles, collects, disseminates or otherwise deals with nonpublic personal information.

    So, it looks like businesses are definitely covered under the law.  But what about non-profits?  There is no direct mention of non-profits in the definition of a data collector.  However, a governmental agency is definitely not a business; it's arguably a "non-profit," your stance on taxes notwithstanding.  Also, institutions of higher education are largely set up as non-profit organizations.

    So, why are some non-profits included but not all?

    A Non-Profit Corporation in Nevada is Still a Corporation

    Ah, but all non-profits are included.  Under NRS 82.016, a corporation is defined as:

    NRS 82.016  "Corporation" defined.  Unless the context otherwise requires, "corporation" means a corporation organized or governed by this chapter.

    Chapter 82 governs -- wait for it -- Nonprofit Corporations.

    There you have it: a non-profit is a corporation, which can be a data collector, which is required to follow NRS 603A.  I say "can be a data collector" because if you don't collect data or save it in electronic / computerized format, you don't have to comply.  Of course, this is true for for-profit organizations as well, which just reinforces the fact that, when it comes to data breaches, a non-profit is held as accountable as any regular company.

    As I alluded to at the beginning: data is data, and a breach is a breach.  Why would a non-profit not need to comply with the personal information security law?  I mean, what's next, you run over someone by accident in the company car and argue that you shouldn't face manslaughter charges because you're a non-profit?

  • K-12 Data Encryption: Student Records Need To Be Better Protected

    Student information needs to be better protected, be it via data encryption software or something else.  It just doesn't make sense to keep things the way they are.  It just doesn't.

    Criminals are Targeting Student Information

    I came across another ID theft ring story where student info was exclusively used for committing fraud, and, to be honest, I finally snapped.  How many such stories do we have to come across before something is done?

    According to, two women in Memphis, TN were arrested for identity theft trafficking.  The two are accused of stealing the names, dates of birth, and SSNs of more than 350 Memphis City Schools (MCS) students (although the term "accused" gives them more leeway than they deserve: the "stuff" was on the counter when police broke into their house).  It is currently unknown how they obtained the information, but we do know the two women filed fraudulent tax returns, netting hundreds of thousands of dollars.

    Special Agent in Charge Rick Harlow suggested that parents keep an eye on their children's credit reports; however, this may not be the best advice.  According to an investigation by the Today Show on NBC, checking your kids' credit reports on a regular basis will prompt credit bureaus to create reports on the children, increasing the risk of something going wrong (see the embedded video in the article; the article and video are an eye-opener on the subject of children's ID theft and worth a read).

    There have been many other cases involving students' IDs, of course.  A short list of instances covered by this blog:

    The breaches involve CDs, external hard disks, USB sticks, etc, proving that data breaches come in all sizes and forms.  There are more, of course -- it's just that I haven't covered them all, and of those that I have covered, I've declined to include university data breaches.  In all the cases that I've read where K-12 students are involved, none of the schools has ever admitted that the target of a theft could have been the students' data.

    It's always, "hey, the thieves were probably targeting the laptop" or USB flash drive, or whatever.  If that's the case, how did the above two get and use student information exclusively?

    No Legal Obligation

    Is there a requirement to encrypt student data?  The answer is no, as I've already explained in a past post regarding K-12 schools and HIPAA: for a public school, the ruling law is FERPA, not HIPAA.  In fact, for those who are interested in the HIPAA aspects when it comes to K-12, the Joint Guidance on the Application of FERPA and HIPAA to Student Records has this to say:

    At the elementary or secondary school level, students’ immunization and other health records that are maintained by a school district or individual school, including a school-operated health clinic, that receives funds under any program administered by the U.S. Department of Education are “education records” subject to FERPA, including health and medical records maintained by a school nurse who is employed by or under contract with a school or school district.

    Some schools may receive a grant from a foundation or government agency to hire a nurse.   Notwithstanding the source of the funding, if the nurse is hired as a school official (or contractor), the records maintained by the nurse or clinic are “education records” subject to FERPA.

    Student records involve more than health information, though.  So, what does FERPA have to say about student record encryption?  To date, not much.  It will point out, for example, that sending information via unencrypted email is not recommended:

    The US Department of Education has determined that, in general, communication between faculty and students over e-mail "is considered to be an insecure means of transmitting protected information under FERPA" unless some form of encryption is used... [email from US Department of Education to BYU, per ] (Thanks Google!)

    So, the DOE definitely has qualms about protected information (a most peculiar piece of terminology: it doesn't mean the information is actually protected, but that it needs to be protected) falling into the wrong hands.  At the same time, it doesn't make the use of encryption software a central requirement, unlike HIPAA/HITECH (or at least, if the DOE has something under FERPA regarding encryption, I haven't been able to find it so far).

    Of course, that doesn't prevent educational institutions from actually using encryption to protect student records.  A Google search of "FERPA encryption" returns numerous entries, some of them linked to policies regarding student information encryption.  However, you'll soon notice that these are at the university level -- which is understandable considering the criticism US colleges have received in the past five years or so, as one institution after another was forced to declare a data breach (and in some cases, more than once).

    But, K-12 needs encryption, too.  It's obvious that criminals are targeting their data -- especially because it could be years before anyone realizes anything is amiss -- and when solutions like AlertBoot encryption are readily available, it doesn't make sense not to use them.

    Thankfully, it looks like the Department of Education might address the issue:

    The NPRM [Notice of Proposed Rule Making] emphasizes that the State or local educational authority or an agency headed by an official listed in § 99.31(a)(3) is responsible for using reasonable methods to ensure that any entity designated as its authorized representative complies with FERPA.  The NPRM seeks input on how reasonable methods should be defined.  The Department intends to issue guidance on the best practices for written agreements, reasonable methods, and other related matters.

    Of course, a lot of the proposals within the NPRM are controversial, depending on one's viewpoint; in this particular case, however, I'd imagine that "encryption" would be regarded by many as a reasonable method of protecting student records.

    Related Articles and Sites:,0,630112.story
