AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

January 2014 - Posts

  • Laptop Encryption: Coca-Cola Data Breach Affects 74,000

    A recent data breach came to my attention, and while I normally criticize companies that cause these information security incidents, I cannot honestly bring myself to do that when it comes to Coca-Cola's situation.  It's one of those grey areas where you have to give the company the benefit of the doubt, even if it was caught with a laptop that wasn't secured with strong encryption software.


    The reason why I'm equivocal in joining other security experts in denouncing Coca-Cola's mishap lies in the following:
    Coke spokeswoman Ann Moore said the laptops were stolen by a former employee who had been assigned to maintain or dispose of equipment.
    This is one of the hardest situations a company can protect against.  It involves (1) an insider, (2) intent (to steal), and (3) equipment end-of-life.

    The problem with insiders is that, well, you have to trust them on some level.  A company can't treat all of their employees like criminals that need to be observed around the clock.  If it did, productivity would plummet, costs would rise, operating margins would crash, turnover would soar, people would be unhappy...  You need some controls, yes, but at the end of the day, it's mostly trust that you rely on.

    The problem with intent to steal is that, well, you generally can't stop people who want to get their hands on something.  I mean, really want to get their hands on it.  Plus, when your target happens to be an everyday good, there aren't significant barriers to someone looking to steal it.  You just have to bide your time.
    The problem with end-of-life equipment is that nobody really cares about such hardware.  It just gets locked up – if even that – and the only person who really pays any attention to it is the guy charged with disposing of it.  Combine it with the insider problem, and you've got a recipe for disaster.


    Which is why I have a problem blaming Coca-Cola for the data breach.  Yes, the machines were not protected with encryption software.  And, it appears that perhaps the laptops were unencrypted from start to finish, which is inexcusable...but understandable. (AlertBoot was built from the ground up with an integrated reporting engine, which is one way around the problem of security lapses.  But, other FDE solutions may have added the reporting and monitoring aspect as an afterthought, leading to a less robust monitoring module).

    But even if the machines had been encrypted at the beginning of their life as a Coca-Cola asset, would they have still been encrypted once they were tagged as having reached end-of-life?

    In my experience, here's what generally happens: a company purchases encryption licenses, which can be expensive.  They encrypt a laptop computer so they can be in compliance with any laws and regulations.  The laptop is used, and if everything goes well (i.e., doesn't go missing), the laptop is replaced a number of years later with a newer model.  The old laptop is prepared for disposal, meaning that the machine is decrypted in order to retrieve the expensive encryption license, which is used to encrypt the new laptop.

    How does this help prevent laptop thefts by trusted employees who work in the equipment disposal department?  Generally, it doesn't.

    (One way to get around this problem is to use something like AlertBoot's full disk encryption.  You can activate the data wipe and ask us to retrieve the license.  This ensures that the laptop remains encrypted and inaccessible, and you get to keep your license, too).

    Security is a Process

    While Coca-Cola wouldn't provide details on how it figured out who had taken the laptops – or how it realized that the laptops were missing at all – it doesn't take a genius to figure out what happened.

    At the end of the day, in such situations, the only way to prevent (or recover from) a data breach is to keep an accurate log of the equipment and its parts, and to perform physical audits until the equipment is actually disposed of.

    Coca-Cola was able to figure out that hardware meant for disposal was in fact stolen; this is evidence that the company has pretty sophisticated computer security policies in place.

    What they are guilty of, it appears, is that they don't have perfect security.  Not many companies have that, though.
  • Laptop Disk Encryption: Why FDE Beats Device Tracking When Things Hit The Fan

    Why encrypt a device, like a laptop, when you can track it?  One blogger (writing under a nom de web, not his real name), who's traveling across the world and working out of his backpack, can attest to why: sometimes, tracking doesn't work as advertised.  There are a couple of reasons for this.  Disk encryption like AlertBoot, on the other hand, works all the time – as long as you've made the decision to install it, that is.

    The Irony (and the Creepy)

    In a post titled "How I went from 100 to 0 things (or how I was robbed of all my stuff)," he makes the following ironic observation: during the nine months that he traveled all over the world, he never had anything stolen from him.  It took one week for his stuff to be stolen once he returned home to Holland.

    The entry – horrific and insightful – shows how he had to deal with the ramifications of a home burglary.  To make a long story short: as a person whose life is intricately tied to the digital realm, it took him a while to ensure that his personal information would not become public information.

    Among the many insights he had (such as needing 12 hours to reset the passwords on all the online accounts he had), this one is of note for this blog:
    I had Apple iCloud's "Find My Device" and Prey enabled on both my iPhone and MacBook in case they were stolen, but to no avail. Thieves aren't born yesterday. They know they shouldn't connect to WiFi, thereby making it impossible for your device to alert iCloud or Prey of its location. Not a single report came in. They're good services, but if the thieves are smart the odds you're getting anything back are slim.
    I agree, they're good services.  But, at the end of the day, they require that the computer be connected to the internet in one way or another.  The problem with such a strategy is that thieves are becoming ever-aware of these services and purposefully ensure that devices do not connect to the net.

    Methods to prevent a device from appearing on the internet may involve taking out the battery, shutting down the device, or placing it in a Faraday cage – also known as layers of aluminum foil (my personal experiments show that three layers are more than good enough).  Another method is to boot up the device where there are no Wi-Fi spots, such as in the middle of nowhere or in a basement.

    What About Encryption?

    Encryption, on the other hand, is ideal in such situations.  No, it will not help you recover the device.  Encryption is not magic.  However, it will prevent someone from accessing the contents of your laptop and other digital devices.

  • Disk Encryption In University Settings: U Of MN Law Professor's Laptop Stolen

    The use of laptop encryption in academic settings is important for myriad reasons.  You may be dealing with sensitive, personal data.  You may be working on a project that involves intellectual property or classified information.  If you're a college professor, your students may have an interest in accessing your computer somehow.

    As noted, the reasons are myriad.  Sometimes, not using encryption software can have serious consequences.

    U of Minnesota Laptop Stolen

    A law professor's laptop was stolen from the University of Minnesota.  The laptop theft affected approximately 300 people:
    A prominent juvenile justice scholar was collecting data from closed case records for a study on law enforcement interrogation techniques when the laptop, a scanner and external hard drive were taken last February.
    You read that right.  Last February.  As in February 2013.  Letters alerting the affected of the data breach went out earlier this month.  Why did it take so long?  The list of people to contact had to be recreated from the original data source.

    (As an aside, this is why data security laws usually require that periodic data backups be made.  For example, HIPAA requires that breach notifications be sent within 60 calendar days of becoming aware of the data breach.  This is only possible if you have a backup you can analyze.  It's nearly impossible otherwise).

    The information was quite sensitive:
    The records stem from 2005 criminal cases involving murder, rape and aggravated robbery in the state's two most populous counties. Much of the data was previously considered public, but some would have been kept private if Feld hadn't obtained special access for his study. He said his research assistants were six weeks into scanning records from archived files when the theft occurred. They hadn't begun analyzing the data, which complicated efforts to determine the scope of the breach.

    Why Not Use Encryption?

    This is what I don't understand.  Why was encryption not used?  The information was obviously sensitive.  Plus, as a legal scholar, the professor must have known about data breach laws and ethics involving the disclosure of such information.  And, yet, here we are.

    My personal guess is that the professor and other researchers felt pretty secure in their environment.  I never did understand that attitude; my own college and grad school experience plainly showed me that academic buildings are not exactly Fort Knox.

    It doesn't matter in what kind of environment you're working (unless that environment was specifically designed for security).  If you have any sensitive data on a laptop, chances are that you will need to encrypt it.
  • Data Security: 2013's Worst Password Is 123456

    What is the worst password?  According to the annual "worst passwords" ranking, the top worst password of 2013 is 123456.  The list's compiler goes on to note that it finally "dethroned" perennial worst-password winner, password.  Familiar entries, for those who follow this kind of stuff, comprise the rest of the top 25 list.  A hint: if you're using data encryption software, please refrain from using any of these as your password.

    It'd be like getting all the latest security technology for your home, only to leave the master key under the welcome mat.

    Adobe Breach Affects Results

    The biggest breach in 2013 was the Adobe breach.  Like a tidal wave rushing the shore and decimating everything that comes before it, the breach affected the worst password results.  Here's that list: 
    1. 123456
    2. password
    3. 12345678
    4. qwerty
    5. abc123
    6. 123456789
    7. 111111
    8. 1234567
    9. iloveyou
    10. adobe123
    11. 123123
    12. admin
    13. 1234567890
    14. letmein
    15. photoshop
    16. 1234
    17. monkey
    18. shadow
    19. sunshine
    20. 12345
    21. password1
    22. princess
    23. azerty
    24. trustno1
    25. 000000

    As you can see, references to Adobe and its software offerings are peppered throughout the list.  (This is not unexpected.  When RockYou had its data breach, rockyou was one of the top ten passwords).

    Perennial favorites like iloveyou (#9), password1 (#21), and trustno1 (#24) also were present.

    Also notice the presence of azerty (#23), which is a weird entry for people accustomed to a US keyboard layout, but not so much for some European residents (azerty is the qwerty of French-language keyboards).

    As Easy as 1, 2, 3

    Then there are the numbers: 123456 (#1), 123456789 (#6), 1234567890 (#13), 1234 (#16), and 12345 (#20).  Also, 000000 (#25), but I exclude it because it breaks the 1234 pattern.  Why do I bring this up?

    I'd say that these passwords are actually one and the same, and reflect something else: the minimum password limits that different websites place on their users.  A string of consecutive numbers is the easiest password you can get, after all.  Password length requirement is 6 characters?  123456 is your password.  At least 8 characters are required?  12345678 is your password.  And so on.

    We have to assume that the Adobe hack must weigh heavily on the results, but it looks like most passwords are required to be at least six characters in length (#1, #6, and #13 in my sample.  The list of 25 shows the top 15 to be at least 6 characters in length with the exception of #12, admin).
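    If you line those numeric entries up, the pattern is easy to verify: each one is just the ascending digit run truncated at a different minimum length.  A few lines of Python make the check explicit (a quick sketch; the list below is copied from the top-25 table above):

```python
# Numeric entries from the top-25 worst-passwords list above
# (ranks 1, 3, 6, 8, 13, 16, 20).
numeric_entries = ["123456", "12345678", "123456789",
                   "1234567", "1234567890", "1234", "12345"]

def is_consecutive_prefix(pw):
    """True if pw is just '1234567890' truncated at some length."""
    return pw != "" and "1234567890".startswith(pw)

# Every numeric entry fits the pattern...
assert all(is_consecutive_prefix(p) for p in numeric_entries)
# ...except 000000 (#25), which is why it was excluded above.
assert not is_consecutive_prefix("000000")
```

    In other words, they're all "the same" password, stretched to meet whatever minimum length a given site enforces.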

    Kind of makes one wonder who's allowing passwords that are shorter than 6 characters in this day and age.  It was only in 2010 that researchers showed 12-character passwords to be the minimum for acceptable security.  Four years later, you can bet that passwords need to be much longer now.


  • Disk Encryption: Customer Notices Security Gaffe, Prompts ISP To Investigate

    There are many reasons to properly protect data on laptops.  Among them is this overlooked one: a customer might catch you being less than secure, something that I'd never given consideration to before.

    On the other hand, is that so surprising?  After years of hearing how companies and government organizations have lost people's sensitive data, it's only natural that people will take an interest in data security issues.

    Hull's #1 Telco Involved

    The story involves one Chris Hill who signed up with KC, an ISP that's apparently the dominant service provider in the region of Hull, UK.  An "engineer" showed up at Hill's home and,
    He used a laptop to connect to the router and as he came to the user ID and password for my connection he opened a spreadsheet and looked my phone number up in it. There was my user ID and password, in plain text, along with everyone else's. He tried to shield it from me when he realised I was looking at the list.
    Scandalous?  Maybe.  This is the thing:  We're talking about a guy (the engineer) who already has access to this data as part of his work.  If he doesn't have that data, he can't complete his job; so, he has to have it.  Thus, the fact that he's able to access it is also not wrong.  Furthermore, that he has a list of them is also (possibly) not wrong.  Maybe he has to visit five locations and do the same job over and over.  Or maybe his work schedule has been prepared for the entire week.  That the passwords were visible in plain text?

    Also Not Wrong

    That's right, it's not wrong that the engineer had the list showing in plain text.  I mean, the guy's a person, for Chrissakes!  Are we to assume that he can read the password in encrypted form?  Of course he had it in plain text!

    The real question, from a security standpoint, is this: Was encryption software used to protect the spreadsheet that contained the list?  If not, was the computer protected with laptop encryption software?  (The answer to the latter appears to be "yes").

    Based on the story I've read, it almost appears that Mr. Hill was taking exception to the presence of plain text files.  Storing passwords in plain text form is bad – a bad idea, a bad practice, a bad policy, a bad etc.; however, that doesn't mean that passwords must always stay encrypted.  That's stupid.  It's like finding out that the safest car is the one that's not moving, so you buy a car and keep it parked forever.  That's not how things work.

    If someone has to hand you the password (the user at home), it's going to have to be in plain text.  If someone has to program the password into a device, it's going to have to be in plain text.  The presence of plain text is not necessarily a data breach.
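    The point can be made concrete with a toy sketch: the same data can sit encrypted at rest and still have to exist in plain text at the moment it's used.  The "cipher" below is a deliberately simplistic XOR keystream, for illustration only – it is not real cryptography, and the key and sample credentials are made up:

```python
import hashlib

def keystream(key):
    """Toy keystream built from chained SHA-256 blocks. Illustrative only;
    do NOT use this in place of a real cipher."""
    block = key
    while True:
        block = hashlib.sha256(block).digest()
        yield from block  # yields one byte (int) at a time

def xor_crypt(data, key):
    """XOR data against the keystream; the same call locks and unlocks."""
    return bytes(b ^ k for b, k in zip(data, keystream(key)))

secret = b"user: chris, password: hunter2"  # made-up sample credentials
locked = xor_crypt(secret, b"field-laptop-key")    # at rest: unreadable
assert locked != secret
unlocked = xor_crypt(locked, b"field-laptop-key")  # in use: plain text again
assert unlocked == secret
```

    The engineer's spreadsheet could live on disk in the "locked" state and still be perfectly readable – in plain text – at the moment he needs it for the job.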

    It Happens to FDE, Too

    Consider full disk encryption, a tool that is relied upon by doctors, lawyers, accountants, and other people who make it their business to collect and work with personal data, and thus risk a data breach on a daily basis.  Full disk encryption (FDE) is a marvelous, simple tool; however, it does have a weakness.  It's not active while you're using a computer.

    While you're working on your computer (sending an email, playing a game, checking Facebook), the contents of the computer are, for all practical purposes, not encrypted.  And rightly so.  Like our engineer above, you're unable to read or work with encrypted data (at least, I suppose you're not able to).

    The moment the laptop is shut off, however, the FDE goes into "active" mode again.  This includes instances where the battery runs out or where a computer running off a power outlet gets unplugged.

    So, there is a window of opportunity (or risk, depending on how you look at it) where data could be breached even if you are using FDE.  But how else would you have it?  Would you never turn on your computer?  Or not use FDE?  (That's a terrible idea, by the way).


  • BYOD Security: Starbucks Mobile App Saves Passwords In Clear Text

    What is the one thing we've learned about data security in the past three years?  There have been many lessons (such as that smartphone encryption is a good thing; same goes for laptops), but I'd say the one definitive thing is this: you do not store passwords as clear text.


    So, it's mighty surprising to find that one of the most beloved apps in the US is doing exactly that.  The app in question?  The ubiquitous Starbucks mobile app.
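    For comparison, the widely accepted alternative to storing a password as clear text is to store a salted, slow hash of it and recompute the hash whenever the password is entered.  Here's a minimal sketch using Python's standard library (the function names, iteration count, and sample password are illustrative; this is not a claim about how Starbucks' systems work):

```python
import hashlib
import hmac
import os

def hash_password(password, salt=None):
    """Derive a slow, salted hash; store (salt, digest), never the password."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("latte-lover")       # made-up sample password
assert verify_password("latte-lover", salt, digest)
assert not verify_password("mocha-lover", salt, digest)
```

    Note the trade-off, though: a hash only lets you check a password, not retrieve it, which is exactly why an app that wants to replay your password for you is tempted to keep the real thing around.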

    Convenience v. Security

    A security researcher found that the Starbucks app stored usernames, email addresses, and passwords in clear text within a smartphone.  The decision to do so, it is presumed, was due to convenience:
    The issue appears to be an example of convenience trumping security. One of the reasons for the Starbucks mobile app's popularity is its extreme ease of use. Customers need only enter their password once when activating the payment portion of the app and then use the app to make unlimited purchases without having to key in the password or username again. (Only when adding money to the app is the password required.)

    Starbucks could have chosen not to store the password on the phone, but users would then be forced to key in their username and password every time they wanted to use the app to make a purchase.
    To a person working in the security arena, this is scandalous.  But, as a consumer of coffee – mermaid-branded or otherwise – I can see how typing in a password each time you want to drink coffee would be a hassle and then some.

    Plus, think about credit cards: they don't exactly come with passwords on them.  The numbers are on the face for everyone to see.  So, one almost feels like Starbucks can be excused for choosing convenience over security.

    Almost.  Because there are significant differences.  First, the Starbucks app does come with an email address and a password.  If you lose a Yahoo! branded credit card, the thief won't be able to break into your email account.  The same is not true for the Starbucks app.

    Second, a credit card comes with fraud protection, limiting losses if a thief decides to start using your card.  I'm pretty certain that this is not true of the Starbucks app.  If someone decides to clear out your Starbucks account by going on a coffee binge, you're losing 100% of that money.  Furthermore, losses could be even greater if the auto-replenish option is turned on, as the article goes on to point out.

    Give the People a Choice

    So, it sounds like Starbucks was caught between a rock and a hard place.  On the one hand, there are security concerns, which are important.  On the other, the app wouldn't have been as successful if the company made it "hard to open up the wallet," if you will.

    What to do?  It seems to me that the answer is quite simple and staring them in the face: give the people a choice.  (And Starbucks knows choice.  This is a company whose coffee offerings reach 87,000 different combinations.)

    Those who don't want to bother with the extra security can use the Starbucks app like they've always done, knowing that they're at risk.  Those who don't mind punching the virtual keyboard a bit and prefer the extra security (and peace of mind) get their way, too.

    Of course, implementing a two-prong approach would make things more complex, but Starbucks has a $56 billion market cap.  They can afford an engineer or two who can make this a reality.
