AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size that want a scalable and easy-to-deploy solution. Centrally managed through a web-based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, and USB drive and hard disk encryption managed services.

December 2009 - Posts

  • Data Encryption Software: RockYou Sued After Data Breach

    RockYou, a developer of online applications and services--such as slideshow apps and games for Web 2.0 sites like Facebook and MySpace--was named in a class action lawsuit this past Monday.  Among the complaints is the allegation that RockYou did not use encryption like AlertBoot to safeguard personal information against data breaches.

    Will RockYou have a fighting chance against this lawsuit?  Unfortunately, I'm not a lawyer, so I can't offer a legal opinion on the matter; but I do know from past cases that suits brought against companies get dismissed if actual harm cannot be established.  And someone merely having your information is not considered harmful in its own right.  It's like how you can't jail a serial killer just for having a knife in his possession; he actually has to commit a crime with said knife.

    Likewise, it must be established by people whose information was breached that they suffered some kind of crime/harm: having to pay off credit card bills, for example, because someone opened credit cards in their name and painted the town red with that plastic.

    Of course, it must also be established that the identity theft stemmed from the breach at RockYou, a very difficult thing to do.  I mean, data breaches occur left and right.  How can the plaintiffs prove a direct link between RockYou and some crime involving their personal information?

    The most certain way would be to catch the criminals; trace the chain of events, such as the buying of the personal information, all the way to the hackers themselves; and then see if it leads back to RockYou.  The odds of this happening are nil.

    I think there may be a flaw in the suit itself, though:

    The lawsuit alleges that RockYou maintained its customers’ email account and password information, as well as the login credentials for social networking sites, in an unencrypted and unsecured database.  As a result, according to the lawsuit, hackers were able to harvest all of this information by utilizing a well-known and easy-to-prevent exploit. [my emphasis]

    Like I said, I'm not a lawyer.  However, I have read (too) many legal documents, including legislation and state and federal bills, and I have never come across instances where e-mail addresses, passwords, and usernames are considered to be "sensitive personal information."

    Of course, this case could end up being the watershed moment when such information is considered to be sensitive personal information.  I wouldn't count on it though.


    Related Articles and Sites:
    http://www.databreaches.net/?p=9196
    http://www.prnewschannel.com/absolutenm/templates/?z=0&a=2046

     
  • PHIPA Encryption Checklist

    The factsheet published by the Information and Privacy Commissioner of Ontario (May 2007) contains a checklist on the use of encryption for health information, as well as a detailed explanation of the different types of encryption that can be used to secure data.

    I thought I'd go over the checklist and make some comments (you know, from a data security perspective.  What can I say?  It gets kind of slow towards the end of the year).

    Encryption Checklist According To Ontario Information and Privacy Commissioner's Fact Sheet

    • I have minimized the amount of PHI that I have on portable devices (preferably none in identifiable form).

      Always recommended, even if you end up using encryption on your data or computer.  Why?  Well, there's always the chance that the username and password for accessing the protected contents are on display somewhere...like stuck to the bottom of the laptop.  (Happens more often than you think.)

      In such a case, having the information redacted would prevent a data breach.  The problem, though, is that unless you're into clinical research, you probably need to tie any medical conditions and histories to some kind of identifier.  I mean, technically, if you had the information redacted so there's no sensitive information, there would be no need for encryption software.

    • I delete PHI from all portable devices as soon as I have finished working with it.

      Same reasons as the above.  You can't have a breach of something that's not there.

    • I know what PHI is stored on each of my portable devices.

      Most people have a pretty good idea, but they're never certain.  The general theory is to encrypt only those machines that contain sensitive data (in this case, patient health information), and to make sure sensitive data doesn't find its way onto unencrypted machines.

      Theory diverges from reality, though, and the truth is that it's impossible to tell what type of data is on what machine.  In this day and age, it just makes sense to encrypt any portable device, assuming it's technically feasible, regardless of whether you "know" PHI is stored on it or not.

      This way, the ramifications of losing a device are pretty much nixed.

    • I have enabled my operating system encryption.

      Say what?  What's "operating system encryption"?  The factsheet seems to be referring to built-in encryption.  For example, under Windows XP, you can encrypt the contents of a folder/directory.  That would be helpful, except for the fact that, as the factsheet has pointed out:

      But while these options are easy to use, because they rely on the user’s login password, they provide only limited protection and are insufficient, in and of themselves, for the safeguarding of PHI. They are vulnerable in that if a person gains access to the user’s password, they will then have access to the data.

      Also, depending on what version of the software you're using--be it the operating system or an application with built-in encryption--it may be the case that the encryption offered is not up to snuff.  Some will say it's better than nothing.  But the same argument could be made by someone handing you a stick when you're facing a rabid grizzly bear.  (You'd probably feel better with some real protection.)

    • I have purchased a system with whole disk encryption. OR I have purchased software to implement whole disk or virtual disk encryption on my laptop or PDA.

      No comment on this one.  Well, except, perhaps, that it's not enough to have purchased the encryption package.  You also have to use it--something that, I can tell you from experience, doesn't always happen.  That's right, folks: there are companies out there that will get themselves an encryption license and not use it.

    • If I use portable storage devices like USB keys, I buy them with encryption installed, or install encryption on them before I use them to store PHI.

      Same as the above point, except they've included the caveat that you've got to install it.  (Does the Privacy Commissioner view USB keys as more critical breach sources than laptops?)

      You can encrypt information that's already saved to a USB device; however, that's like setting up a bank and collecting deposits first, and having a bank vault delivered for installation afterwards.

    • If I use a password to access encrypted data, it is a strong password AND it is different than the password that I use to login to my computer.

      Ah, strong passwords.  I've covered some password issues in an earlier post.  Basically, the need for strong passwords lies in the fact that, since encryption itself is very strong, the weak link is the password: even with the best encryption in the world, if the password for accessing that encrypted content happens to be "iloveyou"...well, let me tell you that hackers will be all over that data.  (There's a short, illustrative check sketched out at the end of this checklist.)

      Generally, you want different passwords for everything.

    • I never write my password down.

      It's not always a bad idea to write down passwords.  There is a school of thought that says you can write them down--just don't keep them close to where you use them.  For example, if you write down the password for accessing an encrypted laptop, don't keep that password anywhere near your desk (and definitely don't stick it to the laptop).

      This is especially true for systems that are used so seldom that a person never gets the chance to memorize the password through constant use.

      I know, the recommendation is to make something long and complex that can be easily memorized.  But, if it can be easily memorized, it's probably not long and complex enough....  You just can't win!

    • I do not share my password with anyone.

      Self-explanatory.  However, quite often not followed.

    • If I don’t use whole disk encryption, I can identify where ALL of the PHI on my system is stored.

      No you can't....  I guess the point is, well, if you're not going to use whole disk encryption, then you should at least keep track of PHI and make sure it doesn't end up on unsecured devices.  This, to me, is just another way of saying use disk encryption: It's virtually impossible to keep track of where sensitive data ends up in this modern age of ours.

      Will some people be able to follow the above?  Sure; think of general practitioners who see maybe five patients a year and keep everything in a spiral notebook.  But in modern practices, keeping track of all information (note how they capitalized ALL) is just not possible.

    • I only store PHI on the encrypted disk.

      Self-explanatory.  However, you must remember that encryption is not a magic bullet.  For example, even if PHI is always stored on encrypted disks--possibly because only encrypted disks are used--if passwords are shared, or written down and taped to the devices in question, encryption is not going to provide any data security.

    • I regularly verify or audit that my encryption policies are, in fact, being implemented and followed.

      If you've read all of my comments above, you'll see why this is probably the most important item in the checklist.
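
    Since weak passwords came up more than once in that checklist, here is the short sketch I promised under the strong-password item.  It's a bare-bones, purely illustrative Python check of the sort an organization might run before accepting a passphrase for an encrypted device; the minimum length, the character-class rule, and the tiny blacklist are my own assumptions, not anything mandated by the Commissioner's factsheet.

        # Illustrative only: a bare-bones passphrase check.  The length threshold,
        # character-class rule, and blacklist are arbitrary assumptions, not
        # requirements from PHIPA or the Commissioner's factsheet.
        import string

        COMMON_PASSWORDS = {"iloveyou", "password", "123456", "qwerty"}  # tiny sample list

        def is_reasonable_passphrase(candidate: str, min_length: int = 12) -> bool:
            if candidate.lower() in COMMON_PASSWORDS:
                return False          # known-bad passwords are out, period
            if len(candidate) < min_length:
                return False          # too short to resist guessing
            classes = [
                any(c.islower() for c in candidate),
                any(c.isupper() for c in candidate),
                any(c.isdigit() for c in candidate),
                any(c in string.punctuation for c in candidate),
            ]
            return sum(classes) >= 3  # require at least three character classes

        if __name__ == "__main__":
            for pw in ("iloveyou", "Winter2009", "correct-Horse-7-battery"):
                print(pw, "->", is_reasonable_passphrase(pw))

    None of this replaces the last item on the list, by the way: you still have to audit that people actually use something like it.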


    Related Articles and Sites:
    http://www.ipc.on.ca/images/Resources/up-fact_12e.pdf

     
  • Data Security: PSU Alerts 30,000 Of Data Breach

    Penn State University has a potential data breach on their hands--and not because of a stolen computer that lacked drive encryption software such as AlertBoot.  In this case, it's malware that's causing the (potential) problem.

    Letters Sent Out December 23

    On December 23, databreaches.net posted news that Penn State was sending out data breach notification letters to approximately 30,000 people.  Malware infections were found at the Eberly College of Science (7,758 records); the College of Health and Human Development (6,827 records); and one of Penn State’s campuses outside of University Park (roughly 15,000 records).

    It was noted that the letters were being sent out as compliance measures under the Pennsylvania Breach of Personal Information Notification Act.

    According to a story published today in the Pittsburgh Tribune-Review, the compromised sensitive data consisted of people's SSNs, although it's not certain whether the data was actually accessed.

    It sounds like PSU has decided to err on the side of caution and assume that the presence of malware and SSNs on the same computer was not a good thing.  PSU is still in the process of determining whether any data was actually exposed (for all we know, the malware could have been used to direct DDoS attacks against a particular network, and not designed for scraping information).

    Data Security Involves More Than Encryption

    As I frequently note, encryption is not a panacea when it comes to data security; the theft of laptops and USB disks is not the only way of illicitly acquiring data.  That being said, file encryption would have helped in this case:

    "The Social Security numbers were in archived files that people didn't realize were on their computers," said Mountz [spokeswoman for Penn State]. She did not know the types of computers that housed the data.

    When you don't know where your files are, how do you protect them?  That's a trick question.  The answer is to know where your files are, and you won't have to ask yourself the above.  This, however, is easier said than done.

    The truth is that it's virtually impossible to keep track of sensitive data.  That's why sensitive files ought to be encrypted.  This way, even if they end up somewhere where they shouldn't be, access to the information is restricted.
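
    For what it's worth, "knowing where your files are" can be partially automated.  Below is a minimal, hypothetical sketch--not anything Penn State or AlertBoot actually runs--that walks a directory tree and flags files containing strings shaped like US Social Security numbers.  The starting folder, the size cutoff, and the pattern are all assumptions for illustration; a real data-discovery tool would be far more careful about false positives.

        # Hypothetical sketch: flag files that appear to contain SSN-like strings.
        # The starting path, size cutoff, and regex are illustrative assumptions.
        import os
        import re

        SSN_PATTERN = re.compile(rb"\b\d{3}-\d{2}-\d{4}\b")   # e.g. 123-45-6789
        MAX_BYTES = 5 * 1024 * 1024                           # skip very large files

        def find_possible_ssns(root: str):
            hits = []
            for dirpath, _dirnames, filenames in os.walk(root):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        if os.path.getsize(path) > MAX_BYTES:
                            continue
                        with open(path, "rb") as handle:
                            if SSN_PATTERN.search(handle.read()):
                                hits.append(path)
                    except OSError:
                        continue                              # unreadable file: skip it
            return hits

        if __name__ == "__main__":
            for flagged in find_possible_ssns(os.path.expanduser("~/Documents")):
                print("possible SSN data:", flagged)

    Even then, a scan like this only tells you where the problem files are today; encryption is what protects them tomorrow.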

    Of course, sometimes it's impossible to encrypt files, not because of a technical limitation, but because it disrupts the flow of work.  Then again, not running with scissors also seemed to disrupt the flow of work back when my biggest hazard was not eating glue....


    Related Articles and Sites:
    http://www.pittsburghlive.com/x/pittsburghtrib/news/education/s_659851.html
    http://www.databreaches.net/?p=9042

     
  • Data Encryption On Cell Phones Broken?

    While you won't have to do it often, there will probably be a point in your lifetime when you've got to upgrade your data encryption software.  And, I emphasize, you won't have to do it often--perhaps never at all--but when it is required, you will have to do it...or suffer the consequences.

    It seems to me that some people just don't get the message, though.

    GSM Encryption Effectively Broken

    Mr. Karsten Nohl (Dr. Karsten Nohl?  The guy does have a Ph.D., after all) has announced to the world that the widely used encryption behind GSM, the A5/1 algorithm, is effectively broken.

    People have taken their stands: the industry notes that, as of right now, it's just a theoretical possibility, while security experts say it's more than your regular theoretical possibility.  They're not saying it's a practical possibility--yet.  One of them has noted that companies should "assume that within six months their organizations will be at risk."

    Who's right?  More likely than not, the security experts who are being cynical.  However, the point is kinda moot.  There's already a more secure encryption method called the A5/3, a 128-bit successor to the 64-bit code powering the A5/1.  It's just a matter of using it, something that hasn't happened because most cellular network operators have declined to make the investment (hm...isn't that beginning to sound familiar?  I'll explore that in the next section).

    Based on the New York Times story, it sounds like a rainbow table attack is being used in this case, where tables of precomputed information seriously cut down on the time required to gain illegal access to protected information.
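
    To make the "precomputed tables" idea concrete, here's a toy Python sketch.  Be warned that it builds a plain lookup table rather than a true rainbow table (real rainbow tables compress the storage with hash chains and reduction functions), and it inverts MD5 over a tiny keyspace purely for illustration--A5/1 is a stream cipher, not a hash--so treat it as an analogy for the time-memory tradeoff, nothing more.

        # Toy illustration of the time-memory tradeoff behind precomputed-table attacks.
        # A plain lookup table over a tiny keyspace; MD5 stands in for whatever
        # function the attacker wants to invert.  Not a true rainbow table.
        import hashlib
        from itertools import product

        ALPHABET = "abc123"
        KEY_LENGTH = 4

        def digest(key: str) -> str:
            return hashlib.md5(key.encode()).hexdigest()

        # Spend the time and memory up front, once...
        table = {digest("".join(chars)): "".join(chars)
                 for chars in product(ALPHABET, repeat=KEY_LENGTH)}

        # ...so that each individual "crack" afterwards is a near-instant lookup.
        def crack(observed_digest: str):
            return table.get(observed_digest)

        if __name__ == "__main__":
            target = digest("ab12")
            print("recovered key:", crack(target))   # prints: recovered key: ab12

    Scale that idea up to terabytes of tables and distributed computing, and you can see why "theoretical" attacks have a habit of becoming practical ones.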

    An entry in Wikipedia under GSM refers to an announcement by Pico Computing--back in February 2008--that such an attack was possible.  Think about it: a second announcement which, I assume, includes nearly two more years of research?  This issue has grown wings and flown out of the theoretical realm for good.  People should be worried here.

    Denial, Denial, Denial...That's What Happened In The TJX Fiasco, Too

    The cellular industry has decided to come out with its denial guns blazing:

    “This is theoretically possible but practically unlikely,” said Claire Cranton, a GSM spokeswoman, noting that no one else had broken the code since its adoption. “What he [Nohl] is doing would be illegal in Britain and the United States. To do this while supposedly being concerned about privacy is beyond me.”

    and

    “We strongly suspect that the teams attempting to develop an intercept capability have underestimated its practical complexity,” the GSM Association said in a statement. The association noted that hackers intent on illegal eavesdropping would need a radio receiver system and signal processing software to process raw radio data, much of which is copyrighted.

    Cracking encryption is illegal--fine, I'll give you that.  Copyrights and other "protections" in place?  Those are effective only against law abiders.

    Since when have terms like "copyright" and "illegal" stopped criminals?  Don't they, by definition, do illegal stuff?  This is more than philosophical musing.  Without researchers like Nohl, we would have no idea what the bad guys could possibly do.

    I mean, the industry certainly isn't doing its job--note how more secure encryption, already available, is not being adopted because it's going to take money to upgrade the networks.  Clearly they need a push.

    Other complaints on my part regarding the GSM industry's points:

    • "No one else had broken the code since its adoption" - Uh, no.  No one else has broken the code and made an official announcement of it.  Again, it's up to researchers like Nohl that we're able to see what types of weaknesses are possible out in the wild.  Criminals don't go around making such announcements unless they're stupid, a factor many are unwilling to rely on for their security needs.

    • The talk about illegality and copyrights - A rehash of what I said above, but criminals don't care about legality and copyrights.  Heck, if the music industry has shown us anything, it's that even non-criminals don't care about legality and copyrights.  It's not just bad people who'll ignore such issues; regular joes will do so as well.  Copyrights and legal statuses provide ammo for prosecution and lawsuits; they don't really protect in the sense of "prevention" when it comes to criminal deeds.

      Also, think about it: if only law-abiding people composed the entire population of the world, what would be the use of encryption?  Just make the eavesdropping of cellular phone calls illegal, and end of story.  Oh, you know what?  It already is.  And yet we're discussing encryption.  I guess there's a reason why the existing encryption is in place--and, by extension, why old, weak encryption standards must be replaced with stronger ones.

    Remember how I said that the issue sounds familiar?  That's because the story parallels what is, to date, the largest data breach in history: TJX.

    In the TJX data breach, the head honchos decided not to upgrade their wireless communications encryption from WEP (an older wireless security technology) to a newer, more secure standard.  They knew their encryption was weak, and they balked at the cost of making the upgrades.  Then disaster struck.

    Granted, the parallels aren't exact.  But, the denial of the security threat; the continued use of what proved to be weak encryption; and the decision not to pursue better security over money issues?  All there, and all leading to a predictable end.

    Rarely will a person have to upgrade, in their lifetime, the encryption algorithms they're using.  However, it behooves people to do so when the writing is on the wall.  And the writing is on the wall.  Heck, there's a spotlight on the writing on the wall.  It's just that most will ignore it until the flashing arrows and sirens are added.


    Related Articles and Sites:
    http://www.nytimes.com/2009/12/29/technology/29hack.html?_r=2&pagewanted=all
    http://en.wikipedia.org/wiki/GSM
    http://en.wikipedia.org/wiki/Rainbow_table
    http://blogs.techrepublic.com.com/wireless/?p=206

     
  • An Alternative To Deleting Data? The Rfiddler Looks Cool, Sounds Cool, Perhaps Can Do The Job

    You know, there's zapping data and then there's zapping data.  I've often noted how disk encryption software can be used to securely toss used hard drives (as opposed to destroying the drives or degaussing them), but encryption is not exactly fun, is it?

    Hence, I present you with the rfiddler, an RFID-frying gun that's reportedly strong enough to take out data on USB drives.  The claim is that it can fry your retinas as well, so perhaps a little discretion is advised when using it?

    Of course, from the embedded video at both of the links down below, it appears to me that this is perhaps not quite powerful enough to zap disk drives, unless you're able to get to the platters directly (at which point, why even bother with this bona fide ray gun?  Because it's cool, that's why.  Well, until one of your retinas is cooked).  (Note: most people so far recommend pointing this thing away from your computers.)

    Relying on the above to dispose of your data--even if it were strong enough--means your data is still at risk while you're using a particular device, though.  Remember, the use of encryption software means that your data is protected even as you access the data for your everyday work and other needs.

    A data destruction device, on the other hand, is only useful once you're ready to retire the device or the actual data itself.
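
    If it helps to see why encryption can stand in for physical destruction at retirement time, here's a conceptual Python sketch using the third-party "cryptography" package (nothing to do with the rfiddler, and not a description of how AlertBoot implements this): once the disk's contents are ciphertext, throwing away the key is effectively the same as shredding the disk.

        # Conceptual illustration of crypto-erase: encrypted data plus a destroyed
        # key equals unreadable data.  Requires the third-party "cryptography" package.
        from cryptography.fernet import Fernet, InvalidToken

        key = Fernet.generate_key()                                  # the disk encryption key
        ciphertext = Fernet(key).encrypt(b"patient record #83000")   # what sits on the drive

        # Normal use: whoever holds the key can read the drive.
        print(Fernet(key).decrypt(ciphertext))

        # "Disposal": throw away the key instead of shredding the hardware.
        key = None

        # Anyone who later recovers the drive gets only ciphertext; a different key fails.
        try:
            Fernet(Fernet.generate_key()).decrypt(ciphertext)
        except InvalidToken:
            print("ciphertext is useless without the original key")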


    Related Articles and Sites:
    http://gizmodo.com/5433018/rfiddler-rifle-zaps-usb-sticks-and-rfid-chips-into-oblivion
    http://technabob.com/blog/2009/12/22/rfiddler-rfid-zapper-gun/

     
  • Drive Encryption Software: Canada's PHIPA Requires The Use Of Encryption On Mobile Health Data

    The Durham data breach from last week (and reported earlier this week) has been met with incredulity by the Ontario Information and Privacy Commissioner.  A directive to use data encryption has been issued, prohibiting the transfer of sensitive data if encryption is not used.

    As you'll recall, the loss of a USB key meant a breach affecting 83,000 patients who had received flu shots in the Durham Region.

    PHIPA 2007

    The Commissioner has pointed out that the Personal Health Information Protection Act (PHIPA), passed in 2007 expressly for Ontario, directs that "health information custodians not...transport personal health information on laptops or other mobile computing devices unless the information was encrypted."

    You'll notice that this implies PHIPA is much stricter than PIPEDA when it comes to the encryption of sensitive data.  While, per the above, PHIPA requires the use of encryption, PIPEDA, under 4.7.3 (c), only seems to recommend it ("the method of protection should include..." is how it reads, and "should include" is not the same as "must include").

    What To Do If You're Not In Compliance In Ontario

    The easy answer--perhaps even a flippant one--is to go ahead and encrypt your laptops and other portable devices that contain sensitive data (such as external hard disk drives).  Granted, depending on the solution used, you may have to wait for someone to visit you after you sign up for the service (though that's not the case with every solution).

    But if you really, really must transport those unencrypted sensitive files using something like a USB memory stick, the Commissioner has "advised that any unencrypted personal health information that needs to be transported, must be in the physical possession of the person responsible, at all times, until it reaches its secure location. This is only an interim measure until full encryption processes can be put into place."

    Hold on to that thing really, really tight.  Or, you could just set yourself up with encryption right away.

    Related Articles and Sites:
    http://www.phiprivacy.net/?p=1716
    http://laws.justice.gc.ca/en/P-8.6/FullText.html

     