AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

September 2007 - Posts

  • Ensure There Is No Gap When It Comes To Laptop Encryption

    The Gap, the San Francisco-based clothing retailer, has just issued a press release announcing that a laptop containing the information of 800,000 job applicants has been stolen.  In all fairness, this is not the fault of The Gap itself, but of one of the vendors it hired to manage the job applicant data.

     

    Just when I thought the week was winding down.  (Hmmmm… companies tend to issue bad news before the weekend so that their stock won’t get hit.  Maybe I’m just being too cynical?  Although, TJX took a heavy hit in the stock market when they released the news that customers’ credit cards were compromised, if I’m not mistaken.)

     

    Speaking of which, since The Gap hasn’t really released too much information as of yet (including who exactly lost the laptop), a lot of news agencies and bloggers are trying to tie this incident to the TJX news from earlier this week.  I guess the points in common are that both companies sell clothes and that encryption was lacking in both cases.

     

    Except the situations are completely different.  Earlier this week (and I have blogged about this) the Canadian government released its findings on the TJX fiasco.  Among the findings was that TJX was not using the proper wireless encryption standard, WPA, but an easily broken one, WEP.  Essentially, someone was reading the data straight out of the airwaves.  There was also criticism that TJX was collecting personal information above and beyond what was necessary, and that it was keeping that information on its servers for an inordinately long period.  I don’t recall any criticism about a failure to encrypt stored data.

     

    With The Gap, the vendor had its laptop stolen, and it sounds like it was taken from the vendor’s own offices.  Per the agreement with The Gap, the laptop was supposed to be encrypted.  For an as-yet-unknown reason, it was not.

     

    I’ve already read several commentaries where security professionals (real and wannabe, the latter probably quoting the former) ask the question “what was the data doing on a laptop?”  I find this question reactionary and irrelevant: let’s face it, more and more people and companies are migrating to laptops as permanent or semi-permanent “desktop machines.”  If they’re actually implying that desktops are somehow physically more secure machines, I can tell you from personal experience that they’re not.  In fact, I’ve seen instances where a very security-minded company issues laptops as workplace computers, and these are locked in strongboxes at the end of the day.  Security-wise, I’ll take that over unsecured desktops any day.  Plus, take a look at desktop computers nowadays.  One of my co-workers has a Dell machine so tiny that a six-year-old could steal it after unhooking the LCD monitor and pulling the plugs on the USB keyboard and mouse.  I’ve seen hamburgers bigger than that Dell.  My guess is that any pros were quoted out of context.  Laptops might imply mobility, but any machine is mobile.  It might be harder to steal an IBM mainframe, but is it really impossible?  I, for one, remember the case of a pair of drunk guys stealing an ATM in its entirety; it weighed over two tons, if I’m not mistaken.

     

    The correct question is “why was there no encryption?” which is what the pros tend to ask after asking the first question above.  In fact, I believe this question doesn’t get asked enough for laptops, desktops, and other devices that are not physically chained down to the floor.

     

    The fact that The Gap had an agreement with the vendor requiring its data to be encrypted implies, at least to me, that The Gap was doing its homework.  They must have had an auditing body as well, which is also part of the homework.  And seeing how the vendor is still handling data for The Gap (no word on whether ties will be severed), I have to assume the vendor passed those audits.  So, why?  How?

     

    Time will tell (I wish I could trademark those words).  But if my research is any clue, it’s probably because most encryption programs are challenging to implement and enforce enterprise-wide and, more importantly, to maintain effectively on an ongoing basis.  The impact on workflow can be more than substantial once a security measure is implemented.

     

    The beauty of AlertBoot is that laptop encryption (or any other type of device encryption) is actually a very easy process to implement.  Plus, you get a comprehensive report suite to audit the encryption status of all your devices.  And if you buy new devices, the process of implementing encryption takes a matter of minutes.

     

    My guess, though, is that content encryption for protecting individual files would have been the best solution in this case.  In AlertBoot, an administrator can specify that a certain file type always be encrypted; as an end-user creates new files by copying data from one file to another, these are encrypted without further effort on the end-user’s part.  And if these files happen to be passed around the office for work-related purposes (which, let’s face it, will always happen regardless of corporate policy), it won’t matter as much if they end up on an unencrypted laptop: no password means no access to the data, which is the idea behind encryption.
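
    To make the idea concrete, here is a minimal sketch (mine, not AlertBoot’s actual code) of what extension-based content encryption boils down to.  The folder name, policy set, and function names are hypothetical, and I’m using the open-source Python cryptography library for the encryption itself; the point is simply that once the policy lists a file type, matching files get encrypted without the end-user lifting a finger.

        # A rough sketch of extension-based content encryption, not AlertBoot's
        # actual implementation: files whose extensions appear in the policy
        # are encrypted with a symmetric key, so they stay unreadable without it.
        from pathlib import Path
        from cryptography.fernet import Fernet  # pip install cryptography

        ENCRYPT_EXTENSIONS = {".doc", ".xls", ".csv"}  # hypothetical policy list

        def encrypt_matching_files(folder: str, key: bytes) -> None:
            cipher = Fernet(key)
            for path in Path(folder).rglob("*"):
                if path.is_file() and path.suffix.lower() in ENCRYPT_EXTENSIONS:
                    path.write_bytes(cipher.encrypt(path.read_bytes()))

        if __name__ == "__main__":
            key = Fernet.generate_key()  # in practice the key is managed centrally
            encrypt_matching_files("applicant_data", key)  # hypothetical folder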

     

    The Gap has to alert affected job applicants because it’s the law: even if the third-party vendor screwed up, the data still belongs to The Gap.  As is usually the case in such situations, whichever law enforcement agency was contacted about the theft probably asked the companies involved not to announce specifics of the data breach.  However, The Gap uses multiple vendors to handle job applicant data, so I guess it can get away with announcing the breach as quickly as possible (winning points in the customer PR department) while not revealing the third party’s identity as the investigation continues.

     
  • Here's Why All Data Should Be Encrypted - Regardless Of What The Official Policies Are

    I’ve been revisiting some security breach cases from earlier this year, and one of the more confounding ones is the data breach that happened at the VA in Birmingham, Alabama.  A hard drive, with some of the files encrypted but most of them not, was stolen from the premises.  The VA Office of Inspector General released a report in June, so I’ve been reading up on its findings and conclusions.

     

    This particular case stands out for a number of reasons. 

     

    First, the hard drive was on the premises of the VA.  The hard drive in question was found to be missing, and subsequent investigations showed that there was no forced entry into the premises or into the safe where the drive was supposed to be secured.  It almost sounds like an inside job.  The FBI and other investigators definitely tried to assess that possibility: for an eye-popping account of the lengths they went to, take a look at page 22 of the report (link: www.va.gov/oig/51/FY2007rpts/VAOIG-07-01083-157.pdf).  After reading it, I think most people would curtail their diatribes about the government not taking such cases seriously.  As to whether it was an inside job, who knows?  To this day no one has been arrested.

     

    Second, the employee who was directly responsible for the data on the hard drive lied to investigators about the contents of his computer and deleted files on it in an effort to cover up the extent of the damage and, I suppose, the fact that he was lying.  The stolen drive was being used as a backup for the computer he was using.  I guess he didn’t realize the government would come out in full force, including forensics experts from the FBI.  They confronted the IT guy with evidence, and he capitulated and confessed to deleting and encrypting files in a moment of panic.

     

    Third, subsequent investigations showed that the employee shouldn’t have had access to the files to begin with.  The employee was an IT specialist, although I can’t find any details on what his actual job was.  Either way, it’s no wonder the guy lied: the extent of the damage could potentially affect over 250,000 veterans as well as 1.3 million medical providers.  And it seems the IT specialist knew it.  He deleted the incriminating files the same afternoon he reported the drive missing.

     

    Of course, the fact that the drive is missing is not as big an issue for those affected as the fact that the data is “somewhere out there” (the government is probably focused on how the drive got stolen, though).  There had been other VA security breaches before, so why wasn’t the hard drive encrypted?  It was not for lack of policy: the government had already issued directives to encrypt portable devices holding sensitive data (VISN 7 Automated Information System Operational Security Policy Memorandum 10N7-115, August 7, 2006).

     

    Apparently, the same IT guy who’s in trouble told the person in charge that the VA had not approved encryption software for external drives (it’s a long story, but it sounds like the regulations and directives in place were nebulous at the time).  And his supervisor decided it was okay not to encrypt the drive because it was not to be taken out of the premises, per the regulations in place.  (We all know that regulations are followed without exception, right?)  How could they not consider theft an issue when the only barriers to the office were the front door and the office door?  There were other mistakes as well: a slew of other regulations were broken, including giving the IT specialist access to patient data he shouldn’t have had.

     

    Could AlertBoot have helped prevent this scenario?  Yes.

     

    There are a number of ways, but to begin with, port control might have been useful and might even have prevented this entire fiasco.  Port control allows an administrator to specify which devices can connect to computers; whether they have read and/or write ability; and whether a person has the authorization to connect the device, including USB-connected ones.  Considering that the missing external hard drive was purchased independently by the department, higher-level administrators would have had to be contacted to enable access to the drive from the department’s government-issued laptops; otherwise, the drive simply wouldn’t have worked.  And at that point, if the administrators knew what they were doing, they could have devised other solutions.  Based on the report, it sounds like the administrators would have picked up on the security issue.
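
    For readers who haven’t run into port control before, here’s a toy illustration (not the VA’s or AlertBoot’s actual tooling) of the decision it automates.  The vendor/product IDs, user names, and policy table below are made up; a real product hooks into the operating system’s device stack rather than a Python dictionary, but the logic — is this device approved, and is this person allowed to attach it — is the same idea.

        # A toy port-control check, purely illustrative: a device may connect
        # only if its vendor/product ID pair is on the administrator's whitelist
        # and the user attaching it is authorized to use removable devices.
        ALLOWED_DEVICES = {
            ("0951", "1666"): "read_write",  # hypothetical approved USB drive
            ("0781", "5583"): "read_only",   # hypothetical approved, read-only
        }
        AUTHORIZED_USERS = {"jsmith", "it_admin"}

        def check_device(vendor_id: str, product_id: str, user: str) -> str:
            if user not in AUTHORIZED_USERS:
                return "blocked: user is not authorized to attach devices"
            access = ALLOWED_DEVICES.get((vendor_id, product_id))
            if access is None:
                return "blocked: device is not on the whitelist"
            return "allowed: " + access

        print(check_device("0951", "1666", "jsmith"))   # allowed: read_write
        print(check_device("05ac", "1261", "someone"))  # blocked: unauthorized user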

     

    Aside from the above, device encryption would have come in handy: the entire hard drive would have been encrypted and secured with a password, which would have made the drive compliant with policies calling for full disk encryption on portable devices.  The data would also have been unreadable by the perp or perps who removed the drive from the premises.  Plus, there are ways to disable the password and lock people out of the device in the event that the drive is not recovered (and was taken by someone who knew the password).

     

    Content encryption, the encryption of individual files as opposed to the encryption of the entire device, would also have been necessary to make sure that the personnel involved in this case were not in violation of the HIPAA privacy rules.  It also works as a risk mitigator, since there’s always a chance that a file might be sent to someone without authorization to view it, or might be copied to a non-secure device.

     

    On a remotely related note, this incident also shows the lengths employees might go to during a security breach.  That’s probably why most security experts recommend that IT administrators not be in charge of auditing compliance: there’s always the temptation to hide the evidence if something goes wrong.  A separate body should be in charge of reviewing audits and should work with IT administrators to develop a course of action based on them.

     
  • State of Connecticut Latest Victim of Ohio Tape Theft - Why Data Encryption Should Be a Key Weapon In Your Security Arsenal

    The state of Connecticut is back in the news due to another security breach.  You’ll remember that a state laptop was stolen earlier this month, and I remarked that laptop encryption by AlertBoot would have helped in this case.

     

    The Connecticut government is not to blame in this particular instance, however.  The state is getting ready to sue IT consulting firm Accenture over the theft of a backup tape containing records tied to state agency bank accounts.  This case is actually tied to the Ohio backup-tape theft from earlier this year.  You might remember that an Ohio state intern was instructed to take backup tapes home as part of an ill-advised (to say the least) effort to comply with policies requiring backups to be kept off-site.

     

    So, how did Connecticut’s data end up on Ohio’s backup tape?  Apparently, Accenture employees working under contract for the state of Connecticut copied the data without authorization to the now-stolen tape.  The Ohio data breach happened in June, but Connecticut government officials were not notified until September.

     

    According to a statement by Accenture, the Ohio Inspector General has determined, in so many words, that retrieving the data stored on the stolen tape is a complex procedure and that there’s a very low probability of extracting it.  While I was not able to find the IG’s statement and read it directly, I’m sure his office took into consideration the usual suspects who want to use the data for illegal activities and will attempt to retrieve it.  And based on what I’ve read, the incentive is certainly there.  The information stored on the backup tape includes, according to Connecticut Gov. M. Jodi Rell’s press release:

     
    …information on nearly every bank account held by [Connecticut] state agencies - including checking accounts, money market accounts, time deposit accounts, savings accounts, trust fund accounts, treasury and certificates of deposit - which could total billions of taxpayer dollars. The tape lists agency names, account numbers, bank names and types of accounts.
    The Social Security numbers of 58 taxpayers were compromised as well.

    Why did Accenture have Ohio’s backup tapes?  It turns out that Ohio hired Accenture to develop a payroll and inventory system similar to the one developed for Connecticut.  The problem with developing any kind of system is that it must be tested with real data.  Data is data is data…but data is stored in different ways, and a programmer must ensure that the system they’re creating will be able to read and use existing data.  My guess is that somebody at Accenture picked up the wrong tape and copied data onto it for testing purposes.  This assumes that the two systems were in development at the same time; otherwise, it opens up a whole can of worms, such as “why was Accenture holding on to this information after finishing up Connecticut’s system?”  And I can’t think of any legitimate reason for that.

     
  • Encryption is a Key Ingredient when Limes and Wires Mix

    Looks like another bank has been struck by a data breach.  The victim in this case is Citigroup, more specifically the ABN Amro Mortgage Group unit.

     

    An employee at the mortgage unit inadvertently leaked the Social Security numbers and credit information (not “credit card” information) of 5,208 customers.  How did the data breach happen?  The information was leaked via a peer-to-peer file-sharing network.  Apparently the employee had LimeWire running on her computer, although it’s not possible to tell from the article whether the leaked file was on a personal or a company computer.  Along with the customer data, the employee’s own personal data was exposed as well.

     

    Could Citi’s breach have been prevented?  Based on the press release, it sounds like Citi already had policies in place dictating that confidential information must be stored on devices managed by the company.  I’m sure they have other policies designed to prevent data breaches and other security lapses as well.  After all, a company with over 300,000 employees worldwide cannot afford to be exposed to data theft, internal or external.  They can’t afford accidents either, but contingency plans that cover every possible accident are impossible to design.

     

    Nevertheless, there are steps that Citi could have taken.  I am hoping they have taken them, since these are some of the more basic ones.

     

    One is the encryption of the sensitive files themselves.  Anything that includes Social Security numbers, for example, should be encrypted, no questions asked.  There are plenty of programs out there that make it easy to encrypt files.  With the services offered by AlertBoot, an administrator can specify that all files of a certain type (e.g., those with a *.doc extension) are encrypted automatically, and end-users can also encrypt files on an individual basis.

     

    Two, ensure that unauthorized devices cannot connect to company computers.  Software that allows an administrator to enforce port control is available as well.  This way, if somebody wants to hook up an iPod to a work computer, and perhaps use its disk mode to copy company files, she won’t be able to.

     

    Three, assuming that the data breach happened on a company-issued computer, there should have been an application control module in place.  Application control defines which software programs are authorized to run on a device.  In the Citigroup case, LimeWire, along with any other P2P file-sharing program, would have been blacklisted because it is a huge security risk.
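
    As a rough sketch of what application control boils down to (my own simplification, not Citi’s or AlertBoot’s actual module), the check is essentially a lookup against a list of disallowed programs — or, alternatively, a whitelist of approved ones — before a process is permitted to run.  The program names below are just examples.

        # A simplified application-control check: executables on the blacklist
        # are refused before they can run.  Real products identify programs by
        # cryptographic hash or signature, not just by file name.
        BLACKLISTED_APPS = {"limewire.exe", "bearshare.exe", "kazaa.exe"}  # examples

        def is_allowed_to_run(executable_name: str) -> bool:
            return executable_name.lower() not in BLACKLISTED_APPS

        for app in ("winword.exe", "LimeWire.exe"):
            verdict = "allowed" if is_allowed_to_run(app) else "blocked"
            print(app + ": " + verdict)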

     

    Plus, I think it’s safe to say that most companies of Citi’s size have no legitimate use for run-of-the-mill P2P applications such as LimeWire.  Aren’t most of the files on offer pirated movies and music?  And when a company deals mostly with private and confidential information, the last thing it wants installed on its computers is an application that makes it easy to share that data with the outside world.  I’m pretty sure Citi’s IT department would have nixed the idea of installing P2P applications, or of keeping internal files anywhere near a device with such an application installed.

     

    So…another instance of an employee not following company policies?  Sounds like it.

     
  • Data Breach Insurance Might Dampen The Need For Data Security? I Say Otherwise

    I’ve recently been reading up on some opinion pieces regarding insurance for security data breaches.  There seems to be some controversy regarding such insurance.  A subsidiary of AIG insurance, for example, has been offering a policy called netAdvantage for some time now.   Apparently, people have been signing up left and right in light of the ongoing security breaches.

     

    Some people say that this is not a good development.  The thinking is that this doesn’t give the companies incentives to shore up crummy security practices. Also, companies might not feel a sense of urgency knowing that their financial losses are limited to a degree.

     

    I’m not sure I can agree with the above.  It has repeatedly been shown that the consequences of losing sensitive customer data involve more downsides than the financial hits, whether in the stock market or in expenditures (such as offering affected customers credit monitoring services).  There are the ensuing PR efforts.  The litigation.  The anti-business regulations that get enacted post-haste.  I can’t see how insurance removes the incentive to keep a company’s data secure.

     

    Plus, the argument assumes that insurance companies would let their policyholders continue with bad practices.  No insurance company wants to lose money (and, if the horror stories are any indication, sometimes they’re loath to pay legitimate claims as well.  The Rainmaker, anyone?)  For this reason, insurance companies regularly dictate what steps and conditions have to be taken and met before they offer coverage.  If the conditions are not met, they’ll just walk away.  In fact, my own belief is that insurance companies will probably converge on a set of minimum security practices that businesses have to follow in order to get coverage.  And in light of all the pending regulations holding companies liable, I’m assuming businesses will want such coverage.

     

    It looks like the security practices to be followed will cover hardware, software, and employee behavior.  In other words: making sure only authorized people can physically access certain computers or servers; installing firewalls and other security equipment; encrypting sensitive data or implementing full disk encryption; and so on.

     

    Some say, well, if you’re going to follow good security practices, you won’t need the insurance.  I don’t find this to be necessarily true.  Mistakes will happen even if you have good security practices.  And isn’t that what insurance is all about?  Accidents?  Anything that cannot be controlled or foreseen? 

     

    I’d say that the rising interest among companies looking for insurance is a good thing.  It can only be another force guiding businesses to invest in a minimum of endpoint security practices, such as hard disk encryption and mobile security measures like laptop encryption.  AlertBoot can help.  It offers robust encryption solutions for protecting your company’s data, be it one file, a set of files, or an entire device.  Plus, there are added benefits such as port control via black and white lists to specify which peripherals can be attached to your company’s computers.

     
  • A Mobile Office Requires Endpoint Security

    According to the article at SecurityPark.net, a SonicWALL survey shows that more and more business leaders and managers are getting comfortable with the idea of remote working.

     

    The reasons for this increasingly positive attitude seem to be heavily geared towards employee motivation and the cost of doing business (the savings on office space! What a deal!)  There are concerns as well, however, such as the impact on employee productivity and on building strong teams.

     

    But the one concern that has surged to the top of the list is security.  No doubt the unending reports of security breaches in the media are making business leaders sit up and take notice.  Not only does it seem that one wakes up to a report of a new security breach each day, but the level of actual and potential damage (to the business and to customers) seems to grow each time.

     

    Despite the increased awareness of the need for security, it looks like the implementation of security policies and procedures is still at a standstill for most businesses.  This is a dangerous development.  A growing mobile workforce with non-secure devices just increases the chances of a security breach.  Lost laptops and hand-held devices holding customers’ private data seem to be in the news every week.  I’ve even read of instances of laptops being stolen at gunpoint.  And what about burglarized homes?  The fact that you’re using a desktop (and hence not a mobile computer) won’t help you in that instance.  Anything not bolted down is technically mobile.

     

    There is no mention of why businesses are delaying the implementation of the necessary security measures.  Which is weird at best and irresponsible at worst, since theft and accidents cannot be foreseen by the victims, and businesses certainly cannot include them in their annual business plans and budgets (such a silly thing to have to point out).  And it will happen.  It’s a matter of when, not if or maybe.

     

    However, it seems that most businesses operate as if this can’t happen to them.  Or perhaps they’re under the impression that the username and password they provide when their computers boot up is enough protection (for those who aren’t aware, it’s not; it can easily be bypassed).  Encryption services provided by AlertBoot would help increase the security of devices outside the workplace.

     

    Businesses should be afraid of internal attacks as well.  Instances of data theft by employees are growing, whether because the employees think they won’t get caught or because they have an axe to grind.  Just hook up an iPod to a computer at work and, presto, you can copy sensitive files off the machine.

     

    Device encryption and application control should be basic requirements for businesses with a mobile workforce, especially if they’re equipping employees with company-issued equipment.  By device encryption, I mean, more specifically, hard drive encryption; this way, the contents of a computer cannot be easily accessed.  Port control, also offered by AlertBoot, allows you to specify which devices can be connected to your computers.  The iPod I mentioned above?  Place it on a blacklist of unapproved devices, and it won’t be able to connect.

     