
AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.
  • UK Encryption: Royal & Sun Alliance Insurance Fined £150,000 For Stolen Hard Drive

    The UK's Information Commissioner's Office (ICO) has fined an insurance company, Royal & Sun Alliance (RSA), a total of £150,000 for the theft of an external storage device with information on nearly 60,000 clients (and credit card details for 20,000 people).  

    Stolen From a Locked Room

    Unlike your run-of-the-mill hard drive theft cases, there are a number of wrinkles to RSA's data breach. To begin with, the external storage device in this case is a NAS (a network attached storage device).
    NASes are like external hard drives, but also much more. One of their key differentiators for the lay person is their size: despite the industry's emphasis on miniaturization, the modern NAS is still pretty big, often about the size of a Nintendo GameCube or larger. Due to its physical size, it's not possible to surreptitiously steal one of these babies; some thought and strategy, possibly pre-planning, is needed when stealing such a device.
    The other wrinkle is that the NAS was stored in a data server room which can only be accessed with "an access card and key," leading to the belief that staff or visiting contractors stole the NAS.
    In other words, it wouldn't have been easy to steal the device.
    And yet, as subsequent events have shown, it would not have been impossible, either. While NASes can offer file encryption, the stolen machine's data was not encrypted – either because this particular NAS didn't offer it or because someone in IT did not deem it worthwhile; excusable, some may think, since it was under lock and key.  


    Well, it wasn't excusable. Far from it, as the six-figure fine shows. It's one thing for your average Joe to not encrypt his sizable storage device that he keeps locked up. A multinational insurance company, on the other hand, has responsibilities, and keeping the same data security practices as your average Joe is contemptible.
    Especially when you consider that up to 40 people were allowed unsupervised access to the room storing the NAS, or that nobody realized that the device had gone missing for over two months.
    This is exactly the type of situation where you want any sensitive data to be encrypted.  

    Giving a Break Where They Shouldn't

    Only the ICO knows how the fine's final amount was calculated. However, they note under "mitigating features" that the "personal data held on the device was not easily accessible."
    There must be some confusion here, since the lack of encryption makes access to the data quite easy. It's true that you probably can't just access the information directly from a computer; however, a simple Google search will provide more than helpful links for getting at the data, with instructions that your average middle-schooler can follow while half-asleep.
    Imagine what staff or contractors who were given access to a data server room, a room where the techie types go, could do with access to the internet and a few keystrokes.



  • Netherlands Officially Files 5,500 Breach Notifications In 2016

    The Personal Data Protection Authority of the Netherlands (Autoriteit Persoonsgegevens, "AP") revealed last week that they received nearly 5,500 data breach notifications in 2016, the first year of mandatory data breach notifications for the European country.
    This contrasts with the 980 data breaches in the same period for the US, compiled by the Identity Theft Resource Center (ITRC), which is not government-affiliated. When you consider that the US has somewhere around 320 million people vs. the Netherlands's 17 million, something feels very, very wrong here.
    I can think of two possible ways to interpret the situation:
    1. The Dutch are just terrible at data security. This seems unlikely. It is the US, after all, that holds various records when it comes to data breaches. Last year, for example, Yahoo was crowned with the largest data breach in recorded history.
    2. The US data is severely undercounted. This is most probably the reason for the seeming anomaly.
    The latter interpretation is supported by the data breach reporting environment in the US.
    To begin with, the US does not have a central authority in charge of data protection. There is no federal law addressing it, although a number of federal agencies do dictate data security in their respective areas; e.g., medical entities and their contractors follow the Department of Health and Human Services requirements regarding data security and breach notifications.
    At the same time, states have their own laws governing data breach reports, defining what is and isn't classified as such. And each body that oversees such reports has its own policies on whether a data breach should be made public. Some make it easy to find online; others, not so much.

    Running Numbers

    The 5,500 reported breaches translate to one data breach per 3,090 Dutch citizens. For the US, the 980 translates to one per 326,000 people. That's a ratio of 105 to 1.
    Granted, this is not the best way to represent the figures since it's legal entities that have the duty to report data breaches. A search in Wolfram Alpha shows that the total number of registered businesses in the Netherlands and the USA were, respectively, 1.03 million and 5.156 million.
    This brings the numbers down to one data breach per 187 Dutch businesses, and one per 5,261 American businesses. The ratio is now 28 to 1, a considerable reduction, but still very large. Some of the difference could be attributed to the stronger regulations governing data security in Europe: stricter laws, with a propensity to err on the side of caution (read: privacy), mean that the Dutch would see a data breach where Americans don't. It could also be that the Dutch are more forthcoming with such things because their legal environment is not as litigation-happy.
    No matter how you slice it, however, one thing is certain: 980 breaches reported in the US seems comically low. If we were to assume that the US is affected by data breaches at a rate comparable to the Netherlands, with a similar rate of notification to the authorities, then one would expect around 27,500 data breaches in 2016.
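    The back-of-the-envelope figures above can be reproduced in a few lines. This is just a sketch using the article's own rounded inputs (populations, registered business counts, and 2016 breach totals), so the ratios come out within a point of those quoted:

    ```python
    # Sanity check of the breach-rate arithmetic; all inputs are the
    # article's own rounded figures.
    nl_breaches, us_breaches = 5_500, 980
    nl_pop, us_pop = 17_000_000, 320_000_000
    nl_firms, us_firms = 1_030_000, 5_156_000

    nl_per_capita = nl_pop / nl_breaches    # ~3,090 citizens per breach
    us_per_capita = us_pop / us_breaches    # ~326,500 citizens per breach

    nl_per_firm = nl_firms / nl_breaches    # ~187 businesses per breach
    us_per_firm = us_firms / us_breaches    # ~5,261 businesses per breach

    # Expected US breach count if US firms reported at the Dutch rate.
    expected_us = us_firms / nl_per_firm    # ~27,500

    print(us_per_capita / nl_per_capita)    # per-capita ratio, ~105.6
    print(us_per_firm / nl_per_firm)        # per-business ratio, ~28.1
    print(round(expected_us))               # ~27,531
    ```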
    At the end of the day, all the signs point to this: we don't have in the US a good idea of how big or bad the problem is. The best we're willing to do, apparently, is rig the system so that we lowball the number to a point it's not realistic.
    That's a real problem because, who would feel the need to marshal resources when the problem appears to be so small?  



  • US Government Committee Concludes (Yet Again) That Encryption Backdoors Undesirable

    As the year draws to a close – and what a year! – we finally have some good, sensible news: the US government has found that "any measure that weakens encryption works against the national interest," and so encryption backdoors are an untenable scenario. This should be the final and decisive nail in the coffin of an issue that brought encryption and encryption backdoors to the forefront of public consciousness in the US and the world.

    Apple v. FBI

    Each year has its milestones but 2016 feels like it has had more than its fair share. Brexit; Trump as president-elect; a European Union that's showing signs of becoming fractured; Volkswagen and most of the car industry caught with its pants down; South Korea embroiled in scandal after a Rasputinesque figure is yanked from the shadows; US elections possibly influenced by outside actors; US elections influenced by a federal agency; the biggest data breach in history; the Panama papers… the list really does go on in the year of the Red Monkey.

    And in that list is an FBI that in essence asked Apple to create a backdoor of sorts to the iPhone's encryption, a consequence of the San Bernardino shooting in February 2016.

    The federal agency insisted that they weren't asking for a backdoor but they, in fact, were. It was the encryption equivalent of "sorry not sorry." While the FBI ultimately backed off, the case did trigger something else: the Encryption Working Group (EWG), a congressional investigation into the viability of encryption backdoors that was composed of both Democrats and Republicans.

    The Encryption Working Group Conclusions

    1. Any measure that weakens encryption works against the national interest.
    2. Encryption technology is a global technology that is widely and increasingly available around the world.
    3. The variety of stakeholders, technologies, and other factors create different and divergent challenges with respect to encryption and the "going dark" phenomenon, and therefore there is no one-size-fits-all solution to the encryption challenge.
    4. Congress should foster cooperation between the law enforcement community and technology companies.

    The EWG found that strong encryption is vital to national interest in many ways – be it personal freedom or ensuring national defense or protecting infrastructure; the use of encryption is varied and widespread – and so anything that works to weaken encryption is a bad idea. Law enforcement's concerns regarding encryption are valid but an approach other than backdoors must be established.

    A solution may lie in more cooperation between private industry and government, which is already present and established but could be furthered. Apple, for example, already provides law enforcement with data saved to the cloud. (In a case of cynically comical foot-in-mouth-itis, some in law enforcement used this cooperation as proof of Apple's "hypocrisy" regarding encryption.)

    The EWG noted that the FBI's request for a backdoor (or whatever it was they wanted to call it at the moment) was the wrong approach, since the use and provision of encryption is global and open source. Nothing would prevent "bad actors" from using encryption that is not crippled with backdoors. Any advantages the government receives from backdoors would be short-sighted and short-term.

    Aside from the above, the EWG also looked into whether "legal hacking" and compelled disclosure by individuals should be given more priority (working within a legal framework, of course).  

    Play Those Encryption Wars Again, Sam

    Of course, none of this is new. When the original "encryption wars" were "fought" in the 1990s, the issues and resulting conclusions were essentially the same. If anything, today's environment shows how prescient those conclusions from 20 years ago were. And the issues debated back then haven't been supplanted by others in the interim. You know why that is?

    It's because the issues being debated were fundamental in nature, and now with plenty of supporting proof. Not that that's ever stopped anyone from challenging an issue.  



  • iPhone Encryption: FL Appeals Judge Says "OK" to Compel Password

    A new iPhone encryption case is making the headlines. Unlike many of the controversial ones to date, I think it can safely be said that in this case, the courts were right in compelling the suspect to unlock his smartphone.  

    Up-Skirt Videos

    A voyeur – we'll call him John Doe, although his name was revealed by the media – was caught using his iPhone to film up-skirt footage of a woman at a mall. He ran when confronted but the police were able to track him down using security footage.

    Doe initially agreed to a search of his smartphone but reneged at the last moment. A warrant was procured and granted for the phone, but Doe, one assumes, wouldn't cooperate when it came to accessing its contents. Doe, of course, is claiming his Fifth Amendment privileges.

    After legal wrangling, a Florida appellate court judge ruled that Doe must unlock his phone.

    Only Protected When It's Testimonial

    The thing about the Fifth Amendment is that it's not legally ironclad, as I wrote back in 2012 regarding the Eleventh Circuit Court of Appeals ruling on decrypting hard drives and violating the Fifth. In that case, the court ruled that it was a Fifth Amendment violation; however, in a couple of other similar cases, the opposite was concluded.

    I said that they all made sense in a non-contradictory manner. Why? Because of what's known as foregone conclusion. Remember, the Fifth Amendment was designed to prevent "fishing expeditions" which has been described as:

    [a] method of putting the accused upon his oath and compelling him to answer questions designed to uncover uncharged offenses, without evidence from another source.
    This, in essence, is why compelling testimony against a person is unconstitutional: you arrest a guy for no reason; go through his personal life to see what charges you can stick on him (the "fishing"); if you can't find anything, torture him so he'll confess under oath to something, either true or made up; and then jail him based on that. The Fifth exists to prevent things like this. (The Brits used to do this a lot, which is why its prohibition is enshrined in the US Constitution.)

    But what if it's not a fishing expedition? What if the government is (to extend the theme) "spearing a whale," i.e., aiming for something that they know exists or happened? Well, it's different then.

    Because the government is not looking to find new evidence, or forcing a defendant to present new evidence that the government didn't know about, there is no violation of the Fifth. It's the difference between,

    "Who did you kill and where did you stash the corpse?"
    "Where is Mary's body? We have video footage of you putting it in your car's trunk and driving outside the city."

    In the latter case, it's a foregone conclusion that you know where Mary's body is, and the government can prove it. Revealing what happened to Mary's body, and where it is, is not testimonial. You can claim the Fifth and not cooperate, but you won't get the actual legal protection (probably). On the other hand, forcing a suspect to do the same under the former is definitely testimonial.

    That there are grey tones to the Fifth makes intuitive sense: if you're served with a warrant to examine the inside of your house, you can't claim the Fifth and stop the authorities from crossing the threshold. The law is allowed inside whether you like it or not.

    The same goes for blood samples, handwriting samples, DNA samples, voice recordings, and standing in a lineup. All of these can work against a guy, and claiming the Fifth doesn't do squat.

    What About the Contents of the Mind?

    You might be thinking, well, that's all well and good but wasn't there a legal maxim that you can be forced to turn over the keys to a strong box's lock but never if it's a combination lock where the "keys" don't exist? And isn't the iPhone's four-digit code more like a combo lock?

    And the answer is "yes." As it turns out, though, all of that was part of a dissenting opinion (8 to 1, no less), so it's not precedent. Nevertheless, the judge in the voyeur's case made reference to it, and wondered whether differentiating between the two even makes sense, especially in this day and age.

    But even if it did, foregone conclusion applies to things like passwords as well. In one case, a man crossed into the US with a laptop that contained kiddie porn. The border agent saw it and the guy was arrested. The laptop's encryption, however, kicked in afterwards. The man was ordered to decrypt the laptop by a court. Why? The government already knew that it was there. No fishing necessary.

    In another case, a woman was recorded as saying that her laptop, which was already taken in as physical evidence, contained files that she didn't want the prosecutors to see. She also was ordered to decrypt her laptop. Why? The government already knew that it was there.

    There have been many other instances where "things in one's mind" have been compelled to be produced by a court. It ultimately seems to come down to: is the government fishing for evidence or not?

    In the voyeur's case, it appears that foregone conclusion kicks in. The cops have identified him and tracked him down. There is a witness (the woman who was wearing the skirt and confronted him). He ran when confronted. There is, apparently, footage of him at the mall (which explains how he was tracked down). A warrant was issued based on all of this. He all but admitted that the phone is his. And, as the judge noted, providing the PIN isn't testimonial – that is, it doesn't create new evidence nor would it be taken as an admission of guilt.

    There is very little wiggle room for John Doe here.  

    Still, Problematic

    The situation is not without problems, however. What if Doe doesn't remember his PIN? Then he'll be found in contempt of court for something he's truly unable to do. And that has to be just as bad as putting some guy in jail on trumped-up charges.



  • Laptop Encryption: Chesapeake Public Schools Laptop Theft Affects Over 10,000 Employees

    According to a couple of sources, Chesapeake Public Schools in Virginia is notifying employees about a potential data breach. Per their announcement, nearly 11,000 people could be affected by the theft of a laptop computer. It appears that laptop encryption software was not used to protect the contents. Password protection, however, was used.
    Assuming that the thieves (or thief) manage to work past the password protection, they will have access to names, Social Security numbers, and bank account numbers for past and present employees of CPS.  

    Password Protection: Useless

    This is one of those data breach stories that doesn't get too much coverage nowadays – either because the theft of unencrypted laptops doesn't happen too often (relatively speaking) or because it gets buried by much more sensational breaches, such as Yahoo's admission earlier this year that hundreds of millions of accounts were hacked a couple of years ago.
    I would like to attribute the dearth of such stories to the former reason. After all, it's been a while since the alarm was first raised regarding the lack of encryption on laptops that store sensitive data; this blog alone has spent 10 years on it, as have other sites, including news outlets. It would be nice to know that ten or more years is enough time for the information to diffuse throughout society and become general knowledge. You know, to become what they refer to as "common sense." (And if the controversy surrounding the FBI and iPhones is any indication, it has.)
    Here's a factoid that hasn't reached such status: password protection is anything but. Just a simple online search will provide more than a handful of ways for overcoming or bypassing password protection on computers running the Windows operating system.
    And if the thieves manage to do it on the CPS laptop… well, it's not going to be pretty.  

    Tax Season is Upon Us

    What could a thief do with names, SSNs, and bank account info for 10,000-plus people? How much damage could he cause? Plenty.

    One of the ways the above info gets used around this time of the year is in the IRS tax refund scam. Since nobody likes to file their taxes early, and most will wait until April is near, enterprising criminals beat real people to the punch by filing fake tax returns, routing the IRS checks to addresses they control (such as post office boxes), and cashing them.

    The IRS does what it can to prevent checks from being sent to scammers, but ultimately, it can't know for sure that a fraudulent tax return was filed unless the actual SSN holder (or his/her accountant) calls to complain – which, again, tends to happen closer to April.

    Indeed, this could be one big reason why there are no signs of the data being misused so far. It's just a little too soon to do anything with it.  


  • UMass Amherst Settles HIPAA Violation for $650,000 and Corrective Action

    In 2013, the University of Massachusetts Amherst (UMass Amherst) was embroiled in a health data security breach: a workstation computer was infected with malware, leading to a HIPAA violation involving patient data for 1,670 people. Skip to three years later, and UMass Amherst has settled legal actions related to the breach, brought by the US Department of Health and Human Services (HHS).
    According to the terms, the university will cough up $650,000 and implement a corrective action plan that will hopefully prevent similar incidents in the future.

    Wrong Classification

    Malware. Just like you never know when some guy's going to stick a computer down his pants in order to boost it, malware can attack you in the most unexpected ways, at the most unexpected of times. (Incidentally, people have been caught sticking desktop computers down their pants, not just laptops. And not just from Walmart, either. There's video footage floating around of a man doing so at a hospital.)
    Combating malware is not easy, but it is made easier by following certain rules. Install a good antivirus software. Ensure that it receives updates regularly. Make sure your firewall is up and running correctly. Don't visit sites that have a high probability of hosting malware. Don't download and install untrustworthy apps and software.
    It looks like UMass Amherst had all of this in place, which is de rigueur for covered entities. However, they had decided – incorrectly in hindsight – that their Center for Language, Speech, and Hearing (CLSH) was not a "health care component," and hence not included under HIPAA compliance rules.
    In turn, this probably led them to be more relaxed when it came to data security. And the rest, as they say, is history. Three years and $650,000 worth of history.

    The More Things Change…

    About, say, 5 years ago, the big thing when it came to medical data security was the loss and theft of devices that held patient data: laptops, desktops, external and portable data storage devices, paper files, smartphones, etc.
    As businesses and organizations made greater incursions into the cloud, public and private, we've seen fewer of the old-time data security breaches (or at least fewer reports of them) and more of the "new" type: your average malware, specialized malware (like ransomware, which uses encryption), DDoS, accidental leaking of files, and so on. In the past month, St. Joseph Health also settled with the HHS (for $2.14 million) because its internal files were accessible to search engines.
    It feels like things have changed. And in a sense, they have. But not really.
    The need to properly protect patient data (not computers; the focus has always been on the data) has existed forever, and in the US it was made into law twenty years ago under Bill Clinton's presidency. We must assume that the law arose because there was a need at the time. Regardless, the government, it can be safely said, gave it a low priority and businesses proceeded to pay little to no attention to medical data security for the next 15 years or so.
    With the passage of HITECH, the government gave data security more attention. Businesses started taking notice. And then, Boston's Massachusetts General Hospital got fined $1 million in 2011 for a data breach. Businesses started doing more than taking notice.
    Because, if MGH can be fined a million bucks for fewer than 30 pages of patient printouts, the government certainly won't hesitate to penalize the loss of a computer or USB stick with information on thousands of patients. People started encrypting their laptops.
    Why? Because the loss and theft of laptops was trending in the news. Then USB stick losses started trending. External drives and USB sticks got encrypted. Basically, HIPAA covered entities have been playing catch-up ever since HITECH.
    And, in that sense, not much has changed.  

    Precision Security

    For a while, security experts were suggesting that covered entities encrypt anything that allowed digital storage. With people emailing, FTP-ing, uploading, downloading, backing up, and copying and pasting sensitive files, it was impossible to tell where a file would end up – so, just encrypt everything you possibly can.
    Others suggested that a more thought-out method was necessary. It was the more realistic approach in terms of money and time. If a company has 1,000 laptops but only 10 need encrypting, why spend the time and money to protect the remaining 990? Furthermore, a company would be more likely to encrypt if faced with a less financially arduous assignment.
    In the end, the more precision-oriented approach won out. But as events show, the flip side is an increased need for constant vigilance and a proper understanding of what organizations are doing, a feat that is near impossible once an organization reaches a certain size.


