The psychological concept of "implicit learning" is being touted as a way of "memorizing" passwords so complicated that they cannot be revealed via blunt-force threats, aka "rubber hose attacks." The password mechanism could mean enhanced security, I admit. After all, the weakest part of any mobile security software, including disk encryption software like AlertBoot, is, pragmatically speaking, the password.

But one thing has been bothering me since I first read about the benefits of this new password protection scheme: you must never underestimate the power of the rubber hose.
A rubber hose attack is, namely, the use of violence:

1. You have a password.
2. I want the password.
3. I torture you.
4. You give me the password.
5. I record it and use it.

It's as simple as that. The techno-centric webcomic site has an excellent summary of the concept: hit him with this $5 wrench until he tells us the password.

Comics aside, this is actually a perfectly valid (and, I assume, actually employed) way of getting the password to encrypted data (N.B. - valid as in "it works" and not as in "it is acceptable"). And, unlike confessions, it's always verifiable: you type the password. If it doesn't work, back to the rubber hose it is.

So, a password that is based on implicit learning -- riding a bicycle or playing a long piano piece are cited in the arstechnica.com article as instances of implicit learning, where "precise sequences are impossible for a human to articulate" -- appears to be a reasonable way of safeguarding passwords.

Except that if you give me a bicycle, I can ride it. And while I don't play musical instruments, my observation is that piano players who can play a piece on one piano can play it on another. In other words, the above five steps can be modified to look like this:

1. You have a password.
2. I want the password.
3. I torture you.
4. I make you ride a bike or play a piano sonata or whatever, i.e., you give me the password.
5. I record it and use it.

The only major difficulty would be finding a stand-in device for whatever the implicitly learned password happens to be. The arstechnica.com article mentions the use of a "Guitar Hero"-like interface. Assuming commercial products (I use the term loosely to mean "available if you have a decent amount of money") are released based on this latest password technology, I don't see how rubber hosing passwords can be prevented.
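The verifiability point is worth dwelling on: anyone holding the encrypted data can test a coerced password offline, because disk encryption schemes typically derive a key from the passphrase and check it against stored data. Here is a minimal sketch of the idea -- not AlertBoot's actual scheme; the KDF parameters and verifier layout are illustrative assumptions:

```python
import hashlib
import hmac
import os

def derive_key(passphrase: bytes, salt: bytes) -> bytes:
    # Derive a 256-bit key from the passphrase (PBKDF2-HMAC-SHA256).
    return hashlib.pbkdf2_hmac("sha256", passphrase, salt, 200_000)

# Setup: only the salt and a verifier hash are stored, never the passphrase.
salt = os.urandom(16)
real_key = derive_key(b"correct horse battery staple", salt)
verifier = hashlib.sha256(real_key).digest()

def attempt(passphrase: bytes) -> bool:
    # The attacker can check any coerced guess instantly against the verifier.
    candidate = derive_key(passphrase, salt)
    return hmac.compare_digest(hashlib.sha256(candidate).digest(), verifier)
```

If the coerced password fails the check, the attacker knows immediately -- and, as the post says, it's back to the rubber hose.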
Related Articles and Sites:
http://arstechnica.com/security/2012/07/guitar-hero-crypto-blunts-rubber-hose-attacks/
http://www.extremetech.com/extreme/133067-unbreakable-crypto-store-a-30-character-password-in-your-brains-subconscious-memory
Global Payments, a Sandy Springs-based credit and debit card processor that disclosed on March 30 that it was hacked, says that its data breach cleanup costs have reached $84.4 million so far. While it's debatable whether data encryption would have made a difference -- the company has advocated end-to-end encryption for credit card transactions since the breach -- the figure shows that data security is essential when it comes to protecting sensitive data.
Global Payments has released the following details via www.2012infosecurityupdate.com:

Global Payments has made substantial progress in its investigation and remediation efforts. Based on the investigation to date, we continue to believe that a limited portion of our North American card processing system was affected, actual card numbers that may have been exported did not exceed 1,500,000 and any potential card exportation was limited to Track 2 data...

Our investigation also supports our earlier findings that the cardholder information that may have been exported included only Track 2 data, not names, addresses or social security numbers.

However, foxbusiness.com notes that "it supplied a larger number of card numbers to the payment networks for monitoring."
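"Track 2" refers to one of the magnetic-stripe tracks defined in ISO/IEC 7813, and its layout explains why no names or addresses were part of the haul: the track holds only the card number, expiry, service code, and issuer-defined discretionary data. A rough, illustrative parser (the sample value is a hypothetical test card number, not real data):

```python
import re

# Track 2 layout (ISO/IEC 7813): ;PAN=YYMM<service code><discretionary data>?
TRACK2 = re.compile(r"^;(\d{1,19})=(\d{4})(\d{3})(\d*)\?$")

def parse_track2(raw: str) -> dict:
    m = TRACK2.match(raw)
    if not m:
        raise ValueError("not valid Track 2 data")
    pan, expiry, service, extra = m.groups()
    return {
        "pan": pan,                # primary account number
        "expiry": expiry,          # YYMM
        "service_code": service,
        "discretionary": extra,    # issuer-defined (may include PVV/CVV1)
    }

# Hypothetical test number; note that no name or address appears anywhere.
sample = ";4111111111111111=25121010000000000?"
print(parse_track2(sample)["pan"])  # → 4111111111111111
```

Of course, the PAN and expiry alone are still enough to clone a magnetic-stripe card, which is why the monitoring by payment networks matters.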
There is, however, a caveat to all of this. According to Global Payments's own press release, the cost that's being quoted around "includes an estimate of charges from the card brands and investigation and remediation expenses."

It's not an unreasonable assumption that the cost of cleaning up the breach would have been lower had Global Payments been in some business other than finance.
Related Articles and Sites:
http://www.zdnet.com/data-breach-to-cost-84m-for-global-payments-7000001674/
http://www.ajc.com/business/breach-costly-to-global-1485468.html
http://www.foxbusiness.com/news/2012/07/26/global-payments-takes-charge-84-million-for-data-breach/
The ottawacitizen.com reports that approximately 800 Ottawa pensioners were affected when a hard drive belonging to Towers Watson -- the American firm that specializes in human resources consulting -- went missing from storage in Manila, the Philippines. The drive in question was not protected with the likes of AlertBoot hard disk encryption.
Towers Watson, headquartered in New York, was in charge of the City of Ottawa's superannuation fund, which was established

in the municipality's early years and covered police, firefighters and city staff. Everyone currently in the plan is retired or is a surviving beneficiary of the pensioner, and is not a member of the Ontario Municipal Employees Retirement System. OMERS was a new plan created in the 1960s to handle the retirement benefits of local government employees across Ontario. [ottawacitizen.com]

Seeing how the city has been around since 1826 -- and adopted its current name in 1855 -- that's one heck of a long-surviving pension program. Towers Watson, however, has only been in charge of the fund for the past 13 years.

The computer hard drive was decommissioned and placed in a secure area prior to having its data wiped; however, it was stolen in early May when the company suffered a number of other computer equipment-related thefts. The stolen drive, according to company spokesman Michael Millns,

...didn't include bank account details. It didn't include passwords or that type of stuff. ... You could probably get the same information by stealing a copy of someone's tax return. There's really not a lot there, which is why I think we think ... there's a very low risk. [ottawacitizen.com]

I don't know how Canadians file their taxes, but I'm assuming that some type of SSN-like number is used. That would mean the theft actually represents a high risk, even if the risks have been reduced by not including financial data.
Ironically enough, the data breach was made possible in part because Towers Watson was switching to an encrypted system. This happens more often than you might think. For example, Company A decides to improve its data security. They plan the upgrade, work out the logistics of how they're going to go about it, start installing encryption in batches so the IT department doesn't get overwhelmed -- and then they suffer a data breach when a laptop is lost halfway into the program.

Unfortunately, it is impossible to completely prevent the above from happening. The risk of it taking place, however, can be severely reduced by speeding up the process of deploying encryption. Would your risk profile look different if you could deploy full disk encryption to 1,000 laptop computers over a period of one week versus three months?

Of course it would. That's why AlertBoot was created as a web-based service: it allows disk encryption to take place from anywhere there is an internet connection. So, instead of planning for the physical logistics of employees bringing in their machines and temporarily leaving them with the IT department, the encryption deployment can be pushed out to all employees via email, eliminating a significant chokepoint and speeding up the process of securing an organization's data.
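To put a number on that intuition, here's a back-of-the-envelope model. The figures are assumptions for illustration -- a rollout that proceeds at a constant pace and a fixed weekly laptop-loss rate -- but they show how the expected number of unencrypted losses scales directly with rollout length:

```python
# Rough expected-exposure model for a staged encryption rollout.

def expected_unencrypted_losses(fleet: int, weeks: float, weekly_loss_rate: float) -> float:
    # With a linear rollout, the average number of still-unencrypted
    # laptops over the rollout window is fleet / 2, exposed for `weeks`.
    return (fleet / 2) * weeks * weekly_loss_rate

fleet = 1000
rate = 0.0005  # assume 0.05% of laptops lost or stolen per week

print(expected_unencrypted_losses(fleet, 1, rate))   # one-week rollout → 0.25
print(expected_unencrypted_losses(fleet, 13, rate))  # ~three months → 3.25
```

Under these assumed numbers, stretching the same rollout from one week to thirteen multiplies the expected unencrypted losses thirteen-fold.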
Related Articles and Sites:
http://www.ottawacitizen.com/business/Should+they+worried+city+pensioners+personal+information+stolen/6990227/story.html
A man who bought a used computer claims to have found "thousands" of files with personal data on West Cheshire College students. The College claims otherwise. It's a situation that would not have developed if disk encryption like AlertBoot had been used.
As the BYOD trend begins to gather steam, mobile security tools like smartphone encryption are beginning to attract interest. But the West Cheshire College story shows that such solutions are a rehash of an old problem. Namely, how do you ensure data security throughout a device's life?

The computer the man bought was an old-fashioned tower (desktop) computer. According to ellesmereportpioneer.co.uk, the man bought:

the second-hand computer tower and hard drive for £5 from a sale at the Countess of Chester Hospital on May 13.

The man, who does not want to be named, said he was stunned when he got home to find thousands of files containing personal information from the college [West Cheshire College] still on the computer's hard drive. He claims it included names, dates of birth, emails, course details, exam results, work timetables and even photographs of students.

The computer was also checked by the college's IT department. According to its investigation, "the contents of the hard disk and test dates including names and dates of births of less than 60 students were found on the disk with no further relevant information."

Who's right? Well, the UK's Information Commissioner's Office (ICO) shouldn't have a problem determining this because, "unbeknown to the college, he had already made a backup copy of the drive which he is now planning to hand to the Independent Complaints Office."
This entire controversy probably comes as a surprise to West Cheshire College because efforts were (supposedly) made to ensure data security (my emphasis):

This particular computer was one of a handful of old computers donated to members of staff and though data is electronically wiped before disposal we have found that this particular computer had a physical issue preventing the full wiping of the disk.

We have now strengthened our internal processes of disposing of old computers to ensure that our systems are 100% robust.

I think we can surmise that the college really did engage in data wiping, as opposed to merely deleting files and "emptying the recycle bin." The latter may remove file icons from your desktop but does not necessarily mean that the data was erased. True digital data wiping involves overwriting every sector on a hard drive.

What's funny is that implementing encryption software generally takes about as long as properly wiping data. And while wiping data is a "device end-of-life" process that can sometimes fail, encryption is not only as effective as a data overwrite, it protects the contents of a device during its serviceable lifetime as well (for example, if the computer were to be stolen).

The use of full disk encryption is a no-brainer in certain situations, especially if you are cognizant of the usual risks and take a long-term approach to security.
Related Articles and Sites:
http://www.ellesmereportpioneer.co.uk/ellesmere-port-news/local-ellesmere-port-news/2012/07/25/man-claims-hard-drive-bought-at-car-boot-sale-contained-personal-data-from-west-cheshire-college-55940-31464805/
The courts have yet again ruled that breached personal data does not equate to harm. Earlier this month, the U.S. District Court for the Western District of Kentucky dismissed a lawsuit centered on a data breach at Countrywide Financial Corporation. The breach was one of those affairs where data protection tools, like AlertBoot drive encryption software, have limited value because it was instigated by an insider. This fact was also the basis for tossing one of the plaintiffs' claims: that Countrywide "furnished" information to third parties.
According to the details I've found online, Countrywide suffered a data breach that was initially reported in August 2008. At the time, an employee was arrested and charged with downloading information on 20,000 customers every week and selling it to mortgage brokers over a period of two years. Based on an initial settlement reached in December 2009, approximately 17 million people were affected by the breach, although I'm reading a conflicting report that only 2.4 million were affected. Regardless of the actual numbers, I think we can agree that a lot of people were affected.

Some opted out of the 2009 settlement, which resulted in a brand new lawsuit. These new plaintiffs:

alleged that they suffered injury from the data theft because they were forced to take measures to protect themselves from identity theft, such as enrolling in independent credit monitoring service (despite being offered free monitoring by Countrywide) and spending time researching identity theft; and forced to cancel their telephone service after being inundated with telemarketing calls. [infolawgroup.com]

As the article at infolawgroup.com goes on to point out, they were essentially suing over future-oriented crimes: the plaintiffs were seeking remuneration for what might happen, not for what had happened. And the courts have shown time and time again that that's not going to happen. The result was no different in this case.

Perhaps a silver lining for consumers is that the court did find the plaintiffs to have standing to sue. In most past cases, people suing companies over a data breach couldn't even get their day in court. Recently, however, I've read of a couple of cases where lawsuits over data breaches do go through an actual trial. The cases generally end up concluding that the plaintiffs don't have a leg to stand on, as happened here.
Among the things pointed out which are just plain common sense: an employee stealing data from a company and reselling that data to a third party does not equate to "the company providing data to third parties," which can be a violation of the FCRA.

I'm not sure why that even has to be pointed out, but I'm glad to see that it's been cleared up.
Related Articles and Sites:
http://www.infolawgroup.com/2012/07/articles/data-privacy-law-or-regulation/court-dismisses-countrywide-data-theft-suit/
http://www.bankinfosecurity.com/countrywide-insider-case-bigger-than-initially-revealed-a-981
http://www.scmagazine.com/parties-agree-to-settlement-over-countrywide-data-breach/article/160332/
Beth Israel Deaconess Medical Center is alerting 3,900 patients that a laptop computer with personal medical information was stolen. The article at bostonglobe.com reveals that a tracking device was used on the laptop, but it doesn't mention whether the device was protected with data encryption software like AlertBoot. Regardless, patients should be able to rest a little easier now that a forensic firm has found "nothing that would be used from an identity theft perspective."
The laptop theft took place on May 22 at a hospital office. Despite the location it was stolen from, it's been revealed that the laptop belonged to a physician. I'm not sure if Beth Israel is one of those organizations engaging in BYOD (bring your own device), but if it is, it certainly hasn't approached the rollout correctly.

BYOD requires, at the most basic level, that devices holding sensitive data be protected. If we were talking about a regular company, sensitive data could refer to corporate secrets, internal memos and other communications, client lists, etc. -- whatever a company deems important enough to protect from prying eyes. Since Beth Israel is a medical institution, it stands to reason that its sensitive data includes patients' medical data, commonly referred to as "protected health information," or PHI.

The type of data that falls under the PHI label is very broad. It can include non-medical personal data such as personal addresses, phone numbers, and even a patient's hospital room number and phone extension. In fact, there is so little that is not considered PHI when it comes to patient data that, if you are dealing with such data in any way or form, it makes sense to protect it. One of the most effective and simplest ways to do so is to use encryption software, like full disk encryption, to protect the entire device.

But the hospital already knows this.
Beth Israel Deaconess routinely protects information on company-issued computers by encrypting the material with software that makes it difficult to decipher [bostonglobe.com]

And, because it already experienced a significant data breach one year ago, it has changed certain policies:

We [Beth Israel Deaconess] have said to our employees that there is now a mandatory encryption program. So any device that is used in any way with our data, whether it is patient-related or administrative, it must be encrypted. [bostonglobe.com]

According to the hospital's CIO, 1,500 personal devices may be in use (which leads me to suspect that BYOD is a fact of life at the hospital, even if it's not an officially sanctioned program). The process of encrypting these is expected to take three months as people bring their devices to "depots."

Three months to encrypt 1,500 devices? With web-based AlertBoot, that project would be complete in less than a month. I know because we currently hold an account with over 10,000 encrypted endpoints for a global finance firm with offices spread across the world. It took us less than two months.

But, in the case of Beth Israel, it makes sense not to do everything over an internet connection because, besides installing encryption, they're also looking to check for the installation of antivirus software and to install patches, jobs that are simplified by having access to the actual devices.

Still, three months is an awfully long time. Generally, the less time it takes to roll out data security, the better: more than a handful of companies and organizations have experienced a data breach while they were deploying encryption system-wide.
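Some rough arithmetic shows why the depot model drags. The working-day counts below are my own assumptions for illustration, not the hospital's figures:

```python
# Devices-per-day throughput implied by each rollout model.
# Assumed working days: ~65 in three months, ~20 in one month.
devices = 1500

depot_rate = devices / 65   # physical depots over ~three months
web_rate = devices / 20     # self-service web rollout over ~one month

print(round(depot_rate, 1))  # → 23.1 devices/day through the depots
print(round(web_rate, 1))    # → 75.0 devices/day self-service
```

A depot has to physically process every device at its measured pace, whereas a self-service rollout runs in parallel across all employees at once.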
Related Articles and Sites:
http://bostonglobe.com/lifestyle/health-wellness/2012/07/20/patient-information-may-have-been-breached-after-laptop-stolen-beth-israel-deaconess/JobJhtGnm7C8z0QthhG5SP/story.html