AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.
  • HIPAA Breach Results In Lawsuit And Countersuit Between Aetna and KCC

    Reuters reported earlier this month that Aetna, the health insurance company, and Kurtzman Carson Consultants (KCC), an administrative-support services provider, have sued each other over a mishandled class action settlement notification.
    Last year, Aetna settled a number of lawsuits regarding the fulfillment of HIV medication prescriptions. With the legal issues finalized, it was up to KCC to mail the settlement notifications and finally close the book on the situation. Unfortunately, the notifications were sent using envelopes with transparent windows, the ones where names and addresses show through. But in this case, there was a little more:
    Most of the first sentence of the notification was also displayed - including the words "when filling prescriptions for HIV medications." [reuters.com]
    That's as private as private information can get. Naturally, Aetna was sued for breach of patient privacy, which the company quickly settled. In turn, the company sued KCC "to indemnify…the entire cost of the notification disaster," or nearly $20 million. Aetna claims that they didn't know that envelopes with transparent windows would be used, that private information would be showing, etc.
    Basically, it wasn't Aetna's fault.
    KCC, however, has countersued, stating that "Aetna and Gibson Dunn [the insurance company's legal representation] knew what the notifications would look like" and, allegedly, approved it prior to mailing out the settlement notifications.
    Obviously, someone has to be lying. The calamities don't end there, however.  

    No Encryption in this Day and Age?

    KCC has also averred in their suit that,
    When Aetna’s lawyers passed along the list of health plan members to be notified about the HIV prescription policy, there was no protective order in place. Nor did Gibson Dunn encrypt all of the data it sent to [KCC].
    In fact, KCC states that the protected health information (PHI) was neither encrypted nor password-protected (not that password protection would do any good; it's certainly not a HIPAA-compliant PHI security measure). And, they further claim that "more data than was necessary to perform the noticed function" was sent to them… which is not necessarily forbidden under HIPAA but is definitely frowned upon. In fact, it might be one of those red flags that spark an investigation by the Department of Health and Human Services (HHS).
    On the other hand, passing sensitive patient data around without encryption? We all know how the HHS feels about that one. The Reuters article summed up what's at stake for Aetna and KCC in this manner:
    For both Aetna and KCC, as you can see from the dueling complaints, responsibility for the botched settlement notifications is really an existential question. As a health insurer, Aetna has a moral and legal obligation to protect patient privacy. As a claims administrator, KCC is supposed to know – of all things! – how to mail out a settlement notification without violating recipients' privacy.
    The above is insightful and yet misses a number of observations.
    It should be noted that KCC received the data from Aetna's lawyers. So, if KCC's allegations are true, then Aetna has another business associate that's not paying attention to HIPAA/HITECH requirements. And, what's true for KCC – that they should know how to properly mail out notifications because it's their job – can also be said for lawyers that are sharing sensitive data that belongs to a HIPAA covered-entity. After all, the law has specifics on how PHI should be handled by business associates of HIPAA covered-entities. Business associates such as lawyers, who, by virtue of their profession and their client, should know not to pass around PHI unencrypted.
    Also, the allegations open up another can of worms for Aetna, seeing how it now has two business associates that have contravened HIPAA/HITECH data security rules in less than one year. It can take very little to get the HHS to open an investigation into data security violations. Having three HIPAA incidents in a one-year period must certainly attract attention, and KCC's allegations give the HHS a reason to dig deeper into Aetna's adherence to HIPAA privacy and security rules.

     

    Related Articles and Sites:
    https://www.reuters.com/article/legal-us-otc-aetna/kcc-sues-aetna-blames-gibson-dunn-in-hiv-settlement-notice-fiasco-idUSKBN1FR2WB
    https://www.reuters.com/article/us-otc-aetna/aetna-sues-claims-administrator-kcc-over-botched-notice-in-hiv-case-idUSKBN1FQ2SR

     
  • HIPAA Security Trickle-down? Notifications State Sensitive Information Not Contained In Stolen Devices

    According to databreaches.net, two medical entities recently alerted patients of a data breach: Eastern Maine Medical Center (EMMC) and Nevro Corporation.
    In the case of EMMC, an external hard drive went missing. For Nevro, a number of laptops were stolen during a break-in. Information contained in these devices was not protected with data encryption in either case, but then again, "sensitive information" was not stored on any of the devices involved.
    While the lack of encryption seems reasonable at first glance, the truth is that a number of HIPAA / HITECH regulations were probably broken.  

    Eastern Maine Medical Center

    In the case of EMMC, the data breach was triggered when a third-party vendor's hard disk drive disappeared. Bangor Daily News reports that the "missing hard drive contains information on 660 of the patients who underwent cardiac ablation between Jan. 3, 2011 and Dec. 11, 2017."
    The missing drive was last seen on December 19. Reportedly, the storage device contained:
    Patients' names, dates of birth, dates of their care, medical record numbers, one-word descriptions of their medical condition and images of their ablation… [but NOT] Social Security numbers, addresses and financial information.
    On the face of it, it looks like the data breach could be classified by most people as "small potatoes."  

    Nevro Corporation

    Unlike EMMC, Nevro was responsible for its data breach. And yet, the company cannot be strongly faulted for the data mishap: it's not as if the laptops were in an unsafe location (like an employee's car). The laptops were at the company's headquarters, which one assumes was reasonably secure against break-ins.
    Per Nevro's breach notification letter, "nearby business were also targeted" and laptops were stolen from them as well, so chances are that Nevro had comparable security in place. (Either that or most businesses in the area decided to dispense with security, a dubious assumption).
    The company noted that all of the stolen laptops were password-protected "although not all were encrypted." Yet, the silver-lining is that "limited categories of information" were stored on these devices and that none of them "contained, sensitive identifying information such as Social Security or other government-issued identification numbers or credit card or financial institution information."
    The "limited information" pertains to names, addresses, and other similar information listed by EMMC. Indeed, Nevro seemingly implies that it's only notifying affected patients because
    applicable state law considers this type of information [limited information about your treatment relationship with Nevro] sufficient to warrant a notification.
    Again, most people would look at this as small potatoes (especially when you take into consideration what Equifax admitted to last September. That was definitely not small potatoes; heck, it went well beyond the tuber family).
    As pointed out in previous posts, such "not sensitive" information can still be used to carry out fraud and scams. Tech support scams, for example, are successful even though there is very little personal data involved. Can you imagine how much more convincing a phone scam would be if someone called a person about his or her cardiac ablation?
    That being said, there is a remote possibility of it happening. In contrast, the malicious use of SSNs and other information generally considered to be "sensitive" is more than possible. So, the lack of what most people would deem "sensitive personal information" should come as something of a relief to patients.  

    Could Still Be a HIPAA Breach

    It may not be, however, a relief for the two organizations. A cursory search on the internet seems to indicate that both fall under the purview of the Health Insurance Portability and Accountability Act (HIPAA). HIPAA has very strict definitions of what is and is not PHI (protected health information).
    As this link shows, names, physical addresses, telephone and fax numbers, email addresses, etc. are considered to be PHI if combined with certain information, such as what medical treatment one was receiving. So, technically, it looks like the two organizations have a full-blown medical data breach on their hands.
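    The identifier-combination rule described above can be sketched in a few lines of code. This is a simplified illustration only (the actual Privacy Rule enumerates 18 identifier categories, and the field names below are hypothetical), but it captures the point: "ordinary" identifiers become PHI once paired with health information.

```python
# Abridged list of HIPAA identifiers (the Privacy Rule lists 18 categories).
IDENTIFIERS = {"name", "address", "phone", "fax", "email",
               "medical_record_number", "date_of_birth"}

def is_phi(record: dict) -> bool:
    """True if the record pairs any identifier with health information
    (simplified sketch, not a compliance tool)."""
    has_identifier = any(field in IDENTIFIERS for field in record)
    has_health_info = "treatment" in record or "diagnosis" in record
    return has_identifier and has_health_info

# The EMMC drive held names plus ablation details -- PHI under this rule:
assert is_phi({"name": "J. Doe", "treatment": "cardiac ablation"})
# A bare mailing list with no health context would not be:
assert not is_phi({"name": "J. Doe", "address": "1 Main St"})
```

    In other words, neither EMMC's nor Nevro's lost data needs to contain SSNs to qualify: a name next to "cardiac ablation" is enough.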
    It goes without saying that the use of full disk encryption would have paid wonderful dividends in both cases because HIPAA provides safe harbor if data is encrypted when lost or stolen. That not being the case, what will be the fallout?
    HIPAA / HITECH data security compliance is administered and overseen by the Office for Civil Rights (OCR) of the Department of Health and Human Services. The OCR has not been shy in dispensing monetary penalties, sometimes in the millions of dollars.
    And, as befitting such large sums, it often takes years to reach a decision on how to deal with HIPAA covered-entities that have suffered a data breach.
     
    Related Articles and Sites:
    https://www.databreaches.net/records-of-pain-device-patients-on-stolen-nevro-laptops/
    https://www.databreaches.net/eastern-maine-medical-center-notifying-660-cardiac-ablation-patients-after-vendors-hard-drive-discovered-missing-or-stolen/
     
  • Coca-Cola Laptop Theft Lawsuit From 2014 Still Ongoing

    Over at bna.com, Bloomberg Law reminds us that there are a number of "legal battles over workplace cybersecurity being waged" in the USA. For example, ENSLIN v. THE COCA-COLA COMPANY ET AL, which has been ongoing since 2014.
    The breach was covered here and here previously, and the short version is: a Coca-Cola employee stole laptops that were meant to be disposed of, triggering the data breach. This resulted in a former Coca-Cola employee, Enslin, suing the beverage maker for failing to adequately protect employee information. In his complaint, Enslin says that he fell victim to identity theft and other crimes not long after receiving the breach notification letter.
    When the story originally hit the wires in 2014, there was a dearth of information. Three years and a bunch of court filings later, we have more to go on.  

    Quis Custodiet Ipsos Custodes? (i.e., Who will Police the Police?)

    As noted in a previous post, it would have been hard (possibly near impossible, depending on the circumstances) for Coca-Cola to prevent the theft of the laptops. The computers at the center of the breach – 55 of them, stolen from Coca-Cola over a period of six-and-a-half years – were meant to be disposed of… and the thief, another Coca-Cola employee, Thomas William Rogers III, was the person responsible for disposing of them.
    It was later reported that Coca-Cola only became aware of the situation when they received an anonymous tip (ajc.com):
    On November 17, 2013, the anonymous caller contacted Coke security and reported the company owned equipment was going to be moved at any moment due to a big fall out between the employee [Rogers] and his wife.
    Most of the computers were found in Rogers's home, but a number of them were given to acquaintances as well. The media reported that all the stolen machines were eventually recovered (but not necessarily, according to court documents).
    Sensitive personal information like Social Security numbers and driver's license numbers were found when the company performed forensic data analysis on the machines, triggering data breach notification laws.
    A number of months later, Enslin and his family found themselves mired in identity theft problems while on vacation. The problems gradually snowballed, with criminals using his information to obtain a job; purchasing thousands of dollars of merchandise; attempting changes of address to further scams; etc.  

    Contradictions and Errors

    Data breaches, no matter how straightforward, always contain an element of uncertainty in them and the Coca-Cola situation is no different.
    Initially, the media reported that 53 laptops were involved in the data breach. At some point, that was corrected to 55 laptops. The interval of the breach also increased, from 5 years to more than 6 years. Also, it's been reported that the stolen machines' prices ranged from less than $500 to $2500, leading one to ask whether it really was only decommissioned laptops that were stolen.
    Perhaps the confusion originates from Rogers himself, who,
    In a written statement to [Coca-Cola], Thomas Rogers stated he had "a couple of boxes full of laptops" but "didn't know how much equipment he had" because he had been accumulating it for five years [ajc.com]
    The implication, then, is that there could be more laptops out there, even if it was reported that all computers were accounted for. The company admitted as much to the courts:
    After it learned of the breach, Coca-Cola sought to recover its missing hardware, and while it has "a very good feeling" that it has been able to recover it all, it cannot say for sure. [gpo.gov]

    Could Have Had Better Security

    In light of the above, could Coca-Cola be accused of being lax in their responsibilities? It would be hard to argue otherwise.
    They could have easily prevented a data breach (not necessarily the physical act of the laptop theft itself) by employing disk encryption on any and all computers, be they laptops or desktops. Without the correct password, Rogers wouldn't have been able to access the machines, and so the personal, sensitive information would have been protected.
    Furthermore, the company could have designed their process for retiring computer equipment to include the deletion of the encryption key for each computer prior to giving it up for disposal. By doing so, the data would still be protected if the passwords were obtained by Rogers somehow.
    (And let's not forget that encryption protects companies from data breaches while the machines are being used in everyday life – break-ins and loss/misplacement have been prominent sources of breaches, too).
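    The decommissioning process described above is sometimes called a "crypto-erase": destroy the disk's encryption key and the ciphertext left on the drive becomes permanently unreadable, no matter who ends up with the hardware. The toy cipher below (a SHA-256 counter-mode keystream) is purely illustrative – real full disk encryption uses vetted ciphers such as AES-XTS – but it demonstrates the principle.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudo-random keystream from the key.
    Illustrative only -- not a real cipher."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, data: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

decrypt = encrypt  # XOR stream cipher: the same operation both ways

# Simulated decommissioning workflow:
disk_key = secrets.token_bytes(32)
record = b"SSN: 123-45-6789"            # fake data for illustration
on_disk = encrypt(disk_key, record)      # what the thief actually gets

assert decrypt(disk_key, on_disk) == record  # key present: readable
disk_key = None  # "crypto-erase": destroy the key before disposal
# Without the key, on_disk is just unrecoverable noise to the thief.
```

    The design point: the laptop never needs to be wiped or even powered on before disposal; deleting one small key is enough.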
    Also, it is very troubling that this went on for more than six years. The fact that a domestic disturbance is how the breach was uncovered... well, that's just not how well-thought-out security is supposed to work.
    There were some very obvious failures here.
    However, there is a difference between being lax in one's responsibilities in a moral or ethical sense and being so in a legal sense. The courts so far seem to be of the opinion that, legally, Coca-Cola was not in the wrong.
    Regardless of what the ultimate outcome may be, one thing appears to be pretty clear: properly securing the data would probably have been cheaper than defending against a lawsuit that's taking more than three years to resolve.
     
    Related Articles and Sites:
    https://www.lexology.com/library/detail.aspx?g=257d13b9-68c6-4c1b-ab82-ae6cb7b4eea7
    https://www.bna.com/cocacola-hack-whos-n73014474058/
    https://www.databreaches.net/the-coca-cola-breach-and-whos-on-hook-for-security-of-employee-data/
    http://www.businessinsurance.com/article/20151013/NEWS06/151019945/pennsylvania-federal-court-rules-for-coca-cola-identity-theft-victim
    http://www.ajc.com/news/breaking-news/coca-cola-employee-accused-stealing-laptops/fCwIbS552KtbByqq6qBiVM/
    https://www.gpo.gov/fdsys/pkg/USCOURTS-paed-2_14-cv-06476/pdf/USCOURTS-paed-2_14-cv-06476-2.pdf
    https://www.gpo.gov/fdsys/pkg/USCOURTS-paed-2_14-cv-06476/content-detail.html
     
  • Penn Medicine Sending Breach Notifications To 1000 Patients Over Stolen Laptop

    Penn Medicine has revealed this past week that a laptop computer with protected health information (PHI) was stolen on November 30. While the details are meager (aside from a short entry at philly.com, which is referenced by databreaches.net, an online search comes up empty), the following was revealed:
    • About 1000 people were affected.
    • The laptop was stolen from a car at a parking lot.
    • Breached information includes "patient names, birth dates, medical records, account numbers, and some other demographic and medical information."
    • It does not include Social Security numbers, credit card or bank account information, patient addresses, or telephone numbers.
    Penn Medicine promised to review procedures to ensure that patient information is protected on portable devices.  

    What is This, 2009?

    In an age when breaches can – and do – involve tens of millions of people, Penn Medicine's data breach almost feels quaint. And, yet, that's why it's also not so easy to forgive.
    Consider servers with massive amounts of data that are hooked up to the web, and thus, "hackable" by anyone with a decent internet connection, in both theory and practice. Indeed, a small group of network and security professionals are exploring the build-out of a separate, "better" (better secured?) internet, seeing how our current global communications web will be forever playing security catch-up to the bad guys.
    So, even if millions of people are affected by a breach, it's "understandable": it shouldn't be happening, and we feel outraged when it happens, and lawsuits are going to be filed left and right, but we get it: there's very little that can be done unless we redesign everything.
    But when it comes to an individual laptop computer, there is a proven method of ensuring that its contents are not breached as a result of a burglary. It's the same method that led to the Apple vs. FBI face-off two years ago: full disk encryption. It's a very well-established technology that's been around forever.
    Indeed, most hospitals, clinics, and medical practices routinely use full disk encryption to protect not only their laptops but also their desktop computers, which have proven less than immune to theft. And, larger organizations have been more aggressive and thorough than smaller concerns, in no small part due to lawsuits brought by the federal government.
    For example, BlueCross BlueShield of Tennessee knows that they should encrypt any hard drives that are used to store phone call recordings, an insight they obtained after being embroiled in what was one of the largest data breaches in history at the time.
    This lesson was learned in 2009.
    So, when one reads, in 2018, that one of the bigger hospitals in the US is looking to review their procedures to ensure that patient information is protected on portable devices... it sounds tone-deaf. Technically, as a HIPAA covered-entity, they should be doing this periodically or whenever security conditions change.  

    Related Articles and Sites:
    http://www.philly.com/philly/health/penn-medicine-patient-information-stolen-identity-theft-hipaa-20180102.html
    https://www.databreaches.net/penn-medicine-computer-with-patient-info-stolen/
     
  • 24,000 Affected After UNC Health Care Desktop Computer Stolen

    We're on the cusp of 2018, yet data breaches that smell like 2008 are still making an appearance. According to various news outlets, UNC Health Care has announced a data breach that involved approximately 24,000 patients when a computer – a desktop computer – was stolen during a break-in.
    The breached data:
    …includes names, addresses, phone numbers, employment status, employer names, birthdates and Social Security numbers, said UNC Health Care, adding that it does not believe any treatment, diagnosis or prescription records were kept on the computer other than diagnosis codes used for billing. [bizjournals.com]
    That last part may be somewhat comforting, but SSNs, names, addresses, and birthdates… that information can be easily used for fraud, as pretty much everyone knows.

    Acquisition Headaches?

    It's hard to believe that an institution the size of UNC Health Care can still be embroiled in a data breach that involves an unencrypted desktop computer. It's been years since HIPAA regulators showed that they mean business when it comes to data breaches involving protected health information (PHI), via the issuance of fines and other penalties.
    As a result, many HIPAA covered entities have gone a long way towards ensuring that they've at least fulfilled the minimum security requirements, which generally involves the use of full disk encryption for computers and laptops. Had the computer in question been encrypted – which it's safe to assume it wasn't, per the media coverage surrounding it – it would have been a non-event; tantamount to losing, say, a chair.
    On the other hand, when you see that UNC Health Care is a network of hospitals, and realize that such fragmentation brings its own challenges when securing data, perhaps it's not so surprising.
    And yet, safeguarding PHI, even in such situations, is not impossible. With the proliferation of wireless and mobile internet, logistical nightmares of years past are far from insurmountable. Deploying and installing disk encryption on endpoints, even those that never come in from the field, can be done quite easily.
    But, there's a twist here. Apparently, the building from which the computer was stolen was a relatively new acquisition, which tends to bring its own set of problems:
    A break-in at the UNC Dermatology & Skin Cancer Center in Burlington resulted in the theft of a computer …. The center – formerly known as Burlington Dermatology Center or Burlington Dermatology – is located on Vaughn Road and was acquired by UNC Health Care in 2015. [chapelboro.com]
    For a lot of people, that last figure, 2015, would likely prevent them from giving UNC Health Care the benefit of the doubt on whether they were negligent regarding PHI security. Even if the acquisition had taken place in December of 2015, they had nearly two years to do something regarding the security of digital data. It's especially egregious when you consider that:
    UNC Health Care … ensured that all remaining computers acquired from, or kept for use by Burlington Dermatology have been properly secured. UNC Health Care has also implemented process improvements to ensure that future acquisitions of physician practices include a process to properly secure legacy computers and electronic patient information. [wfmynews2.com]
    The break-in occurred on October 8. The above statement was present in wfmynews2.com's article dated December 8. They managed to secure in two months what they did not in two years?

    Granted, it looks like they missed the boat because they had not set a process "to ensure that future acquisitions…include a process to properly secure legacy computers"… but why didn't they? Based on their patchwork of hospitals, it feels like Burlington is not their first acquisition. So, one imagines that they should have had something per HIPAA's Administrative Safeguards, where

    "…a covered entity must identify and analyze potential risks to e-PHI, and it must implement security measures that reduce risks and vulnerabilities to a reasonable and appropriate level."
    And if not, then there is the HIPAA Physical Safeguards, where
    "…a covered entity must implement policies and procedures to specify proper use of and access to workstations and electronic media. A covered entity also must have in place policies and procedures regarding the transfer, removal, disposal, and re-use of electronic media, to ensure appropriate protection of electronic protected health information (e-PHI)."
    And if not, then there is the HIPAA Technical Safeguards, where
    "A covered entity must implement hardware, software, and/or procedural mechanisms to record and examine access and other activity in information systems that contain or use e-PHI."
    (Per the government's HIPAA site).  
     
    Related Articles and Sites:
    http://www.newsobserver.com/news/business/article188757969.html
    https://chapelboro.com/news/crime/unc-health-care-notifying-patients-potential-privacy-breach
    https://www.bizjournals.com/triad/news/2017/12/08/unc-health-care-computer-stolen-from-triad.html
    https://www.databreaches.net/24000-unc-health-care-patients-affected-by-potential-security-breach/
     
  • Uber Being Investigated For 2016 Data Breach

    Uber, the ride-sharing Silicon Valley unicorn, is… still in the news. They say that all publicity is good publicity – even the bad kind – but Uber is really taking that saying to its limits, it seems.

    This week, it was revealed that the company had been hiding a massive data breach that occurred over a year ago. The breach involved personal information including names, email addresses, and phone numbers of 57 million customers worldwide. In addition, drivers' names and license numbers were illegally accessed as well (7 million in total; 600,000 drivers in the US alone). According to bloomberg.com,

    Here’s how the hack went down: Two attackers accessed a private GitHub coding site used by Uber software engineers and then used login credentials they obtained there to access data stored on an Amazon Web Services account that handled computing tasks for the company. From there, the hackers discovered an archive of rider and driver information. Later, they emailed Uber asking for money, according to the company.
    Unsurprisingly, many states – including Illinois, Massachusetts, Missouri, New York, Connecticut, and Washington – have announced an investigation into the matter. Data security regulators in other countries have done the same.  

    A Checkered Past

    It was just this past August that Uber agreed to a settlement with the FTC, closing a probe into how Uber misled customers regarding its privacy practices: the company allowed employees to access riders' personal information, including the details of trips, via a tool called "God View." The problem was described by some as a "lapse" in the ride-hailing company's security practices.

    In addition, the company had to deal with a data breach (smaller than the one being discussed here). The FTC looked into the issue and concluded, per recode.net:

    For years, Uber stressed it had taken great steps to protect its driver and rider data — all stored using Amazon’s cloud service. Until 2015, however, some of that information was saved as "clear, readable text, including in database back-ups and database prune files, rather than encrypting the information," the FTC said.

    The end result? Uber agreed to 20 years of oversight, the implementation of a comprehensive privacy policy, etc. The usual stuff that Big Tech companies agree to. An "onerous" slap on the wrist.

    (However, as recode.net points out, the settlement hasn't been finalized. The FTC must vote on it, and some lawmakers had urged the FTC to increase the penalties, perhaps even open a new investigation based on what the probe had revealed. This was before the latest revelation).  

    Hackers Bad. Lawyers Even Worse?

    When Bloomberg broke the news about Uber's latest transgression, two people were fired, including Uber's Chief Security Officer, Joe Sullivan. When approached by the hackers, Sullivan and Craig Clark, a lawyer with the company, made the decision to pay the attackers $100,000 to delete the data and to stay quiet about the incident.

    While none of that is illegal – paying off the hackers, asking them to be quiet, the hackers actually keeping quiet, and the hackers deleting the data they had acquired – what Uber did afterwards is.

    The US has 48 separate data breach notification laws. Most of them are similar. For example, most have a specific definition of what "private data" is and is not, and generally require a notification to be sent within 60 calendar days of discovering the breach. Most also provide safe harbor from notifying clients after a data breach if the data was encrypted.
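    The common pattern across these laws can be sketched as a simple decision function. This is an illustration of the general shape only, not legal guidance – specifics (deadlines, definitions, safe harbor conditions) vary by state, and the discovery date below is hypothetical.

```python
from datetime import date, timedelta
from typing import Optional, Tuple

def notification_required(discovered: date,
                          data_encrypted: bool) -> Tuple[bool, Optional[date]]:
    """Sketch of the typical US state breach-notification pattern.
    Returns (must_notify, notification_deadline)."""
    if data_encrypted:
        return False, None  # encryption safe harbor: no notification duty
    # Otherwise, notify within 60 calendar days of discovering the breach.
    return True, discovered + timedelta(days=60)

# Hypothetical discovery date; Uber's stolen data was not encrypted:
must_notify, deadline = notification_required(date(2016, 11, 14),
                                              data_encrypted=False)
assert must_notify and deadline == date(2017, 1, 13)
```

    Two things fall out of even this crude sketch: encrypting the data would have removed the notification duty entirely, and sitting on the breach for over a year blows past any 60-day clock in every state that starts one.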

    Unfortunately, not all states offer the same protection, meaning that if your business is big enough, you're going to have to come clean anyhow: while people may be willing to believe that a Brooklyn-located mom-and-pop store's data breach affected New York residents only, it'd be very unusual that only New York residents were affected by an Uber hack. So, it makes no sense to announce a data breach in New York only (assuming New York does not provide encryption safe harbor) because people excel at adding two and two together.

    In addition, the European Union has very extensive privacy safeguards in place, and data breach notifications, at least to regulators, are de rigueur. So, again, if your business is big enough that it traverses your home country's national borders, then you're going to have to fess up. Because people also excel at adding deux and deux together.

    When Sullivan and Clark decided to conceal what had happened, they broke… the same law, essentially, oh so many times. The fact that lawyers decided to take this approach (Sullivan, the unseated CSO, was a federal prosecutor earlier in his life) is surprising à la Schrödinger's meow – that is, knowing what we do about Uber, surprising and unsurprising at the same time.

    Things appear to be changing now that someone new is at the helm; otherwise, we may never have learned of the breach. And yet it feels as if the corporate miasma will take a while to disperse (from thenewstribune.com):

    In a letter to Washington Attorney General Bob Ferguson's office last week, an Uber attorney wrote that the company "now thinks it was wrong not to provide notice to affected users at the time" [of the 2016 Uber data breach].

    Really? Now they think it was wrong?

     

    Related Articles and Sites:
    https://www.bloomberg.com/news/articles/2017-11-21/uber-concealed-cyberattack-that-exposed-57-million-people-s-data
    http://searchsecurity.techtarget.com/podcast/Risk-Repeat-Uber-data-breach-has-implications-for-infosec
    http://www.thenewstribune.com/news/local/article187221548.html
    https://www.recode.net/2017/11/22/16690556/uber-data-hack-57-million-state-investigation

     