

AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size that want a scalable, easy-to-deploy solution. Centrally managed through a web-based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, and USB drive and hard disk encryption managed services.
  • FBI Director Says Legislation Possibly A Way Into Encrypted Devices

    Last week, FBI Director Christopher Wray said that legislation may be one option for tackling the problem of "criminals going dark," a term that refers to law enforcement's inability to access suspects' data on encrypted devices. The implication is that, in the interest of justice and national security, the FBI will press for a law that will guarantee "exceptional access" to encrypted information. This most likely will require an encryption backdoor to be built on all smartphones, possibly on all digital devices that store data.
    It should be noted that the FBI emphatically denies that they want an encryption backdoor. One hopes they have taken this position because they're aware of the security problems backdoors represent; however, it's hard to ignore the possibility that the FBI is in spin-doctor mode. Their Remote Operations Unit, charged with hacking into phones and computers of suspects, uses terms like "remote access searches" or "network investigative techniques" for what everyone else would call "hacking" and "planting malware." Mind you, their actions are legally sanctioned, so why use euphemisms if not to mask what they're doing?
    If turning to legislation smells of déjà vu to old-timers, it's because this circus has been in town before. It set up its tent about 20 years ago and skipped town a couple of years later. And while many things have changed in that time, the fundamental reasons why you don't want encryption backdoors have not.  

    A Classic Example of Why You Don't Want a Backdoor

    The FBI has implied time and again that they are in talks with a number of security experts who supposedly claim the ability to build "encryption with a backdoor" that cannot be abused by the wrong people. These security experts are not, the FBI notes, charlatans. Perhaps it is because of these experts that the FBI has not desisted from pursuing backdoors, despite the security community's overwhelming consensus that it cannot be done.
    It should be noted that Wray was asked by a Senator at the beginning of this year to provide a list of cryptographers that the FBI had consulted in pushing forth their agenda. To date, such a list has not been produced.
    (As an aside, Ray Ozzie, arguably one of today's greatest minds in computing, has recently and independently proposed a way to securely install a backdoor without compromising the security of encryption. One of the world's leading security experts has weighed in on it: the conclusion, in a nutshell, is that it's flawed and mimics unsuccessful solutions proposed in the past.)
    What is it about backdoors that their mere mention results in knee-jerk reactions from the security community? The answer lies in the fact that the community has been looking into this for a long, long time. In the end, it's the unknown unknowns that are the problem: encryption solutions run into surprises (bad ones) all the time. No matter how well-designed a system is, it's impossible to prevent such surprises from happening.
    In June 2017, it was reported that over 700 million iPhones were in use. Not sold; in use. It can be assumed that at least an equal number of Android devices are in use as well. That would be a lot of compromised devices if a backdoor were in effect and a bug were introduced.
    These issues cannot be legislated away. Furthermore, bugs merely represent one situation where a backdoor can lead to disaster. Others include the deliberate release of how to access the backdoor (think Snowden or Manning or the leak of CIA hacking tools); the phishing, scamming, conning, or blackmailing of the custodians of the backdoor; and the possibility of stumbling across the backdoor. Granted, the last one is highly unlikely, even more so than the others…but so are the chances of winning the lottery, and there have been hundreds, maybe thousands, of winners across the world.
    The point is that the chances of the backdoor being compromised are higher than one would expect.  

    Moral Hazard = FBI's Pursuit of the Impossible?

    One has to wonder why the FBI is so insistent on pursuing the impossible dream of an encryption backdoor that doesn't compromise on security. It would be easy to dismiss it as a case of legal eggheads not knowing math and science, or not having the imagination to ponder how badly things could go wrong.
    But perhaps it's an issue of moral hazard. Basically, there is very little downside for the FBI if a backdoor is implemented. Everyone knows that, if the FBI gets what it wants, they won't have direct access to the backdoor; it wouldn't be politically feasible. For example, prior to suing Apple in 2016, they suggested that Apple develop a backdoor and guard access to it. When the FBI presents an iPhone and a warrant, Apple unlocks the device. The FBI is nowhere near the backdoor; they're by the water-cooler in the lobby.
    The arrangement sounds reasonable until you realize that the FBI doesn't take responsibility for anything while reaping the benefits. The FBI does not have to develop, test, and implement the backdoor. Once implemented, it doesn't have to secure and monitor it. If there is a flaw in the backdoor's design, the FBI dodges direct criticism: they didn't design it, don't control it, etc. Last but not least, the onus is on the tech companies to resist foreign governments' insistence on being given access to encrypted data. Which you know will happen because they know the capability is there.
    It's a classic case of heads, I win; tails, I don't lose much.
  • Most of the Used Memory Cards Bought Online Are Not Properly Wiped

    According to tests carried out by researchers at the University of Hertfordshire (UK), nearly two-thirds of memory cards bought used from eBay, offline auctions, and second-hand shops were improperly wiped. That is, the researchers were able to access images or footage that were once saved to these electronic storage units… even if they were deleted.


    Free and Easy to Use Software

    Popular media would have you believe that extracting such information requires advanced degrees in computers as well as specialized knowledge and equipment. These would certainly help; however, the truth is that an elementary school student would be able to do the same. The researchers used "freely available software" (that is, programs downloadable from the internet) to "see if they could recover any data," and operating such software is a matter of pointing and clicking.
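    The point-and-click tools described above generally work by "file carving": scanning the raw bytes of the storage medium for known file signatures, ignoring the file system entirely. As a hypothetical illustration (this is not any specific tool's code), a few lines of Python can pull JPEG photos out of a raw card image by their start and end markers:

```python
# Toy file carver: finds JPEG images in a raw disk/card image by signature.
# Real recovery tools use the same basic idea, far more robustly.

def carve_jpegs(raw: bytes) -> list[bytes]:
    """Return every span between a JPEG start-of-image (FF D8 FF)
    and end-of-image (FF D9) marker."""
    images = []
    pos = 0
    while True:
        start = raw.find(b"\xff\xd8\xff", pos)   # JPEG start-of-image marker
        if start == -1:
            break
        end = raw.find(b"\xff\xd9", start + 3)   # JPEG end-of-image marker
        if end == -1:
            break
        images.append(raw[start:end + 2])
        pos = end + 2
    return images

# A "deleted" photo is still present in the raw bytes until it is overwritten:
card_image = b"\x00" * 512 + b"\xff\xd8\xff\xe0<photo data>\xff\xd9" + b"\x00" * 512
recovered = carve_jpegs(card_image)
print(len(recovered))  # 1 -- the photo is trivially recoverable
```

    This is why formatting or deleting files is not enough: neither operation touches the file's bytes, so the signatures remain in place for anyone to find.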
    In this particular case, the recovered data included "intimate photos, selfies, passport copies, contact lists, navigation files, pornography, resumes, browsing history, identification numbers, and other personal documents." Of the one hundred memory cards collected:
    • 36 were not wiped at all; neither the original owner nor the seller took any steps to remove the data.
    • 29 appeared to have been formatted, but data could still be recovered "with minimal effort."
    • 2 had their data deleted, but it was easily recoverable.
    • 25 appeared to have been properly wiped using a data-erasing tool that overwrites the storage area, so nothing could be recovered.
    • 4 could not be accessed (read: were broken).
    • 4 had no data present, but the reason could not be determined.
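    The "nearly two-thirds" headline figure follows directly from this breakdown; a quick tally (counts taken from the list above):

```python
# Breakdown of the 100 memory cards, per the University of Hertfordshire study
cards = {
    "not wiped at all": 36,
    "formatted, data recoverable": 29,
    "deleted, data recoverable": 2,
    "properly wiped": 25,
    "broken/inaccessible": 4,
    "empty, reason unknown": 4,
}

assert sum(cards.values()) == 100  # the categories cover all 100 cards

recoverable = (cards["not wiped at all"]
               + cards["formatted, data recoverable"]
               + cards["deleted, data recoverable"])
print(f"{recoverable}% improperly wiped")  # 67% improperly wiped
```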

    Deleting, Erasing, Wiping… Not the Same

    Thankfully, it appears that most people are not being blasé about their data. They do make an effort to delete the files before putting up their memory cards for sale. The problem is, deleting files doesn't actually delete files. (This terminology morass is the doing of computer software designers. Why label an action as "Delete file" when it doesn't actually do that?)

    The proper way to wipe data on any digital data medium is to overwrite it. For example, if you have a hard drive filled with selfies, you can "delete" all of it by saving to the disk as many cat pictures as you can find on the internet (after having moved all of the selfies to the trash/recycle bin on your desktop). This is analogous to painting over a canvas that already has a picture on it, although the analogy breaks down somewhat if one delves into technical minutiae.
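    The overwrite-then-delete approach can be sketched in a few lines of Python. This is a single random-data pass for illustration only; dedicated erasure tools also contend with flash wear-leveling, journaling file systems, and other minutiae this toy version ignores:

```python
import os

def wipe_file(path: str) -> None:
    """Overwrite a file's contents with random bytes, then delete it.

    Single-pass sketch for illustration; real wiping tools handle
    wear-leveling, slack space, and journaling that this does not.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(os.urandom(size))   # paint over the old contents
        f.flush()
        os.fsync(f.fileno())        # push the overwrite to the medium
    os.remove(path)                 # deleting now leaves only random bytes

# Usage: create a throwaway file, wipe it, confirm it is gone
with open("selfie.jpg", "wb") as f:
    f.write(b"embarrassing pixels")
wipe_file("selfie.jpg")
print(os.path.exists("selfie.jpg"))  # False
```

    The key difference from a plain delete: even if a recovery tool later scans the raw disk, the sectors that held the file now contain random noise rather than the original data.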

    Incidentally, this is why encryption can be used to "wipe" your drive: encryption stores data in a scrambled, randomized state, and the data is only descrambled on the fly when the encryption key is provided. Without the key, the scrambled state is all there is. So, if you end up selling a memory card with encrypted data but without the encryption key, it's tantamount to offering for sale a memory card that's been properly wiped.
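    This "crypto-erase" idea can be demonstrated with a toy cipher. The example below uses a SHA-256-derived keystream purely for illustration (real drives use AES; do not use this as a cipher): once the key is discarded, the stored bytes are as useless as a properly wiped card.

```python
import hashlib

def toy_encrypt(key: bytes, data: bytes) -> bytes:
    """XOR data with a SHA-256-derived keystream. Demo only -- not a real cipher.
    Applying it twice with the same key restores the original data."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key = b"the encryption key"
stored = toy_encrypt(key, b"intimate photos and passport copies")

# With the key, the data descrambles; without it, only gibberish remains.
print(toy_encrypt(key, stored))  # b'intimate photos and passport copies'
print(stored[:8])                # unreadable bytes -- "wiped", in effect
```

    Destroying the key is therefore equivalent to destroying the data, which is why some drives implement "secure erase" simply by discarding their internal encryption key.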


    More of the Same

    This is not the first time an investigation has been conducted into data found on second-hand digital storage devices. As the article notes, similar research was conducted in the past:
    A study conducted in 2010 revealed that 50% of the second-hand mobile phones sold on eBay contained data from previous owners. A 2012 report from the UK's Information Commissioner's Office (ICO) revealed that one in ten second-hand hard drives still contained data from previous owners. A similar study from 2015 found that three-quarters of used hard drives contained data from previous owners.
    And these are but a small sample of the overall number of similar inquiries over the years. The world has seen more than its fair share of privacy snafus, be it a data breach or otherwise. Despite increased awareness of data security and its importance, the fact that we're still treading water when it comes to securing data on our own devices could signify many things:
    • People don't really care, even if they say they do.
      • No surprises there.
    • We are too focused on spotlighting the problem while failing to highlight the solution.
      • News anchor: "Yadda yadda yadda…This is how they hacked your data. Be safe out there. And now, the weather." Be safe how? What do I do to be safe?
    • People interested in preserving their privacy do not sell their data storage devices; hence, studies like the above are statistically biased to begin with.
      • Essentially, researchers are testing the inclinations of people who don't really care about privacy or don't care enough to really look into it (a quick search on the internet will show you how to properly wipe your data).
    • Devices sold were stolen or lost to begin with, so the sellers do not have any incentive to properly wipe data.

    Whatever the reasons may be for the continued presence of personal data on memory storage devices, regardless of how much more aware we are of privacy issues, one thing's for certain: It's not going away.



  • SCOTUS Says Cops Need Warrant For Location Data From Network Providers

    It's hardly a secret that the proliferation of digital devices has opened up opportunities and headaches for law enforcement. In the former camp, modern communication devices are de facto tracking devices that store and generate copious amounts of data; access to it could easily make or break a case. In the latter, the use of encryption and other security measures makes it challenging, if not impossible, to access that same data. And now the courts are making it even more onerous to obtain it.  

    Get a Warrant

    According to a recent ruling, law enforcement will "generally" require a warrant to obtain a person's location data from network providers. Before this ruling, the Third Party Doctrine held that a person gives up their expectation of privacy when they share information with a third party (like banks, ISPs, phone companies, etc.). Hence, law enforcement could request data from these third parties without a warrant; they only had to show that the information they were seeking could be pertinent to an investigation. For example, police could ask a bank to see a suspect's credit card transactions, since these would pinpoint a person's location at a particular time. In fact, they can still do this going forward.
    However, the Supreme Court has decided otherwise when it comes to your location as pinpointed by your cellphone, more specifically, the cellphone location data created and collected by telcos. It is hardly a surprising judgment. For example, bugging a person's vehicle with a tracker requires a court order because continuous tracking is considered a violation of privacy expectations.
    Of course, there is a difference between bugging a car and using a cell phone: be the phone dumb or smart, it's not the government doing the bugging – you're bugging yourself and paying a company every month to do so. The government could argue (and probably has) that they're merely picking up a trail that you've agreed to broadcast. It would be no different from you tossing evidence left and right as you flee from a crime scene, creating a trail to yourself as you make your getaway. There's nothing illegal in law enforcement following that trail. Indeed, they'd be remiss in not doing so.
    The thing is, though, that life for many now revolves around access to the services that telcos offer. Well over half the population is using either a dumb or smart phone, and these devices need to know your location. Otherwise, you wouldn't be able to receive calls or texts. This is also the case for accessing mobile internet.
    Furthermore, these devices are very rarely turned off, for obvious reasons. So, the data that's collected by telcos and shared with law enforcement would include information that traditionally requires a warrant anyway. The warrant requirement for bugging a vehicle was already mentioned. Even more sacrosanct is the privacy of a person in one's home, where law enforcement's incursion nearly always requires a warrant. Even pointing a thermal imaging device at a person's home without court approval is illegal, even though doing so technically does not involve "entering" the home but does involve obtaining evidence from it.
    So it seems an "information dump" of telco data over an extensive period would already be at loggerheads with such legal restrictions.
    However, the judges ruled, five to four, that a warrant is necessary when accessing telco location data because the data allows the type of surveillance from which the Fourth Amendment protects US citizens. Remember, the Fourth exists because the British authorities would look for evidence of wrongdoing whenever they felt like it, wherever and against whomever they pleased.  

    The Fourth Amendment

    The US Constitution provides protections from that sort of thing. Certainly, you could have committed a crime. And, certainly, evidence of said crime could be in your home. But law enforcement can only enter your home and search for that evidence, and only that evidence, with probable cause, which is the grounds for issuing a warrant.
    Consider, then, the aspects of the information that law enforcement claims it should be able to access without a warrant:
    • Location data is very accurate. Not as accurate as GPS but close enough – and the technology will only get better to the point that it will be just as good as GPS or better.
    • This data now covers a sizable number of the entire US population, seeing how Americans of all stripes and colors carry a cellphone.
    • The collected data is excessive. A person's location can be pinged by cell towers multiple times every minute. One can literally tell where a person is every minute of the day.
    • The data is retroactive. The location data is stored by telcos for up to five years. Change the law, and it could be ten years. Perhaps even longer if DNA storage finally happens. ('Cause, let's face it, the only reason why telcos don't want to keep this data long term is tied to storage costs).

    So, we're talking about data that's akin to what would be generated if you implanted a tracking chip in most Americans and let them go about their lives. And because the government didn't force anyone to do this, and third parties are involved, a warrant shouldn't be necessary when trying to get a hold of this data. This, in a nutshell, was the government's (very dystopian) argument.

    The courts (barely) disagreed.

    Still, this ruling follows a number of others in recent years where the courts have upheld privacy interests over law enforcement's. It seems that slowly, but surely, as the effects and impact of technology begin to leave an imprint upon all – as opposed to just the young or the hip or the tech-savvy – people are beginning to have a better understanding of what's at stake.  



  • Yahoo Penalized £250,000 By UK Information Commissioner's Office

    It was reported this week that the United Kingdom's Information Commissioner – the person whose office is in charge of upholding the nation's data privacy laws – has fined Yahoo! UK Services Limited £250,000.
    The penalty is in response to the global data breach Yahoo experienced, and hid, for over two years. Approximately 500,000 accounts in the UK were affected.
    Knowing what we do of the Yahoo breach, and keeping in mind that the ICO can issue a monetary penalty of up to £500,000, it sounds like a woefully inadequate amount. For comparison, the US's SEC, the Securities and Exchange Commission, fined Yahoo $35 million, roughly 100 times the ICO's penalty.  

    Data Breach Not the Issue?

    Apparently, Yahoo UK was not fined for the data breach itself. What the ICO views as problematic is the long delay in notifying people of the data breach (two years!).
    Which is crazy if it's true.
    There was no "delay." Yahoo didn't merely fail to alert users of the data breach "in a timely manner." The company, for all intents and purposes, appears to have actively hidden the data breach – which is the real scandal; data breaches involving hundreds of millions of people are not a rarity anymore, and neither is going public at the speed of molasses – and not alerting affected users is a key component of such a cover-up. To fine Yahoo UK merely for taking longer than usual to notify people of a data breach would be bonkers.
    Thankfully, it seems that the ICO took more than the so-called delay into account:
    • Yahoo! UK Services Ltd failed to take appropriate technical and organisational (sic) measures to protect the data of 515,121 customers against exfiltration by unauthorized persons;
    • Yahoo! UK Services Ltd failed to take appropriate measures to ensure that its data processor – Yahoo! Inc – complied with the appropriate data protection standards;
    • Yahoo! UK Services Ltd failed to ensure appropriate monitoring was in place to protect the credentials of Yahoo! employees with access to Yahoo! customer data;
    • The inadequacies found had been in place for a long period of time without being discovered or addressed.
    Still, the explanation doesn't quite make sense. In the past, the ICO has issued penalties as high as £400,000 for data breaches, as well as for other violations of the Data Protection Act. Considering only instances involving data breaches, none of the other companies swept their incidents under the rug. They were accused of being technically negligent (same as Yahoo); of having the financial, technical, and other means to ensure better data security (same as Yahoo); of not being aware that they were hacked when they could easily have figured it out (same as Yahoo); etc. In most cases, if not all, fewer people were affected than in the Yahoo breach.
    So why is Yahoo UK's penalty so much lower? Especially considering that the other companies do not have the dubious reputation of actively hiding the fact that they were hacked? If anything, you would think Yahoo UK's penalty would have hit a new high in the history of ICO monetary penalties to date.
  • FBI Inflated Encrypted Smartphone Count

    Over a number of years, the FBI kept making the case for an encryption backdoor to smartphones. Of course, because "encryption backdoor" is a charged term, they said that they didn't need a backdoor per se, just a (secret) reliable way to get into encrypted devices when they obtained a warrant.
    This twisting of words is risible because "a reliable way to get into encrypted devices" is pretty much the definition of a backdoor. Even the passwords set by smartphone owners are not reliable in the way the FBI wants, since people are prone to forgetting passwords: what if you went on a digital detox for a month and actually did forget it? What if you changed it while drunk? What if you had a concussion? So, if you're looking for a method that will work 100% of the time, well… it's got to be a backdoor.
    As part of their case for not-backdoors, the FBI quoted the number of inaccessible devices at the center of unsolved crimes. In January 2018, FBI Director Christopher Wray emphasized in a speech at a cyber-security conference that nearly 7,800 devices could not be accessed in 2017.
    Last week, the Washington Post reported that the figure was inflated, which the FBI confirmed. The actual number of inaccessible devices has not been released as of yet, but it's believed to be between 1,000 and 2,000, a range more in line with the 2016 figure of 880 encrypted devices.
    Why the sudden decrease? The FBI says they made an error when compiling their data, a result of keeping it in three separate databases instead of one central one.  

    Credibility Issues

    The FBI has credibility issues. In areas other than encryption, it could be because they're victims of concerted political smear campaigns. Who knows, really. But when it comes to encryption, the Bureau keeps painting itself into a corner.
    This month, it was the revelation of overinflated figures.
    In 2016, the FBI took Apple to court, arguing that they had exhausted all avenues for accessing a terrorist's encrypted iPhone. Towards the end of the legal battle, most experts were leaning towards the opinion that the FBI would lose. Coincidentally, or not, the Bureau dropped their lawsuit at the eleventh hour, saying that they had found a third party that could crack open the phone's contents for them.
    Later that same year, the Office of the Inspector General reported that an internal miscommunication led the FBI to conclude that they had tried everything to crack the iPhone's encryption… but they hadn't. (So, technically somehow, the FBI wasn't lying when they said they had).
    And earlier this year, a second company announced discovering ways around iPhone encryption and began selling these techniques to law enforcement. At relatively affordable prices, one might add. So. Over the last couple of years, the FBI has essentially:
    • Misled the public and Congress, probably not on purpose;
    • Tried to force a company to redesign a key component of their profit driver under the auspices of national security, as if we were living in a Soviet-era communist nation, despite the fact that said company hadn't done anything illegal (because, otherwise, why'd they drop the case? They should have continued even if they eventually found a way into the iPhone);
    • Passive-aggressively insinuated that the entire tech community is a group that encourages and enables criminals, evidenced by its unwillingness (and not mathematical impossibility) to create an encryption backdoor that's not a backdoor, because, you know, that's not what the FBI wants. This, despite the NSA and the CIA issuing declarations that backdoors and other forms of intentionally crippling security are a bad idea.
    The above, of course, does not cover scandals involving the FBI that are not tied to encryption. It's becoming very hard not to view the FBI's actions through a cynical lens.  

    Future Tools

    One has to admit that the problem of "going dark" is real. While it's anyone's guess how big a problem it currently is, it undoubtedly will grow bigger as time goes by. A solution may present itself in quantum computers.
    IBM warned earlier this year that advances in quantum computing could mean that today's ubiquitous encryption can be easily broken in five years' time. The cost of quantum computers could ensure that only governments and large organizations can afford them for the foreseeable future – just as only they can afford supercomputers – satisfying the goal of not hamstringing cryptography while only allowing "the good guys" to break encryption when needed (and authorized).  
  • US Court Says Border Searches Require "Suspicion"

    As many travellers may know, people at US borders are subject to an altered set of laws due to the fact that… well, a border is a border. This includes "pseudo-borders" like airports that may be located well within US soil. The most obvious alteration is the seeming suspension of the Fourth Amendment, the constitutional provision that protects against unreasonable search and seizure.
    Last week, the media was slightly abuzz over a decision by the Fourth Circuit Court of Appeals in Virginia. The appeals court, many news sites mentioned, confirmed that US border authorities cannot search a traveller's cellphone contents without a warrant. Other sites, more law-oriented than general news sites, correctly noted that the court held that phones cannot be forensically searched without a reason (also referred to as cause or suspicion).
    The decision appears to be more nuanced than that, actually. At the border, the government is not bound to the same level of Fourth Amendment oversight as elsewhere in the US – which is why they're already able to go through your luggage for absolutely no reason whatsoever.
    Not only that, they're also able to go through your laptops, USB flash drives (the government has dogs that can sniff out electronics in your luggage), and your smartphones. And when it comes to the last one, they can poke, swipe, and press through it or even do a forensic analysis – all of it without a warrant.  

    What's a Manual Search and a Forensic Search?

    The difference between a manual and forensic search of your smartphone (or your laptop) is in whether a device for rummaging through your digital files was doing the searching. If a Customs and Border Patrol (CBP) agent looks through a phone's contents, not unlike what an average guy would do if he found a smartphone just lying on the street, that's a manual search.
    A forensic search usually requires the use of a separate computer or similar device to analyze your smartphone's files: it might go through all of your pictures, videos, emails, texts, apps, GPS data, etc. and analyze file names, file sizes, the existence of hidden files, possibly run facial recognition similar to what Facebook uses for tagging photographs, etc.
    (There is also, apparently, something that is considered to be a "deep forensic examination," although it's not detailed how it differs from a regular forensic search).
    A manual search is considered to be "routine," just like going through your luggage. A forensic search is "nonroutine." That is, you've got to have a reason for doing it. Among examples of nonroutine searches at borders, courtesy of the Appeals Court:
    overnight detention [of a suspect] for monitored bowel movement followed by rectal examination is "beyond the scope of a routine customs search" and permissible … only with reasonable suspicion [, which is the basis for nonroutine searches].
    If you're wondering why the government is detaining people to monitor their caca, it's because it involves a case where a person was suspected of being a drug mule. Notice how the word "warrant" does not show up anywhere. That's because a warrant is not necessary at the border. Whereas a warrant would probably be required in normal circumstances to carry out what's quoted above, at US borders the government only requires "reasonable suspicion." What exactly is that, you may wonder. Per Wikipedia:
    reasonable suspicion is a legal standard of proof in United States law that is less than probable cause … but more than an "inchoate and unparticularized suspicion or 'hunch'"; it must be based on "specific and articulable facts", "taken together with rational inferences from those facts", and the suspicion must be associated with the specific individual.
    Probable cause, emphasized above, is the basis for obtaining a search warrant. At a border, you don't need probable cause; all you need is its more relaxed, less strict and chill brother – reasonable suspicion. This includes, according to the Fourth Circuit Court of Appeals, instances where the CBP wants to perform a forensic examination of your smartphone. In short, the Appeals Court confirmed that:
    • Forensic searches of phones are allowed at borders as long as authorities can validate their suspicion. This does not mean that they need a warrant. But, it does mean that they must have reason to believe that criminal activity is ongoing.
    • The Fourth Amendment gets suspended at borders.
    We already knew this. So, why all the hubbub? Because it's indicative that things could change at the border.  

    The Application of Riley

    This latest judgment is one of a handful of court decisions declaring that there is a limit to what border agents can do when searching through your devices' data. It is a reflection of Riley v. California, where the Supreme Court ruled, in 2014, that a warrant is required to go through a phone's data. There are exceptions, of course, in exigent cases. But otherwise, a warrant is required, even if a police officer is merely conducting a manual search (and this applies to flip phones as well as smartphones). This ruling went counter to how police had operated, viewing the search of a detained person's phone as no different from a physical pat-down.
    Since then, the courts have been trying to decide whether Riley applies at borders and, if so, to what extent. Per one summary:
    Courts across the country have been struggling with how to apply the Fourth Amendment in this context, in an era when tens of thousands of people are subjected to searches of their electronic devices at the border each year. Today’s ruling from the Fourth Circuit joins an earlier decision from the Ninth Circuit Court of Appeals requiring at least reasonable suspicion for forensic searches of electronic devices seized at the border. In March, two judges on the Eleventh Circuit concluded that such searches should be treated the same as searches of physical luggage, which don’t require a warrant, while a third judge dissented, arguing for a warrant requirement. Earlier that month, a Fifth Circuit judge expressed strong skepticism that the traditional rationales for warrantless border searches should be extended to searches of electronic devices, but that court declined to set a rule.
    As you can see from the above summary, the issue is a contentious one. But if we were to make some projections based on what's happened so far, and what we know so far, it would appear that Riley cannot, and won't, be instituted exactly as it is at borders.
    As already noted, the law operates differently at the border just because it happens to be the border. A warrant has never been required at the border. That's over two hundred years of precedent. The counterargument goes that, well, we've never before had the ability to carry through a border what constitutes the entire private contents of one's house (medical files, photo albums, correspondence, diary, etc.) and then some.
    But just like invasion of privacy is greatly suspended at the border (honestly, how many people would say searching through a smartphone is more of an invasion than having one's bowel movements monitored and followed by a cavity inspection? Mind you, it has happened and was found legal by the courts), you can expect the courts to dilute the Riley findings when it comes to transnational crossings.  