AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.
  • Third Circuit Appellate Court Says “OK” To Data Breach Lawsuit

    Recently, the US Court of Appeals for the Third Circuit concluded that "the improper disclosure of one's personal data in violation of FCRA [Fair Credit Reporting Act] is a cognizable injury for Article III standing purposes."

    In other words, people can go to court over data breaches and data breaches alone; there is no need to show that you were adversely affected by events following a data breach (for example, by proving that your data was misused by hackers).

    Of course, this doesn't guarantee that an individual will win in court. However, it does mean that anyone whose personal information was stolen as part of a data breach can, at least, see the inside of a court. For the past ten years or so, most (if not all) judges ruled that plaintiffs in such lawsuits didn't have "standing" and their cases were "summarily dismissed" from court. That's a fancy way of saying that the courts booted the cases and moved on to other stuff.

    When it comes to lawsuits revolving around data breaches where personal information was compromised, this won't be happening anymore in the Third Circuit – which covers Delaware, Pennsylvania, and New Jersey. Hopefully, other circuits will begin to see data breaches in the same light.

    Theft of Unencrypted Laptops

    What led to this legal development? It started in 2013 with the theft of two laptops (Apple Macintoshes, to be precise) containing personal and medical information. Encryption was not used, despite the fact that Apple's FileVault encryption has shipped free with every Mac since 2003 (and full disk encryption, via FileVault 2, since 2011). It is no exaggeration to say that a Mac's performance is essentially unaffected by turning it on. Plus, since these machines were already "password-protected," users wouldn't have had to jump through any additional "security hoops" to use them.
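
    For the curious, checking whether a Mac already has FileVault turned on takes a single built-in command. Below is a minimal sketch, assuming a Mac with the stock fdesetup tool (shipped with OS X since 10.8); checking the status is read-only, while actually enabling FileVault requires administrator rights:

        # Minimal sketch: report whether FileVault full disk encryption is on.
        # Assumes macOS with the built-in "fdesetup" tool, whose status output
        # is a line such as "FileVault is On." or "FileVault is Off."
        import subprocess

        def filevault_is_on() -> bool:
            result = subprocess.run(
                ["fdesetup", "status"],
                capture_output=True, text=True, check=True,
            )
            return "FileVault is On" in result.stdout

        if __name__ == "__main__":
            print("Disk is encrypted." if filevault_is_on() else "Disk is NOT encrypted.")

    Had something like this been part of Horizon's routine checks, the stolen laptops' lack of encryption would have been hard to miss.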

    Over 800,000 people were affected by the data breach. Oops.

    The owner of these laptops? Horizon Blue Cross Blue Shield of New Jersey, a company that's been involved in laptop-related data breaches before.

    Apparently, the only security was the cable lock that tied the laptops to their desks, and the computers' location on the eighth floor of Horizon's headquarters. Under HIPAA, this could have been perfectly adequate security.

    However, from the FCRA standpoint, it isn't. As the Third Circuit pointed out, the intent behind the FCRA is to protect consumer privacy. The fact that one's personal information has been transferred to persons unknown (that is, the data was easily accessible once the machines were stolen) means the company is potentially in violation of the FCRA. The use of encryption, of course, could have laid this to rest three years ago, when the laptops were stolen. Instead, here we are.

    If things continue on this course, we could see a greater number of companies taking a careful look at their use of encryption, or lack thereof. Unlike federal laws and regulations like HIPAA that are limited in scope, or the patchwork of state laws that supposedly govern data security and privacy – which also fall prey to "standing" issues – the FCRA affects many companies across many sectors.

     

    Related Articles and Sites:

    http://law.justia.com/cases/federal/appellate-courts/ca3/15-2309/15-2309-2017-01-20.html
    http://www.nj.com/business/index.ssf/2013/12/horizon_bcbs_notifying_840000.html
    http://www.alertboot.com/blog/blogs/endpoint_security/archive/2013/12/11/hipaa-encryption-horizon-bcbs-of-new-jersey-data-breach-affects-840k-people.aspx
    https://www.databreaches.net/horizon-blue-cross-blue-shield-loses-round-in-data-breach-litigation/

     
  • UK Encryption: Royal & Sun Alliance Insurance Fined £150,000 For Stolen Hard Drive

    The UK's Information Commissioner's Office (ICO) has fined an insurance company, Royal & Sun Alliance (RSA), a total of £150,000 for the theft of an external storage device with information on nearly 60,000 clients (and credit card details for 20,000 people).  

    Stolen From a Locked Room

    Unlike your run-of-the-mill hard drive theft cases, there are a number of wrinkles to RSA's data breach. To begin with, the external storage device in this case is a NAS (a network attached storage device).
    NASes are like external hard drives, but also so much more. One key differentiator, to the lay person, is their size: despite the modern emphasis on miniaturization, a NAS is still fairly big, often about the size of a Nintendo GameCube or larger. Because of that bulk, it's hard to steal one of these babies surreptitiously; some thought and strategy, possibly pre-planning, is needed to walk off with such a device.
    The other wrinkle is that the NAS was stored in a data server room which can only be accessed with "an access card and key," leading to the belief that staff or visiting contractors stole the NAS.
    In other words, it wouldn't have been easy to steal the device.
    And yet, as subsequent events have shown, it would not have been impossible, either. While NASes can offer file encryption, the stolen machine's data was not encrypted – either because this particular NAS didn't offer it or because someone in IT did not deem it worthwhile; excusable, some may think, since it was under lock and key.  

    Excusable?

    Well, it wasn't excusable. Far from it, as the six-figure fine shows. It's one thing for your average Joe to not encrypt his sizable storage device that he keeps locked up. A multinational insurance company, on the other hand, has responsibilities, and keeping the same data security practices as your average Joe is contemptible.
    Especially when you consider that up to 40 people were allowed unsupervised access to the room storing the NAS, or that the device's disappearance went unnoticed for over two months.
    This is exactly the type of situation where you want any sensitive data to be encrypted.  

    Giving a Break Where They Shouldn't

    Only the ICO knows how the fine's final amount was calculated. However, they note under "mitigating features" that the "personal data held on the device was not easily accessible."
    There must be some confusion here, since the lack of encryption makes access to the data quite easy. It's true that you probably can't just access the information directly from a computer; however, a simple Google search will turn up more than enough helpful links for getting at the data, with instructions your average middle-schooler could follow while half-asleep.
    Imagine what staff or contractors who were given access to a data server room (literally a room that techie types work in) could do with an internet connection and a few keystrokes.

     

    Related Articles and Sites:
    https://ico.org.uk/action-weve-taken/enforcement/royal-sun-alliance-insurance-plc/
    https://ico.org.uk/media/action-weve-taken/mpns/1625635/mpn-royal-sun-alliance-20170110.pdf
    https://www.databreaches.net/uk-150000-fine-for-insurance-company-that-failed-to-keep-customers-information-safe/

     
  • Netherlands Officially Files 5,500 Breach Notifications In 2016

    The Personal Data Protection Authority of the Netherlands (Autoriteit Persoonsgegevens, "AP") revealed last week that they received nearly 5,500 data breach notifications in 2016, the first year of mandatory data breach notifications for the European country.
    This contrasts with the 980 data breaches in the same period for the US, compiled by the Identity Theft Resource Center (ITRC), which is not government-affiliated. When you consider that the US has somewhere around 320 million people vs. the Netherlands's 17 million, something feels very, very wrong here.
    I can think of two possible ways to interpret the situation:
    1. The Dutch are just terrible at data security. This seems unlikely. It is the US, after all, that holds various records when it comes to data breaches. Last year, for example, Yahoo was crowned with the largest data breach in recorded history.
    2. The US data is severely undercounted. This is most probably the reason for the seeming anomaly.
    The latter is supported by the data breach reporting environment in the US.
    To begin with, the US does not have a central authority in charge of data protection. There is no federal law addressing it, although a number of federal agencies do dictate data security in their respective areas; e.g., medical entities and their contractors follow the Department of Health and Human Services requirements regarding data security and breach notifications.
    At the same time, states have their own laws governing data breach reports and what is or isn't classified as a breach. And each body that oversees such reports has its own policies on whether a data breach should be made public. Some make the reports easy to find online; others, not so much.

    Running Numbers

    The 5,500 reported breaches translate to one data breach per 3,090 Dutch citizens. For the US, the 980 translates to one per 326,000 people. That's a ratio of 105 to 1.
    Granted, this is not the best way to represent the figures since it's legal entities that have the duty to report data breaches. A search in Wolfram Alpha shows that the total number of registered businesses in the Netherlands and the USA were, respectively, 1.03 million and 5.156 million.
    This brings the numbers down to one data breach per 187 Dutch businesses, and one per 5,261 American businesses. The ratio is now 28 to 1, a considerable reduction, but still very large. Some of the difference could be attributed to the stronger regulations governing data security in Europe: stricter laws, with a propensity to err on the side of caution (read: privacy), mean that the Dutch would see a data breach where Americans don't. It could also be that the Dutch are more forthcoming with such things because their legal environment is not as litigation-happy.
    No matter how you slice it, however, one thing is certain: 980 breaches reported in the US seems comically low. If we assume that the US is affected by data breaches at roughly the same rate as the Netherlands, and notifies the authorities at a similar rate, then one would expect around 27,500 US data breaches in 2016 (see the quick sketch below).
    At the end of the day, all the signs point to this: in the US, we don't have a good idea of how big or bad the problem is. The best we're willing to do, apparently, is rig the system so that we lowball the number to the point where it's no longer realistic.
    That's a real problem because who would feel the need to marshal resources when the problem appears to be so small?
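    For those who want to check the back-of-the-envelope math above, here it is as a quick sketch; the population and business counts are simply the figures quoted in this post, not official statistics:

        # Rough 2016 breach-notification rates, using the numbers quoted above.
        nl_breaches, us_breaches = 5_500, 980
        nl_people, us_people = 17_000_000, 320_000_000
        nl_firms, us_firms = 1_030_000, 5_156_000

        print(round(nl_people / nl_breaches))   # ~3,090 Dutch residents per reported breach
        print(round(us_people / us_breaches))   # ~326,500 US residents per reported breach
        print(round(nl_firms / nl_breaches))    # ~187 Dutch businesses per reported breach
        print(round(us_firms / us_breaches))    # ~5,261 US businesses per reported breach

        # If US businesses reported breaches at the Dutch rate:
        print(round(us_firms * nl_breaches / nl_firms))  # ~27,500 expected US reports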

     

    Related Articles and Sites:

    http://blogs.dlapiper.com/privacymatters/the-netherlands-almost-5500-data-breaches-notified-in-2016-2/
    https://www.databreaches.net/the-netherlands-almost-5500-data-breaches-notified-in-2016/
    http://www.idtheftcenter.org/images/breach/ITRCBreachStatsReportSummary2016.pdf

     
  • US Government Committee Concludes (Yet Again) That Encryption Backdoors Undesirable

    As the year draws to a close – and what a year! – we finally have some good, sensible news: the US government has found that "any measure that weakens encryption works against the national interest," and so encryption backdoors are an untenable scenario. This should be the final and decisive nail in the coffin of an issue that brought encryption and encryption backdoors to the forefront of public consciousness in the US and around the world.

    Apple v. FBI

    Each year has its milestones, but 2016 feels like it has had more than its fair share. Brexit; Trump as president-elect; a European Union that's showing signs of becoming fractured; Volkswagen and most of the car industry caught with its pants down; South Korea embroiled in scandal after a Rasputinesque figure is yanked from the shadows; US elections possibly influenced by outside actors; US elections influenced by a federal agency; the biggest data breach in history; the Panama Papers… the list really does go on in the year of the Red Monkey.

    And on that list is an FBI that, in essence, asked Apple to create a backdoor of sorts to the iPhone's encryption, a consequence of the San Bernardino shooting of December 2015 – a standoff that came to a head in February 2016.

    The federal agency insisted that it wasn't asking for a backdoor but it, in fact, was. It was the encryption equivalent of "sorry not sorry." While the FBI ultimately backed off, the case did trigger something else: the Encryption Working Group (EWG), a congressional working group, composed of both Democrats and Republicans, that investigated the viability of encryption backdoors.

    The Encryption Working Group Conclusions

    1. Any measure that weakens encryption works against the national interest.
    2. Encryption technology is a global technology that is widely and increasingly available around the world.
    3. The variety of stakeholders, technologies, and other factors create different and divergent challenges with respect to encryption and the "going dark" phenomenon, and therefore there is no one-size-fits-all solution to the encryption challenge.
    4. Congress should foster cooperation between the law enforcement community and technology companies.

    The EWG found that strong encryption is vital to national interest in many ways – be it personal freedom or ensuring national defense or protecting infrastructure; the use of encryption is varied and widespread – and so anything that works to weaken encryption is a bad idea. Law enforcement's concerns regarding encryption are valid but an approach other than backdoors must be established.

    A solution may lie in more cooperation between private industry and government, which already exists but could be furthered. Apple, for example, already provides law enforcement with data saved to the cloud. (In a case of cynically comical foot-in-mouth-itis, some in law enforcement used this cooperation as proof of Apple's "hypocrisy" regarding encryption.)

    The EWG noted that the FBI's request for a backdoor (or whatever it was they wanted to call it at the moment) was the wrong approach, since the use and provision of encryption is global and open source. Nothing would prevent "bad actors" from using encryption that is not crippled with backdoors. Any advantages the government gains from backdoors would be short-sighted and short-term.

    Aside from the above, the EWG also looked into whether "legal hacking" and compelled disclosure by individuals should be given more priority (working within a legal framework, of course).  

    Play Those Encryption Wars Again, Sam

    Of course, none of this is new. When the original "encryption wars" were "fought" in the 1990s, the issues and resulting conclusions were essentially the same. If anything, today's environment shows how prescient those conclusions from 20 years ago were. And the issues debated back then haven't been supplanted by others in the interim. You know why that is?

    It's because the issues being debated were fundamental in nature, and there is now plenty of supporting proof. Not that that's ever stopped anyone from challenging an issue.

     

    Related Articles and Sites:

    https://it.slashdot.org/story/16/12/24/1649258/us-congressional-committee-concludes-encryption-backdoors-wont-work

    https://judiciary.house.gov/wp-content/uploads/2016/12/20161220EWGFINALReport.pdf

     
  • iPhone Encryption: FL Appeals Judge Says "OK" to Compel Password

    A new iPhone encryption case is making the headlines. Unlike many of the controversial ones to date, I think it can safely be said that in this case, the courts were right in compelling the suspect to unlock his smartphone.  

    Up-Skirt Videos

    A voyeur – we'll call him John Doe, although his name was revealed by the media – was caught using his iPhone to film up-skirt footage of a woman at a mall. He ran when confronted but the police were able to track him down using security footage.

    Doe initially agreed to a search of his smartphone but reneged at the last moment. A warrant was sought and granted for the phone, but Doe, one assumes, wouldn't cooperate when it came to accessing its contents. Doe, of course, is claiming his Fifth Amendment privileges.

    After some legal wrangling, a Florida appellate court judge ruled that Doe must unlock his phone.

    Only Protected When It's Testimonial

    The thing about the Fifth Amendment is that it's not legally ironclad, as I wrote back in 2012 regarding the Eleventh Circuit Court of Appeals ruling on decrypting hard drives and violating the Fifth. In that case, the court ruled that it was a Fifth Amendment violation; however, in a couple of other similar cases, the opposite was concluded.

    I said at the time that the rulings all made sense in a non-contradictory manner. Why? Because of what's known as the foregone conclusion doctrine. Remember, the Fifth Amendment was designed to prevent "fishing expeditions," which have been described as:

    [a] method of putting the accused upon his oath and compelling him to answer questions designed to uncover uncharged offenses, without evidence from another source.
    This, in essence, is why compelling a person to testify against himself is unconstitutional: you arrest a guy for no reason, go through his personal life to see what charges you can stick on him (the "fishing"); if you can't find anything, you torture him until he confesses under oath to something, true or made up; and then you jail him based on that. The Fifth exists to prevent things like this (the Brits used to do it a lot, which is why its prohibition is enshrined in the US Constitution).

    But what if it's not a fishing expedition? What if the government is (to extend the theme) "spearing a whale," i.e., aiming for something that they know exists or happened? Well, it's different then.

    Because the government is not looking to find new evidence, or forcing a defendant to present new evidence that the government didn't know about, there is no violation of the Fifth. It's the difference between,

    "Who did you kill and where did you stash the corpse?"
    and
    "Where is Mary's body? We have video footage of you putting it in your car's trunk and driving outside the city."

    In the latter case, it's a foregone conclusion that you know where Mary's body is, and the government can prove it. Revealing what happened to Mary's body, and where it is, is not testimonial. You can claim the Fifth and not cooperate, but you won't get the actual legal protection (probably). On the other hand, forcing a suspect to do the same under the former is definitely testimonial.

    That there are grey areas to the Fifth makes instinctual sense: if you're served with a warrant to examine the inside of your house, you don't get to claim the Fifth and stop the authorities from crossing the threshold. The law is legally allowed inside whether you like it or not.

    The same goes for blood samples, handwriting samples, DNA samples, voice recordings, or standing in a lineup. All of these acts can work against a guy, and claiming the Fifth doesn't do squat.

    What About the Contents of the Mind?

    You might be thinking, well, that's all well and good but wasn't there a legal maxim that you can be forced to turn over the keys to a strong box's lock but never if it's a combination lock where the "keys" don't exist? And isn't the iPhone's four-digit code more like a combo lock?

    And the answer is "yes." As it turns out, though, all of that was part of a dissenting opinion (8 to 1, no less), so it's not precedent. Nevertheless, the judge in the voyeur's case made reference to it, and wondered whether differentiating between the two even makes sense, especially in this day and age.

    But even if it did, foregone conclusion applies to things like passwords as well. In one case, a man crossed into the US with a laptop that contained kiddie porn. The border agent saw it and the guy was arrested. The laptop's encryption, however, kicked in afterwards. The man was ordered to decrypt the laptop by a court. Why? The government already knew that it was there. No fishing necessary.

    In another case, a woman was recorded as saying that her laptop, which was already taken in as physical evidence, contained files that she didn't want the prosecutors to see. She also was ordered to decrypt her laptop. Why? The government already knew that it was there.

    There have been many other instances where "things in one's mind" have been compelled to be produced by a court. It ultimately seems to come down to: is the government fishing for evidence or not?

    In the voyeur's case, it appears that foregone conclusion kicks in. The cops have identified him and tracked him down. There is a witness (the woman who was wearing the skirt and confronted him). He ran when confronted. There is, apparently, footage of him at the mall (which explains how he was tracked down). A warrant was issued based on all of this. He all but admitted that the phone is his. And, as the judge noted, providing the PIN isn't testimonial – that is, it doesn't create new evidence, nor would it be taken as an admission of guilt.

    There is very little wiggle room for John Doe here.  

    Still, Problematic

    The situation is not without problems, however. What if Doe genuinely doesn't remember his PIN? Then he'll be found in contempt of court for something he's truly unable to do. And that has to be just as bad as putting some guy in jail on trumped-up charges.

     

    Related Articles and Sites:

    http://courthousenews.com/florida-court-denies-protection-for-iphone-passcode/
    http://www.bbc.com/news/technology-38303977
    https://apple.slashdot.org/story/16/12/13/2047234/florida-court-says-suspected-voyeur-must-reveal-his-iphone-passcode-to-police
    https://www.law.cornell.edu/supremecourt/text/487/201

     
  • Laptop Encryption: Chesapeake Public Schools Laptop Theft Affects Over 10,000 Employees

    According to a couple of sources, Chesapeake Public Schools in Virginia is notifying employees about a potential data breach. Per their announcement, nearly 11,000 people could be affected by the theft of a laptop computer. It appears that laptop encryption software was not used to protect the contents. Password protection, however, was used.
    Assuming that the thieves (or thief) manage to work past the password protection, they will have access to names, Social Security numbers, and bank account numbers for past and present employees of CPS.  

    Password Protection: Useless

    This is one of those data breach stories that doesn't get too much coverage nowadays – either because the theft of unencrypted laptops doesn't happen too often (relatively speaking) or because it gets buried by much more sensational breaches, such as Yahoo's admission earlier this year that hundreds of millions of accounts were hacked a couple of years ago.
    I would like to pin the dearth of such stories on the former reason. After all, it's been a while since the alarm was first raised regarding the lack of encryption on laptops that store sensitive data; this blog alone has spent ten years on it, as have other sites, including news outlets. It would be nice to know that ten or more years is enough time for information to diffuse throughout society and become general knowledge. You know, to become what they refer to as "common sense." (And if the controversy revolving around the FBI and iPhones is any indication, it has become common sense.)
    Here's a factoid that hasn't reached such status: password protection is anything but. Just a simple online search will provide more than a handful of ways for overcoming or bypassing password protection on computers running the Windows operating system.
    And if the thieves manage to do it on the CPS laptop… well, it's not going to be pretty.  

    Tax Season is Upon Us

    What could a thief do with names, SSNs, and bank account info for 10,000-plus people? How much damage could he cause? Plenty.

    One of the ways the above info gets used around this time of year is the IRS tax refund scam. Since nobody likes to file their taxes early, and most will wait until April is near, enterprising criminals beat the real filers to the punch by filing fake tax returns, routing the IRS refund checks to addresses they control (such as post office boxes), and cashing them.

    The IRS does what it can to prevent checks from being sent to scammers, but ultimately, it can't know for sure that a fraudulent tax return was filed unless the actual SSN holder (or his/her accountant) calls to complain – which, again, tends to happen closer to April.

    Indeed, this could be one big reason why there are no signs of the data being misused so far. It's just a little too soon to do anything with it.  

     

    Related Articles and Sites:

    http://wtkr.com/2016/12/04/chesapeake-public-schools-notify-employees-of-possible-data-breach/
    http://wavy.com/2016/12/03/chesapeake-public-schools-warn-about-data-breach/
     