AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

May 2014 - Posts

  • HIPAA Lawsuit: SAIC/TRICARE Data Breach Case Mostly Tossed Out Of Court

    Another data breach lawsuit, another case that's tossed out of court for "lack of standing."  When TRICARE was saddled with one of the largest data breaches of all time – over 4.7 million people affected when backup tapes were stolen from a parked car – it was certain that SAIC, the vendor at the center of the breach, would be sued.  Less certain was whether the plaintiffs would see their day in court, although I had my own private reservations, seeing how the courts have been pretty consistent over the years: if you can't prove that you were harmed by an event, you can't sue for redress.  The latest ruling further strengthens this position.

    But it's not all déjà vu: the memorandum opinion that put the kibosh on this lawsuit is a pleasure to read, explaining a number of legal concepts in plain language.

    4.7 Million Affected, 33 Original Plaintiffs, 2 Remaining

    One of the most surprising things about this case: the judges do math.  After noting that "plaintiffs allege that data-breach victims in general are 9.5 times more likely than the average person to experience identity theft post-breach," they go on to show why the plaintiffs' odds of becoming identity theft victims essentially fall to zero in this particular case:
    In a society where around 3.3% of the population will experience some form of identity theft – regardless of the source – it is not surprising that at least five people out of a group of 4.7 million happen to have experienced some form of credit or bank-account fraud.

    So one would expect 3.3% of TRICARE's customers to experience some type of identity theft, even if the tapes were never read or misused.  To put that percentage in concrete terms: of the 4.7 million customers whose data was on the tapes, one would expect around 155,100 to experience identity fraud simply by virtue of living in America and engaging in commerce, even if the tapes had not been lost.
    The fact that only a handful of people can even realistically claim they experienced identity theft linked to the TRICARE/SAIC breach means that either (a) the thief or thieves are incredibly patient, having waited at least 34 months, and counting, to unleash a torrent of financial pain, or (b) this handful of people are attributing their identity theft to the wrong data breach, and the tape was probably never accessed.  Chances are, it's (b).
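    The court's arithmetic is easy to reproduce.  A quick sketch, using only the figures quoted above from the opinion (the 3.3% base rate and the 4.7 million affected):

```python
# Expected identity-theft victims among TRICARE customers, assuming the
# ~3.3% national base rate applies regardless of the breach.
population = 4_700_000  # customers whose data was on the stolen tapes
base_rate = 0.033       # share of Americans who experience identity theft anyway

expected_victims = population * base_rate
print(f"Expected victims from the base rate alone: {expected_victims:,.0f}")
# That works out to about 155,100 people -- vastly more than the handful of
# plaintiffs who actually reported fraud, which is why the court treated the
# alleged link to the breach as speculative.
```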

    And this, essentially, is what the entire case revolves around.  Once the court established that the odds of the data having been viewed are close to zero, the plaintiffs' claims become speculative and thus confer no standing.

    You're Stressing Over Something that Could Be Something...or Nothing

    An "increased risk of identity theft," which, as far as I can see, the courts don't deny exists, is not sufficient, even if the anxiety surrounding it is real:
    This case presents thorny standing issues regarding when, exactly, the loss or theft of something as abstract as data becomes a concrete injury. That is, when is a consumer actually harmed by a data breach – the moment data is lost or stolen, or only after the data has been accessed or used by a third party? As the issue has percolated through various courts, most have agreed that the mere loss of data– without evidence that it has been either viewed or misused – does not constitute an injury sufficient to confer standing. This Court agrees.
    Likewise, arguing that companies who've suffered a data breach should at least compensate data breach victims for the time, energy, and money they've spent trying to rectify the situation is unjustified.  After all, you're basing your actions not on facts but on the speculation that something might happen.
    But the Supreme Court has determined that proactive measures based on "fears of. . . future harm that is not certainly impending" do not create an injury in fact, even where such fears are not unfounded....

    Put another way, the Court has held that plaintiffs cannot create standing by "inflicting harm on themselves" to ward off an otherwise speculative injury. Id. The cost of credit monitoring and other preventive measures, therefore, cannot create standing.
    Granted, that doesn't mean that you shouldn't do anything after a data breach.  You've got to take precautions.  It's the smart thing to do.  But "smart" is not the same thing as "having a leg to stand on in court":
    "objectively reasonable likelihood" of harm is not enough to create standing, even if it is enough to engender some anxiety.... Plaintiffs thus do not have standing based on risk alone, even if their fears are rational.

    Nor is the cost involved in preventing future harm enough to confer standing, even when such efforts are sensible.
    Sure, it seems a little crazy.  But them's the rules, and until the rules change, potential victims of data breaches can only wait to become actual victims of data breaches if they want to win in court.

    And even then, they have to be able to prove a causal link between the breach and their victimization.

  • HIPAA Server Security: Total of $4.8 Million HIPAA Fine For NY Presbyterian and Columbia U

    There are many articles out there claiming that the Office for Civil Rights (OCR) at the Department of Health and Human Services (HHS) has issued its largest HIPAA fine to date, a total of $4.8 million to New York Presbyterian Hospital and Columbia University Medical Center.  The story is inaccurate but raises interesting points when it comes to HIPAA monetary penalties.

    HIPAA Breach Penalties Capped at $1.5 Million?

    One of the most often quoted figures when it comes to HIPAA breaches is $1.5 million, as in the "maximum penalty amount of $1.5 million for all violations of an identical provision" (section §160.404 of the Final Rule).  This is generally interpreted to mean that $1.5 million in fines is the limit when it comes to HIPAA data breaches.

    Which raises the question: how did New York Presbyterian and Columbia University manage to get themselves fined $4.8 million?  Even if the penalty fell equally on both entities (it didn't: Columbia was fined the "expected" $1.5 million, NY Presbyterian the remaining $3.3 million), you'd imagine that $3 million is the most that could be assessed.  Where is the remaining $1.8 million coming from?

    NY Presbyterian's resolution agreement with OCR does not offer clues:
    NYP agrees to pay HHS the amount of three million three hundred thousand dollars ($3,300,000.00) ("Resolution Amount"). NYP agrees to pay the Resolution Amount on the Effective Date of this Agreement as defined in paragraph 14 by automated clearing house transaction pursuant to written instructions to be provided by HHS.
    Many have noticed, however, that NY Presbyterian conceded to more instances of non-compliant conduct than Columbia University.  This leads me to believe that NY Presbyterian must have broken multiple rules via the one data breach.  From "Mandated Benefits: 2014 Compliance Guide" by The Balser Group:
    The final Enforcement Rule indicates that one act of noncompliance that violates more than one subpart of the [HIPAA] administrative simplification rules will be treated as separate, multiple violations.  So, for example, if a covered entity re-sells its used computers without scrubbing the hard drives that contain protected health information, this act may violate several separate legal obligations under the security and privacy rules.  In this scenario, the covered entity will have multiple violations and could be fined up to the maximum for each separate violation.
    Whatever the reason may be for NY Presbyterian paying $3.3 million, the lesson here is that $1.5 million is not the limit when it comes to HIPAA fines.  If someone out there is performing a financial risk analysis based on a theoretical cap of $1.5 million to decide whether, say, all laptops in their organization should be protected with HIPAA encryption software, your job just became harder.

    There is Another Bigger HIPAA Fine

    All of this hullabaloo about NYP-CU being the largest HIPAA fine is just hot air.  The honor goes to Cignet, which was fined $4.3 million by OCR in 2011.  Of the total, $1.3 million was for denying patients access to their own files, which is a HIPAA violation.  A further $3 million was levied for not cooperating with OCR's subsequent investigation.

    Yes, the total amount involved is less than the $4.8 million for NYP-CU.  But the entirety of the $4.3 million falls on one covered entity; NYP-CU involves two.  I mean, there's a reason why there are two resolution agreements for NYP-CU.

  • Alberta Health Information Act Updated For Breach Disclosure Notifications

    Alberta, Canada is updating its books so that breaches of medical information are disclosed ASAP.  The original legislation had a number of good points, although it didn't include a mandate for the use of medical encryption software on laptops used by health information custodians.  On the other hand, the Information and Privacy Commissioner does require it for devices that store personal and sensitive information, so it's a moot point.

    A data breach last fall highlighted further problems, prompting the legislative change.

    Breach Remains Under Wraps for Months

    Alberta suffered its largest health information data breach to date in September 2013.  Over 620,000 people were affected.  The breached healthcare organization learned of the mishap right away and duly contacted the Information and Privacy Commissioner.  And there the story ended until January of this year, when the Health Minister's office was contacted and went public with the news.

    The controversy over the three-month-long delay revealed that the Alberta Health Information Act proved an impediment to the Information Privacy commissioner because it treats "private-sector companies and health providers very differently" and prohibits the commissioner "from disclosing a breach to anyone, or forcing the offending organization to disclose it."

    The updated law strikes out this obstruction; however, it is not without its own downsides and has raised concerns.

    Use of Encryption Mandated

    It's been a while since I've read any Canadian information security law, and what I've read wasn't comprehensive by any means – there's just too much out there, with each Canadian province and territory having its own set of laws.  So how do I know that the use of laptop encryption is mandated by the Alberta Information and Privacy Commissioner?

    Well, I ran into this site at the University of Alberta that deals with "encryption myths and realities."  According to the page,
    The Alberta Office of the Information and Privacy Commissioner and information management legislation such as FOIP, do require information custodians to adequately protect personally identifying information. The privacy commissioner specifically mandates laptop encryption for custodians of personal and sensitive information.
    This mandate is quite unique.  Legislation that I've come across (and I've read a lot of it despite my status as a non-lawyer) does not usually mandate the use of encryption on specific devices.  Some legislation requires the use of encryption to protect data in general; other laws contort words so that encrypted data is exempt from legal penalties and fines (but do not directly mandate the use of encryption).

    This brings up a very interesting question: when people don't follow the law in those regions where laptop encryption is specifically required, what chance do other regions have?

  • HIPAA External Drive Encryption: Larsen Dental Care Announces Breach

    HIPAA encryption software: Despite its many benefits, it's not uncommon to find people who are not taking advantage of the safe harbor it offers.  For example, Larsen Dental Care – in Pocatello, Idaho – has posted a notice alerting its patients of a stolen external hard drive.  Such a move would have been unnecessary if said device had been protected with hard drive encryption.

    Implement an Equivalent Alternative Measure

    According to the letter, the breach occurred on March 24, 2014.  The hard drive was stolen from an employee's car, exposing names, addresses, dates of birth, email addresses, phone numbers, dental records, medical histories, health insurance IDs, and SSNs.

    How did Larsen Dental Care know what was stolen?  They don't give details but do mention that a "forensic analysis" was conducted.  In all likelihood, they looked at the information that was last copied to the external hard drive and possibly took a look at the contents of the last backup as well.  The problem with such an approach is that there's always the risk that something was left out.

    Which is why any electronic device that holds PHI should be encrypted.  It's a well-known fact that, under HIPAA, the use of encryption is not a requirement.  What's less known, though, is that the rules require that a covered entity do the following:
    • Investigate whether encryption is an appropriate solution for protecting PHI (it rarely isn't),
    • On the (rare) occasion that encryption is not an appropriate solution, the HHS notes that one must (1) document the reason for not using encryption and (2) "implement an equivalent alternative measure".
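    To make concrete what encryption buys you, here's a toy sketch of the principle – a one-time-pad XOR written for illustration only, NOT how AlertBoot or any real full disk encryption product works, and not drawn from the Larsen notice:

```python
# Pedagogical sketch of why encrypted data qualifies for safe harbor:
# without the key, the stolen bytes are unreadable noise.
import secrets

def xor_bytes(data: bytes, key: bytes) -> bytes:
    """XOR each data byte with the corresponding key byte (one-time pad)."""
    return bytes(b ^ k for b, k in zip(data, key))

phi_record = b"Jane Doe, DOB 1970-01-01"          # fabricated sample PHI
key = secrets.token_bytes(len(phi_record))        # stays with the covered entity
stored_on_drive = xor_bytes(phi_record, key)      # what a thief would find

# Only the key holder can recover the original record.
assert xor_bytes(stored_on_drive, key) == phi_record
```

    The point of the sketch is the asymmetry: the thief who takes the drive gets `stored_on_drive`, which is useless without the key, which is exactly why the Breach Notification Rule treats the loss of properly encrypted media differently from the loss of plaintext.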

    When put in this light, I think most people would agree that this is a classic case of "heads I win, tails you lose" or being asked to choose between resigning and getting fired.  The choices given are anything but: even if you decide not to use encryption, you must implement something equivalent to the protection that encryption affords.

    Some people don't interpret it this way, though.  They see the word "addressable" and think it means "optional"...which is not wrong.  However, the options are pretty limited: where can you find an alternative that would:

    • Take the FBI at least a year to crack, and that's if they get lucky?
    • Cost would-be thieves at least thousands of dollars to overcome?

    Why Choose the (Worse) Alternative Option?

    But why employ an equivalent alternative measure when encryption is already available?  Well, as it turns out, cost tends to be the determining factor in most cases.

    But there's something to consider here: the alternative to encryption doesn't provide safe harbor from the Breach Notification Rule.  In addition, if it truly is equivalent to encryption, it'll probably be more expensive to implement, more cumbersome to use, and cost more to maintain and operate.

    What will happen to Larsen Dental Care?  Chances are, not much.  An online search shows that there are 123 listings for dentists in Pocatello, so those concerned with data security will probably go to the competition.  Most won't, though: in my experience, once you find a dentist that you love, you stick by him or her, and the testimonials on Larsen's website show that it's a fine dental care establishment.

    On the other hand, who knows?  The HHS is ramping up their audits on covered entities this year.  It's a potential business risk that many try to avoid by lowering the risk of a data breach.

  • Canada PII Encryption: Southwest Community Business Development Has Laptop Data Breach, Plumbs The Depths Of Idiotic Remarks

    Oh, Canada.  The past twelve months have been a debacle for the northernmost country on the American continent, on many fronts, putting at risk the image it has cultivated over the years – a country of polite, maple-syrup-consuming, thoughtful, intelligent, and funny citizens.

    Here's an additional incident that makes one wonder what exactly is going on up there: a spokesperson for the Southwest Community Business Development Corp. has stated that there was nothing the company could have done regarding a data breach.  It was triggered by the theft of a laptop computer from an employee's car.  (Hint: they could have used something like AlertBoot laptop disk encryption software to protect the sensitive information.)

    "Needs to be Shared and Ridiculed"

    The story is so flabbergasting that the director of the site said that the CBDC's statement "needs to be shared and ridiculed worldwide."  What, specifically, was said to spark off such a reaction?
    "The laptop was not in plain view, it was put away, and someone decided that they were going to break into the vehicle and that is circumstances outside of our control. There’s absolutely nothing we can do in that particular circumstance," said Heather Hubert, the CBDC Southwest executive director.
    Would she have said the same thing if the stolen object was something other than a laptop full of sensitive data?  Really?  Nothing could have been done?

    Substitute the word "laptop" in the above quote with the following to see how ridiculous the executive director sounds.  Also keep in mind that the theft happened from a car that was parked overnight on some random street.
    • Original copy of the Canadian constitution.
    • Ten million dollars in rare gems.
    • A kidney to be used in an operation the next day.

    On the one hand, I can understand what Ms. Hubert is saying.  Yeah, there was no one there to stop the theft, so there is literally nothing they could have done in that particular circumstance, at that particular point in time.  On the other hand, it's quite evident that something could have been done to ensure that the conditions for that particular circumstance do not arise at all.  For example, couldn't one have taken the laptop out of the car and stored it somewhere safer?

    Honestly, if you have time to "hide" something in a car, and it weighs less than 20 pounds (roughly 9 kilos), you can probably take it with you, as opposed to leaving it in the car overnight.

    They Know What to Do

    The previous quote contrasts quite strongly with what the CBDC is actually doing:
    CBDC Southwest and PETL [Post-Secondary Education, Training and Labour] are encrypting private data on laptops going forward.
    Does this sound like the actions of an organization that has exhausted all its options when it comes to data security and finds itself with "absolutely nothing [it] can do"?

    Perhaps CBDC deserves ridicule.  But, I believe that actions speak louder than words.  It's quite obvious that there is more that CBDC can do, and they're doing it.  And, at the end of the day, that's all we're really asking for.


  • UK Data Encryption: FOI Request Finds ICO Fines Lower, Breach Incidents Higher

    A freedom of information request, filed by ViaSat for data held by the UK's Information Commissioner's Office (ICO), has led to the conclusion that data breach incidents in the British Isles have grown by 37% in the past year.  The same data shows that monetary penalties by the ICO have fallen over the same period.  This has some observers wondering whether the UK is regressing when it comes to data security.

    Personally, I'm not sure that it is, at least not from a digital data security standpoint: only 8% of data breaches are attributed to lost or stolen hardware.  I don't know whether this figure excludes data that was protected with encryption software.  Regardless, it's an impressive feat considering that in other developed countries, the rates are much, much higher.

    Different Laws, Different Definitions of a Data Breach

    Different countries (and – depending on the country – different counties, states, prefectures, and other regional government bodies) could have their own definition of what constitutes a data breach.

    While I haven't found any UK legislation that specifically states that the loss of encrypted data does not constitute an information security breach, the ICO has brought numerous actions against organizations that didn't use encryption.  The writing is very clear: in the UK, the loss of encrypted data is not a data breach – assuming, that is, that the password for accessing the secured data wasn't lost in tandem.

    This differs markedly from, say, New York's position on encrypted data: under state law, the loss of even encrypted data is classified as a data breach.  In Massachusetts and Nevada, it's not.  Neither is it in California.  Indeed, most US states have a safe harbor provision for encrypted data.  Three have no provisions whatsoever – they don't have a data security or privacy law on their books – and some have made the terrible decision to accept the use of password-protection as proper data security.

    8%?  That's Not Bad at All

    We could discuss different laws until the cows come home, but the really interesting thing about ViaSat's conclusion is this:
    The most common form of data breach, at 48 per cent, involved the sending of information to the wrong recipient. Lost or stolen paperwork followed, making up 16 per cent of reports, while lost or stolen hardware accounted for 8 per cent.
    The data breach rate for hardware is extremely low.  In the US, and in most post-industrial countries, the rates tend to be not only in the double digits, but sometimes well over 20%.  That the UK is seeing an 8% rate can mean only one of two things: (1) they've succeeded in convincing people to use proper security on their hardware, or (2) people are not reporting their digital data breaches to the ICO.  The latter would be very puzzling, though, seeing how they do report it for paper documents.  Why report one but not the other?

    Also, consider this: if the 8% figure includes incidents where data encryption software was used, then the actual breach rate is even lower.  Regardless, it's low overall.

    That's an enviable number, regardless of how you slice it.

    11% Increase in Reported Breaches

    Also enviable?  The 11% increase in breaches reported year-over-year.  Yeah, ideally we want to see the number decrease – heck, we want to see it at 0%.  But a growth rate that's barely broken out of the single digits is not so bad.

    Such growth could be attributed to something other than an increase in security events.  For example, perhaps people are now more cognizant of the fact that lost paperwork, or "sending information" to the wrong person (I get the feeling that most of these are instances of sending email to the wrong address), also constitutes a reportable security incident.

    Overall, it seems that the ICO's past actions are paying dividends.  Let's not forget what the ICO has said in the past: that they levy monetary fines for the most egregious offenses, not only as a form of punishment but also as a way of signaling to others what is not acceptable.

    Twice Bitten, Four Times Shy?

    Also to be taken into consideration: in 2013, the ICO lost a couple of high-profile appeals involving the handing out of monetary penalties.  Of course, there were also plenty of appeals where the courts sided with the ICO, but once the courts put the kibosh on one aspect of your operations, you tend to become more circumspect.
