AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.
  • Australia's Notifiable Data Breaches Law Nets 31 Reports In 3 Weeks

    A new Australian law appears to be succeeding in finally unveiling the current state of data breaches in the Land Down Under. According to a release by the country's information commissioner's office (the OAIC), thirty-one data breaches were reported to the government since the law took effect on February 22, 2018.

    Notifiable Data Breach

    Australia's Notifiable Data Breach (NDB) scheme, which makes reporting data breaches to the government mandatory, was a long time coming. While Australia already had data breach laws before the NDB, going public with a breach was voluntary. Obviously, that was never going to work, and the numbers prove it: in 2010, only 56 data breaches were reported. That figure had only roughly doubled by 2014, when 104 data breaches were reported.
    Obviously, these are comically tiny numbers when you consider that almost 25 million people live in Australia, and that more than 2 million companies were registered with the government in 2014 – 2015. The numbers would suggest that either (a) Australia's businesses are unusually top-notch when it comes to data security or (b) data theft and loss incidents were seriously underreported.
    Even today, with the latest revelation, a person tracking such incidents may feel that the numbers are a little low, possibly because the public is not yet aware of its responsibilities under the new law, or because the OAIC has yet to show that it's willing to impose serious repercussions on those who don't abide by the NDB.
    The USA, with its disparate set of laws and regulations regarding data breach notifications, has shown the effects of voluntary vs. mandatory, enforced vs. unenforced notifications. Such laws began to surface with California 15 years ago, and other states have passed their own versions, unwittingly leading to an experiment on what is effective. The conclusion: even when the law mandates going public with a data breach, many companies will not do so if repercussions for not doing it fail to really materialize.
    In addition, HIPAA / HITECH regulations covering the US medical sector showed that the fastest and surest way of ensuring that companies take notice of privacy and data security laws is to penalize companies, in monetary form, and publicize it.

    Turnover of $3 Million, A Couple of Conditions

    According to reports, the new law applies to companies with an annual turnover (that is, total sales or total revenue) of $3 million or more, with certain exceptions, such as APP entities that trade in personal information. In addition, the data breach to be reported must involve:
    • Unauthorised access to or disclosure of personal information that could be used to harm an individual; and
    • Risk of unauthorised access or disclosure, in which case the information has been lost and is in danger of being used to harm an individual
    The same article quotes an expert who says that the new law may not really affect the behavior of businesses, seeing how:
    the Australian laws are still "less stringent and the penalties less severe than similar regimes in other jurisdictions".
    Considering how the law is worded, it's hard not to agree. For example, what does it mean that the data breach could "harm an individual"? There's too much room for interpretation there, even if the OAIC notes that "objective assessment… from the viewpoint of a reasonable person" should be used in making the determination.
    Thankfully, at least it's pointed out that the use of strong encryption provides safe harbor from the NDB, as encrypted data is safe from unauthorized access. Indeed, multiple examples are provided where the NDB exception is directly tied to encryption, underscoring the importance of encryption in safekeeping personal and private data.
  • Fresno State Hard Drive Stolen, 15,000 Affected

    At least 15,000 California State University, Fresno "student athletes, sports-camp attendees, and Athletic Corporation employees" were affected by a data breach earlier in the year, according to news reports. A hard drive, 18 laptops, and other items were reported missing on January 12 from the university's North Gym building. On the face of it, the device does not appear to have been targeted in the theft, which, based on the university's 2017 – 2018 academic calendar, seems to have been timed to coincide with Fresno's winter break.

    Data From 2003 Onwards

    In Fresno State's public data breach notification, the university notes that only 300 of the affected are "currently affiliated with the University," implying that most of the breached data involves former students, laypeople, and employees.
    The breached information includes:
    some personal information, including names, addresses, phone numbers, dates of birth, full or last four digits of Social Security numbers, credit card numbers, driver’s license numbers, passport numbers, user names and passwords, health-insurance numbers and personal health information.
    Considering the type of information that was being held – and how far back it went: 15 years – it's hard to understand why this external drive, which was used as a backup device, was not protected with encryption. Why wasn't it?
    Possibly, the (roundabout) answer lies in the 18 laptops that are not mentioned in Fresno's notification. Why are the laptops not mentioned if they were also stolen at the same time as the external drive? One possibility is that none of them held any personal, sensitive data.
    The more probable explanation is that these laptops were encrypted, obviating their inclusion in the breach notification. Maintaining this train of thought, it's probable that Fresno is dealing with an employee's wayward data security practices. Of course, it could also be that the university's IT department fumbled: if you've got hundreds of devices to secure, an odd hard drive or two could very well slip through the cracks and remain unprotected.  

    Cold Comfort

    Fresno, like many entities that report on data breaches, noted that they had:
    not received any reports that any of the stolen information has been accessed or misused in any way, and there is no reason to believe that the hard drive was stolen for the information it contained.
    Lawyers should stop their clients from adding the above language to breach notifications. It's embarrassing. The problem with it (aside from the fact that it's about as believable as "we're sending this notification out of an abundance of caution": everyone knows you're sending it because it would be literally illegal not to) is that it is meaningless.
    In this day and age, people know that the data contained in devices can be more valuable than the hardware itself, and you can bet that people who steal computers are even more likely to be aware of this fact. So, not getting any signals that the stolen information was accessed... means squat.
    In addition, there's this implication that the information was not or will not be accessed because the hard drive wasn't stolen for the information. How faulty is that logic? Let us assume that some guy boosts a car because he's going to sell it to a chop shop. Are you telling me that he's not going to take a peek in the glove compartment or the trunk because he stole the car for its hardware, not its contents? Possibly lift up the armrest to access the center console? Steal the quarters in the ashtray?
    Having your personal details stolen is terrible. Receiving a breach notification letter is terrible. Ham-fisted attempts at PR are vexing and insulting. It wouldn't be surprising to find that such language backfires on its intent and, not only does it not comfort people, but encourages them to file lawsuits.
  • HIPAA Breach Results In Lawsuit And Countersuit Between Aetna and KCC

    Reuters reported earlier this month that Aetna, the health insurance company, and Kurtzman Carson Consultants (KCC), an administrative-support services provider, have sued each other over a mishandled class action settlement notification.
    Last year, Aetna settled a number of lawsuits regarding the fulfillment of HIV medication prescriptions. With the legal issues finalized, it was up to KCC to mail the settlement notifications and finally close the book on the situation. Unfortunately, the notifications were sent using envelopes with transparent windows, the ones where names and addresses show through. But in this case, there was a little more:
    Most of the first sentence of the notification was also displayed – including the words "when filling prescriptions for HIV medications."
    That's as private as private information can get. Naturally, Aetna was sued for breach of patient privacy, which the company quickly settled. In turn, the company sued KCC "to indemnify…the entire cost of the notification disaster," or nearly $20 million. Aetna claims that they didn't know that envelopes with transparent windows would be used, that private information would be showing, etc.
    Basically, it wasn't Aetna's fault.
    KCC, however, has countersued, stating that "Aetna and Gibson Dunn [the insurance company's legal representation] knew what the notifications would look like" and, allegedly, approved it prior to mailing out the settlement notifications.
    Obviously, someone has to be lying. The calamities don't end there, however.  

    No Encryption in this Day and Age?

    KCC has also averred in their suit that,
    When Aetna’s lawyers passed along the list of health plan members to be notified about the HIV prescription policy, there was no protective order in place. Nor did Gibson Dunn encrypt all of the data it sent to [KCC].
    In fact, KCC states that private health information (PHI) wasn't encrypted nor password-protected (not that password protection would do any good; it's certainly not a HIPAA-compliant PHI security measure). And, they further claim that "more data than was necessary to perform the noticed function" was sent to them… which is not necessarily forbidden under HIPAA but is definitely frowned upon. In fact, it might be one of those red flags that spark an investigation by the Department of Health and Human Services (HHS).
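    The distinction between password protection and encryption is worth spelling out, since the suit partly turns on it. A password can gate an application while leaving the bytes on disk in the clear, so anyone who reads the raw storage bypasses the "protection" entirely. A toy sketch of the difference (the file names and the member record are invented; XOR with a random one-time key stands in for the AES-based encryption real products use):

```python
import secrets

# Invented example record -- stands in for a line from the member list.
record = b"MEMBER: J. Doe | RX: HIV medication"

# "Password protection" at the application layer: the program asks for a
# password before *displaying* the record, but the bytes it writes to
# disk are the plaintext itself.
with open("members.dat", "wb") as f:
    f.write(record)

# Anyone who reads the raw file skips the password prompt entirely.
with open("members.dat", "rb") as f:
    assert b"HIV medication" in f.read()  # plaintext fully exposed

# Encryption changes what is on disk, not just who gets asked a question.
# XOR with a random one-time key (a toy stand-in for AES).
key = secrets.token_bytes(len(record))
ciphertext = bytes(p ^ k for p, k in zip(record, key))
with open("members.enc", "wb") as f:
    f.write(ciphertext)

with open("members.enc", "rb") as f:
    assert b"HIV medication" not in f.read()  # raw bytes reveal nothing
```

    Real full disk encryption applies this idea at the sector level, below the file system, so every file gets the same treatment automatically.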
    On the other hand, passing sensitive patient data around without encryption? We all know how the HHS feels about that one. The Reuters article summed up what's at stake for Aetna and KCC in this manner:
    For both Aetna and KCC, as you can see from the dueling complaints, responsibility for the botched settlement notifications is really an existential question. As a health insurer, Aetna has a moral and legal obligation to protect patient privacy. As a claims administrator, KCC is supposed to know – of all things! – how to mail out a settlement notification without violating recipients' privacy.
    The above is insightful and yet misses a number of observations.
    It should be noted that KCC received the data from Aetna's lawyers. So, if KCC's allegations are true, then Aetna has another business associate that's not paying attention to HIPAA/HITECH requirements. And what's true for KCC – that they should know how to properly mail out notifications because it's their job – can also be said of lawyers sharing sensitive data that belongs to a HIPAA covered-entity. After all, the law has specifics on how PHI should be handled by business associates of HIPAA covered-entities. Business associates such as lawyers, who, by virtue of their profession and their client, should know not to pass around PHI unencrypted.
    Also, the allegations open up another can of worms for Aetna, seeing how it now has two business associates that have contravened HIPAA/HITECH data security rules in less than one year. It can take very little to get the HHS to open an investigation into data security violations. Having three HIPAA incidents in a one-year period must certainly attract attention, and KCC's allegations give the HHS a reason to dig deeper into Aetna's adherence to HIPAA privacy and security rules.



  • HIPAA Security Trickle-down? Notifications State Sensitive Information Not Contained In Stolen Devices

    Two medical entities recently alerted patients of a data breach: Eastern Maine Medical Center (EMMC) and Nevro Corporation.
    In the case of EMMC, an external hard drive went missing. For Nevro, a number of laptops were stolen during a break-in. Information contained in these devices was not protected with data encryption in either case, but then again, "sensitive information" was not stored on any of the devices involved.
    While the lack of encryption seems reasonable at first glance, the truth is that a number of HIPAA / HITECH regulations were probably broken.  

    Eastern Maine Medical Center

    In the case of EMMC, the data breach was triggered when a third-party vendor's hard disk drive disappeared. Bangor Daily News reports that the "missing hard drive contains information on 660 of the patients who underwent cardiac ablation between Jan. 3, 2011 and Dec. 11, 2017."
    The missing drive was last seen on December 19. Reportedly, the storage device contained:
    Patients' names, dates of birth, dates of their care, medical record numbers, one-word descriptions of their medical condition and images of their ablation… [but NOT] Social Security numbers, addresses and financial information.
    On the face of it, it looks like the data breach could be classified by most people as "small potatoes."  

    Nevro Corporation

    Unlike EMMC, Nevro was responsible for its data breach. And yet, the company cannot be strongly faulted for the data mishap: it's not as if the laptops were in an unsafe location (like an employee's car). The laptops were at the company's headquarters, which one assumes was reasonably secure against break-ins.
    Per Nevro's breach notification letter, "nearby businesses were also targeted" and laptops were stolen from them as well, so chances are that Nevro had comparable security in place. (Either that, or most businesses in the area decided to dispense with security – a dubious assumption.)
    The company noted that all of the stolen laptops were password-protected, "although not all were encrypted." Yet the silver lining is that "limited categories of information" were stored on these devices and that none of them "contained sensitive identifying information such as Social Security or other government-issued identification numbers or credit card or financial institution information."
    The "limited information" pertains to names, addresses, and other information similar to what was listed by EMMC. Indeed, Nevro seemingly implies that it's only sending notifications to affected patients because
    applicable state law considers this type of information [limited information about your treatment relationship with Nevro] sufficient to warrant a notification.
    Again, most people would look at this as small potatoes (especially when you take into consideration what Equifax admitted to last September. That was definitely not small potatoes; heck, it went well beyond the tuber family).
    As pointed out in previous posts, such "not sensitive" information can still be used to carry out fraud and scams. Tech support scams, for example, are successful even though there is very little personal data involved. Can you imagine how much more convincing a phone scam would be if someone called a person about his or her cardiac ablation?
    That being said, there is a remote possibility of it happening. In contrast, the malicious use of SSNs and other information generally considered to be "sensitive" is more than possible. So, the lack of what most people would deem "sensitive personal information" should come as something of a relief to patients.  

    Could Still Be a HIPAA Breach

    It may not be, however, a relief for the two organizations. A cursory search on the internet seems to indicate that both fall under the purview of the Health Insurance Portability and Accountability Act (HIPAA). HIPAA has very strict definitions of what is and is not PHI (protected health information).
    As this link shows, names, physical addresses, telephone and fax numbers, email addresses, etc. are considered to be PHI if combined with certain information, such as what medical treatment one was receiving. So, technically, it looks like the two organizations have a full-blown medical data breach on their hands.
    It goes without saying that the use of full disk encryption would have paid wonderful dividends in both cases because HIPAA provides safe harbor if data is encrypted when lost or stolen. That not being the case, what will be the fallout?
    HIPAA / HITECH data security compliance is administered and overseen by the Office of Civil Rights (OCR) of the Department of Health and Human Services. The OCR has not been shy in dispensing monetary penalties, sometimes in the millions of dollars.
    And, as befitting such large sums, it often takes years to reach a decision on how to deal with HIPAA covered-entities that have suffered a data breach.
  • Coca-Cola Laptop Theft Lawsuit From 2014 Still Ongoing

    Bloomberg Law reminds us that there are a number of "legal battles over workplace cybersecurity being waged" in the USA. For example, ENSLIN v. THE COCA-COLA COMPANY ET AL, which has been ongoing since 2014.
    The breach was covered on this blog previously, and the short version is: a Coca-Cola employee stole laptops that were meant to be disposed of, triggering the data breach. This resulted in a former Coca-Cola employee, Enslin, suing the beverage maker for failing to adequately protect employee information. In his complaint, Enslin says that he fell victim to identity theft and other crimes not long after receiving the breach notification letter.
    When the story originally hit the wires in 2014, there was a dearth of information. Three years and a bunch of court filings later, we have more to go on.  

    Quis Custodiet Ipsos Custodes? (i.e., Who will Police the Police?)

    As noted in a previous post, it would have been hard (possibly near impossible, depending on the circumstances) for Coca-Cola to prevent the theft of the laptops. The computers at the center of the breach – 55 of them, stolen from Coca-Cola over a period of six-and-a-half years – were meant to be disposed of… and the thief, another Coca-Cola employee, Thomas William Rogers III, was the person responsible for disposing of them.
    It was later reported that Coca-Cola only became aware of the situation when it received an anonymous tip:
    On November 17, 2013, the anonymous caller contacted Coke security and reported the company owned equipment was going to be moved at any moment due to a big fall out between the employee [Rogers] and his wife.
    Most of the computers were found in Rogers's home, but a number of them had been given to acquaintances as well. The media reported that all the stolen machines were eventually recovered (though not necessarily so, according to court documents).
    Sensitive personal information like Social Security numbers and driver's license numbers were found when the company performed forensic data analysis on the machines, triggering data breach notification laws.
    A number of months later, Enslin and his family found themselves mired in identity theft problems while on vacation. The problems gradually snowballed, with criminals using his information to obtain a job, purchase thousands of dollars of merchandise, attempt changes of address to further scams, and so on.

    Contradictions and Errors

    Data breaches, no matter how straightforward, always contain an element of uncertainty in them and the Coca-Cola situation is no different.
    Initially, the media reported that 53 laptops were involved in the data breach. At some point, that was corrected to 55 laptops. The interval of the breach also increased, from 5 years to more than 6 years. Also, it's been reported that the stolen machines' prices ranged from less than $500 to $2,500, leading one to ask whether it really was only decommissioned laptops that were stolen.
    Perhaps the confusion originates from Rogers himself, who,
    In a written statement to [Coca-Cola], Thomas Rogers stated he had "a couple of boxes full of laptops" but "didn't know how much equipment he had" because he had been accumulating it for five years
    The implication, then, is that there could be more laptops out there, even if it was reported that all computers were accounted for. The company admitted as much to the courts:
    After it learned of the breach, Coca-Cola sought to recover its missing hardware, and while it has "a very good feeling" that it has been able to recover it all, it cannot say for sure.

    Could Have Had Better Security

    In light of the above, could Coca-Cola be accused of being lax in its responsibilities? It would be hard to argue otherwise.
    It could have easily prevented a data breach (if not the physical act of the laptop theft itself) by employing disk encryption on any and all computers, be they laptops or desktops. Without the correct password, Rogers wouldn't have been able to access the machines, and so the personal, sensitive information would have been protected.
    Furthermore, the company could have designed their process for retiring computer equipment to include the deletion of the encryption key for each computer prior to giving it up for disposal. By doing so, the data would still be protected if the passwords were obtained by Rogers somehow.
    (And let's not forget that encryption protects companies from data breaches while the machines are being used in everyday life – break-ins and loss/misplacement have been prominent sources of breaches, too).
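    The decommissioning step described above is sometimes called a crypto-erase: once the key is destroyed, the ciphertext left on the drive is unrecoverable no matter who ends up with the hardware. A toy sketch of the principle (the example data is invented; a one-time-pad XOR stands in for the AES used by real disk encryption products):

```python
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    # One-time-pad XOR: applying it twice with the same key decrypts.
    # A toy stand-in for the AES ciphers real products use.
    return bytes(d ^ k for d, k in zip(data, key))

# Invented example of what sat on the stolen machines.
plaintext = b"SSN 123-45-6789; driver's license D1234567"

key = secrets.token_bytes(len(plaintext))  # held by the encryption product
disk = xor(plaintext, key)                 # what actually sits on the drive

# Normal use: whoever holds the key (an authorized user) reads the data.
assert xor(disk, key) == plaintext

# Decommissioning: destroy the key. The ciphertext left on the drive is
# indistinguishable from random bytes, so even someone who later obtains
# passwords or the hardware itself recovers nothing.
del key
assert b"123-45-6789" not in disk
```

    This is why destroying a key only a few dozen bytes long can effectively sanitize an entire drive in an instant, which is also why media-sanitization guidelines recognize cryptographic erase as a disposal method for drives that were encrypted from the start.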
    Also, it is very troubling that this went on for more than six years. The fact that a domestic disturbance is how the breach was uncovered... well, that's just not how well-thought-out security is supposed to work.
    There were some very obvious failures here.
    However, there is "being lax in their responsibilities" in a moral, ethical manner and being so in a legal manner. The courts so far seem to be of the opinion that, in the latter, Coca-Cola was not in the wrong.
    Regardless of what the ultimate outcome may be, one thing appears to be pretty clear: properly securing the data would probably have been cheaper than defending against a lawsuit that's taking more than three years to resolve.
  • Penn Medicine Sending Breach Notifications To 1000 Patients Over Stolen Laptop

    Penn Medicine revealed this past week that a laptop computer with protected health information (PHI) was stolen on November 30. While the details are meager (aside from a short entry at one news site, an online search comes up empty), the following was revealed:
    • About 1000 people were affected.
    • The laptop was stolen from a car at a parking lot.
    • Breached information includes "patient names, birth dates, medical records, account numbers, and some other demographic and medical information."
    • It does not include "Social Security numbers, credit card or bank account information, patient addresses or telephone numbers."
    Penn Medicine promised to review procedures to ensure that patient information is protected on portable devices.  

    What is This, 2009?

    In an age when breaches can – and do – involve tens of millions of people, Penn Medicine's data breach almost feels quaint. And, yet, that's why it's also not so easy to forgive.
    Consider servers with massive amounts of data that are hooked up to the web, and thus, "hackable" by anyone with a decent internet connection, in both theory and practice. Indeed, a small group of network and security professionals are exploring the build-out of a separate, "better" (better secured?) internet, seeing how our current global communications web will be forever playing security catch-up to the bad guys.
    So, even if millions of people are affected by a breach, it's "understandable": it shouldn't be happening, and we feel outraged when it happens, and lawsuits are going to be filed left and right, but we get it: there's very little that can be done unless we redesign everything.
    But when it comes to an individual laptop computer, there is a proven method of ensuring that its contents remain secure in the event of a burglary. It's the same method that led to the Apple vs. FBI face-off two years ago: full disk encryption, a very well established technology that's been around forever.
    Indeed, most hospitals, clinics, and medical practices routinely use full disk encryption to protect not only their laptops but also their desktop computers, which have proven to be less than immune to theft. And larger organizations have been more aggressive and thorough than smaller concerns, in no small part due to lawsuits brought by the federal government.
    For example, BlueCross BlueShield of Tennessee knows that they should encrypt any hard drives that are used to store phone call recordings, an insight they obtained after being embroiled in what was one of the largest data breaches in history at the time.
    This lesson was learned in 2009.
    So, when one reads, in 2018, that one of the bigger hospitals in the US is looking to review their procedures to ensure that patient information is protected on portable devices... it sounds tone-deaf. Technically, as a HIPAA covered-entity, they should be doing this periodically or whenever security conditions change.  
