

AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

September 2012 - Posts

  • Connecticut Data Breach Law Updated: You Are Required To Notify State AG

    Beginning on October 1, 2012, an update to Connecticut law requires firms that do business in the state to also notify the Attorney General of any reportable data breaches.  Of course, the update hasn't completely revamped the state's data breach legislation: reporting the loss of data is still not required if the information has been properly protected with drive encryption software like AlertBoot, which combines both smartphone and tablet device protection with traditional hard disk encryption under one easy-to-use management console.

    Companies that need to comply with the above requirement should keep handy the email address specifically set up by the AG for reporting data breaches.

    Minor Update to 36a-701b, CT's Data Breach Law

    I noted a couple of years ago how CT's data breach law was a breath of fresh air.  As a layperson (read: non-lawyer), I found it one piece of legislation that a child could read and make sense of.

    However, it looks like there was a little oversight regarding one particular matter.  From AG Jepsen:

    "Existing state law directs my office [Office of the Attorney General, OAG] to enforce requirements that companies notify state residents whose personal information may be compromised by a data breach," said Attorney General Jepsen. "However, the law made no requirement that my office be notified, making enforcement difficult. That will change beginning October 1, and I want to ensure that the process for a business owner to report a data breach is as easy as possible."

    Of course, you don't have to use that particular email address.  Any way of contacting the AG is valid, it appears.

    (If I were you, I'd email and then follow up with a phone call to confirm receipt.  It happens rarely, but emails do sometimes end up in the netherworlds of networks.  And I'm not referring to a spam folder: a message can literally just disappear into the network, never to be found again).

    An additional requirement is that "the attorney general also be notified no later than when the affected residents are notified."  Failure to do so is a violation of the Connecticut Unfair Trade Practices Act (CUTPA).

    Why is CUTPA important?  Because it essentially determines what types of penalties a company faces for violating the data breach laws.  Connecticut's breach laws don't specifically mention any penalties; it just links them to CUTPA.

    Interestingly enough, the Connecticut Insurance Commissioner (CIC) had, and still has, much stricter policies.  In a post dated August 31, 2010, I noted that under the CIC's rules:

    • The use of encryption software does not grant safe harbor from data breach notifications
    • Breaches of paper records are also reportable
    • Breaches must be reported within five calendar days of discovering the breach

    It looks like the AG is playing catch-up to the Insurance Commissioner.  On the other hand, information held by insurance companies tends to be highly sensitive in nature, so it makes sense that this particular subset of businesses operating out of Connecticut is more aggressively required to report its data breaches.


  • HIPAA BYOD Security: Massachusetts Eye and Ear Infirmary Pays $1.5 Million To Settle PHI Breach

    Massachusetts Eye and Ear Infirmary (MEEI), a Boston hospital overlooking the Charles River and Longfellow Bridge, has agreed to settle a HIPAA PHI breach case for $1.5 million.  The breach occurred when a laptop computer was lost during a 2010 medical conference in South Korea.  The computer was not protected with laptop encryption software, although it did feature a sort of LoJack for computers.

    Device Tracking and Data Deletion: Sometimes a Little Too Late

    When considering security for BYOD initiatives, many professionals look at remote data wiping and device encryption as top requirements.  This is common sense: devices get lost, and because the odds of recovering them are slim, a method to ensure that the data in said devices is not accessed is a much sought-after solution.

    Laptops are devices as well, and they're part of the burgeoning BYOD trend (if not the pioneer of this awkwardly named trend).  Oddly enough, professionals appear to feel safe just installing tracking software without any encryption whatsoever, despite the higher probability of laptops carrying lots of sensitive data.  In MEEI's case, the laptop held records for approximately 3,500 patients, spanning a twenty-year period.

    The problem with tracking and remote wiping software is that you need a network connection.  If a thief steals a laptop or a smartphone, where is the guarantee that it will connect to a network?  The odds are high for the latter (it is a phone, after all), but for a laptop?  MEEI found out the hard way that it's not guaranteed: it took nearly a month for the laptop to show up on the hospital's radar.

    Snowball Effect: HIPAA Investigation

    The data breach triggered an investigation of MEEI's data security practices by the Department of Health and Human Services Office for Civil Rights, the branch of the US federal government that is charged with enforcing HIPAA and HITECH (which amended and updated HIPAA).

    OCR found that the hospital demonstrated "a long-term, organizational disregard for the requirements of the Security Rule," resulting in a fine of $1.5 million, the maximum civil monetary penalty that can be assessed on a HIPAA covered entity.  Mind you, the penalty was not just for the loss of the laptop computer.  Rather, OCR found:

    six areas of potential past non-compliance which were addressed by Mass. Eye and Ear between October 2009 and June 2010. These areas of potential non-compliance were primarily focused on controls to protect health information accessed or stored on portable electronic devices, such as laptop computers.

    The six problematic areas include:

    • Conducting a thorough analysis of the risk to the confidentiality of ePHI maintained on portable devices,
    • Implementing security measures sufficient to ensure the confidentiality of ePHI that MEEI created, maintained, and transmitted using portable devices,
    • Adopting and implementing policies and procedures to restrict access to ePHI to authorized users of portable devices, and
    • Adopting and implementing policies and procedures to address security incident identification, reporting, and response.

    In addition to the monetary penalty, MEEI agreed to the following:

    In addition to the $1.5 million settlement, the agreement requires MEEI to adhere to a corrective action plan, which includes reviewing, revising, and maintaining policies and procedures to ensure compliance with the Security Rule. An independent monitor will conduct assessments of MEEI’s compliance with the corrective action plan and render semi-annual reports to HHS for a 3-year period.

    MEEI's Disappointment Begets My Disappointment

    My understanding is that the Massachusetts Eye and Ear Infirmary is an excellent medical institution, the latest developments notwithstanding.  While I was disappointed to hear that they had suffered one of the most prosaic data breaches possible, something in their post-settlement statement triggered an even bigger sense of disappointment (my emphasis):

    The review of Mass. Eye and Ear by the U.S. Department of Health and Human Services (HHS) was triggered by the hospital’s proactive self-reporting of a doctor’s unencrypted laptop being stolen while he was traveling abroad in 2010.  Mass. Eye and Ear has no indication that any patients were harmed by this isolated incident.

    Proactive self-reporting?  HITECH's Breach Notification Rule makes it clear that it is the law to report such incidents within 60 calendar days of the HIPAA-covered entity discovering (or being alerted to) the PHI data breach.  The term "proactive self-reporting" tends to imply, in my opinion, that MEEI did not have a duty to report this breach but did so voluntarily, which is clearly not the case.

    From this standpoint, it is no different from PR moves by insurance companies that send breach notification letters to clients, noting that they're doing so "out of an abundance of caution" instead of 'fessing up that they're forced to do so under the Breach Notification Rule.

    I don't think too highly of this particular practice, and I must say I'm a little saddened to see Mass Eye and Ear engage in a similar move.


  • Smartphone BYOD Security: Over 50% Of Android Devices Are Unpatched To Known Vulnerabilities

    The firm Duo Security has released some findings based on usage results from their "X-Ray" mobile app for Android devices.  Their conclusion: more than half of all Android devices are vulnerable to known security issues that could be fixed by applying a patch.  Companies that rely on BYOD security to ensure that their mobile workforce is transmitting data and communicating securely should check the patch status of those devices.

    What is X-Ray?

    According to the FAQ for X-Ray, the folks over at Duo Security developed the Android app to scan "your Android device to determine whether there are vulnerabilities that remain unpatched by your carrier."

    More specifically, it will look at:

    a class of vulnerabilities known as "privilege escalation" vulnerabilities. Such vulnerabilities can be exploited by a malicious application to gain root privileges on a device and perform actions that would normally be restricted by the Android operating system. A number of such vulnerabilities have been discovered in the core Android platform, affecting nearly all Android devices. Even more have been discovered in manufacturer-specific extensions that may affect a smaller subset of Android users. Unfortunately, many of these privilege escalation vulnerabilities remain unpatched on large populations of Android devices despite being several years old.

    The comments section of the blog post announcing the findings shows users claiming that vulnerabilities were not found on certain devices; however, Duo Security noted that:

    Folks on Android >= 4.0.4 [using 4.0.4 or a more recent version of Android] shouldn't be vulnerable to anything that X-Ray currently detects. But, we are adding additional vulnerabilities (eg. WebKit vulns) soon, so check back in a bit!

    So, it's not a result of newer versions of Android being more secure (which, incidentally, they are) but of X-Ray not yet running checks on newer versions of the mobile operating system.

    Incidentally, the app is not available on Google Play, so one must allow downloads and installs from third-party sites (which is, ironically enough, considered an unsafe practice.  It shouldn't be a problem in this particular case, though).

    50 Percent: Conservative Estimate

    The team at Duo Security based the "more than 50%" figure on data culled from over 20,000 devices worldwide.  That Android devices are vulnerable to attacks is not news, of course; many security companies have released their own findings.

    But this is the first study that looks into vulnerabilities that, in essence, shouldn't exist.  After all, the implication is that the application of a patch should nullify these specific threats.

    Incidentally, the figure is considered "conservative."

    Yes, it's a scary number, but it exemplifies how important expedient patching is to mobile security and how poorly the industry (carriers, device manufacturers, etc) has performed thus far. We feel this is actually a fairly conservative estimate based on our preliminary results, the current set of vulnerabilities detected by X-Ray, and the current distribution of Android versions globally.

    They should add in an extra factor: selection bias, the statistical term for errors introduced into a study when the sample is not representative of the overall population.

    In a nutshell, people who tend to run a vulnerability check on their devices tend to be the ones who are security-conscious.  Most people are not security-conscious when it comes to their smartphones and other mobile devices (this has been proven in study after study over the last few years), and will generally not download the app and run a check.

    Thus, the above results can only stem from people who have a tendency to be interested in mobile security, no matter how small that interest may be...people who lean towards applying patches, as opposed to not even knowing what a patch is.  It's quite disconcerting that over half of them are exposed to an easily fixable digital malaise.
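    To make the selection-bias argument concrete, here is a small illustrative sketch.  Every share and probability below is invented for demonstration purposes; none of these numbers come from Duo Security's data.  The point is simply that if scanner users skew security-conscious, the measured unpatched rate will understate the population-wide rate:

```python
# Illustrative only: all shares and probabilities below are made up.
# Each group is a tuple of (share_of_sample, probability_of_being_unpatched).

def unpatched_rate(groups):
    """Weighted average unpatched rate across user groups."""
    total = sum(share for share, _ in groups)
    return sum(share * p for share, p in groups) / total

# Hypothetical population: 20% security-conscious users (mostly patched),
# 80% everyone else (rarely patched).
population = [(0.2, 0.3), (0.8, 0.8)]

# People who bother to install a vulnerability scanner skew heavily
# toward the security-conscious group.
scanner_sample = [(0.7, 0.3), (0.3, 0.8)]

print(f"population unpatched rate: {unpatched_rate(population):.0%}")
print(f"scanner-measured rate:     {unpatched_rate(scanner_sample):.0%}")
```

    Under these made-up numbers, the scanner-based figure (45%) comes in well below the true population rate (70%), which is the sense in which Duo Security's "over 50%" can be read as a conservative estimate.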

    But then, you can't really blame mobile device users.  After all, patches have to be available if users are to apply them, and the conclusion is that carriers, manufacturers, and others are not releasing them as necessary.

    That's not news, though, is it now?


  • Laptop Encryption Software: Edinburgh City Council Adoptee Info Stolen From Consultant

    Edinburgh City Council has had to admit to a breach of children's data when a laptop computer was stolen from a consultant's home.  The computer was not protected with laptop encryption software like AlertBoot.

    Files and Minutes Stolen

    Reportedly, "files and minutes from dozens of reviews" containing sensitive information on foster and adoptive parents were among the data files kept on the stolen computer.  The laptop computer belonged to an independent consultant who reviews the information, but the program is operated by the Edinburgh City Council.  It is being reported that the information could have been "several years' worth."

    A Simple Solution

    The answer to this particular problem is blindingly obvious: the use of encryption software to safeguard the contents of the laptop computer.  A solution like AlertBoot's Mobile Security suite fits the bill: it combines full disk encryption (FDE) with MDM -- which includes antivirus protection, remote lock and wipe, and other features necessary for smartphone and tablet security in a BYOD program -- under one easy-to-use console.

    So why did the council dilly-dally?  After all, it's not as if the loss and theft of laptops, especially by the different councils that operate within the UK, is some well-hidden secret.  In fact, one MSP called the situation "another wake-up call":

    Lothians Tory MSP Gavin Brown called for the council to take stricter security measures.

    He said: "This should serve as another wake-up call to the council about security. While there is always a danger of laptops being stolen, what the council can control is the level of security within the laptop."

    Another wake-up call?  Hitting the snooze button is OK if you have to meet friends for brunch; it's a calamity when doing so has grave repercussions.  I mean, can you imagine a firefighter deciding he needs "another five minutes" when the alarm in the fire station begins to ring frantically?

    What Will the ICO Think?

    It was only last week that the ICO came down on another council, the Scottish Borders Council, for being remiss when it comes to data protection.

    The Information Commissioner very specifically pointed out that the ICO took issue with how the council handled its responsibilities when a third party came into the picture:

    The Data Protection Act requires that, if you decide to use another organisation to process personal data for you, you remain legally responsible for the security of the data and for protecting the rights of the individuals whose data is being processed.

    But Scottish Borders Council put no contract in place with the third party processor, sought no guarantees on the technical and organisational security protecting the records and did not make sufficient attempts to monitor how the data was being handled.

    Are more fines in store?  More importantly, do these monetary penalties even work?  Granted, you can't expect to feel the effects of a penalty handed down less than a week ago.  However, penalties have been handed out for a while now, since November 2010.


  • Laptop Encryption Software: Irish Telecom Company "Fined" €30,000 For Laptop Theft

    Two companies affiliated with Ireland's Eircom Group have been ordered to donate €30,000 to charities for the loss of customer data.  In the final days of 2011, two laptops were stolen at around the same time but from different locations.  Unfortunately, these computers were not protected with data encryption software like AlertBoot.  Furthermore, the companies breached the law in a number of ways.

    Companies Plead Guilty

    I covered the original breach earlier this year.  In summary, two laptop computers containing customer information were stolen from an office in Dublin and at an employee's home.  A total of 7,531 people were affected.

    The use of encryption software would have ensured the confidentiality of the information -- which included copies of IDs and bank account details -- but neither laptop was protected properly.  This is especially surprising for the laptop that was stolen from the employee's home.  After all, there are laws that essentially forbid the practice.

    The companies also broke the law by not reporting promptly the incident to the Data Commissioner:

    The Office of the Data Protection Commissioner says all such breaches should be reported to its office within two working days.

    However, the court was told that in this instance it was only notified of the breach 30 days after the laptops were found to be missing.

    The companies also only started to inform customers of the data breach 38 days after the incident was discovered, with some notifications taking as long as 73 days.  Both companies pleaded guilty to the late disclosure to the Commissioner, which resulted in the companies making donations:

    The judge gave the companies the benefit of the Probation Act, provided they pay a total of €30,000 to the Laura Lynn Foundation and Pieta House by the end of the month.

    That's one weird penalty.  I mean, would the company be able to write off the money as a tax deduction?

    160 Laptops Not Encrypted

    Some surprising details were revealed as part of the trial.  The Eircom Group had over 3,100 encrypted laptops before the data breach took place.  An audit after the incident found 160 unencrypted laptops (which have been encrypted since).

    This means that around 5% of existing machines were left unprotected.  Regardless of how you look at it -- as a percentage or an absolute count -- that's a huge number.  One wonders why they didn't perform an audit before the breach.  After all, the existence of over 3,000 laptops in an organization means that annual audits (at least!) ought to be performed.
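    The arithmetic behind that figure is easy to check, using 3,100 as a stand-in for the reported "over 3,100" encrypted laptops:

```python
# Back-of-the-envelope check of the "around 5%" claim.
encrypted = 3100      # stand-in for "over 3,100" encrypted laptops
unencrypted = 160     # unencrypted laptops found in the post-breach audit

total = encrypted + unencrypted          # laptops covered by the audit
share_unprotected = unencrypted / total  # fraction left without encryption

print(f"{share_unprotected:.1%} of audited laptops were unencrypted")
```

    That works out to roughly 4.9%, consistent with the "around 5%" figure above.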

    Perhaps their laptop encryption solution was not quite geared towards auditing.  AlertBoot was built from the ground up to ensure easy, comprehensive auditing via its integrated reporting; many encryption solutions out there, however, aren't designed in this manner, sometimes because they're built on old technology with its own legacy of problems and limitations.


  • Data Breach Lawsuit: AvMed Laptop Breach Lawsuit To Proceed

    AvMed, the health insurer that saw two laptops stolen from its offices in 2009, and that subsequently revised the number of affected people from 210,000 to 1.2 million, has received some bad news from the Eleventh Circuit Court of Appeals: the court has overturned a lower court's decision and allowed a lawsuit against the firm to proceed.  It's the ongoing saga of a company that didn't quite live up to standards when it comes to protecting sensitive data on mobile devices.

    Ruling: Cognizable Harm Present

    With 1.2 million people affected, it shouldn't come as a surprise that someone "out there" decided to sue AvMed.  Of course, AvMed argued that the lawsuit didn't have any merit.  The courts originally agreed, noting the lack of a "cognizable injury": essentially, the plaintiffs couldn't prove that they were directly harmed by the AvMed data breach incident, which is usually the sticking point when going after a firm that has suffered a data breach.

    Indeed, I had pointed out (as a non-lawyer) that one of the lawsuits had a novel approach to the situation.  Instead of going through the usual exercise of suing the company for losing clients' information -- and getting the suit tossed for a lack of cognizable injury -- plaintiffs accused the insurer of "misleading them":

    ...plaintiffs are saying that AvMed engaged in "misleading advertising" because the insurer claimed that they followed HIPAA when in fact it didn't: the evidence lies in the fact that (a) encryption was not used and (b) the laptops were stolen from a conference room accessible by anyone--including people who shouldn't have access to unencrypted PHI.

    This is an interesting approach.  Prior to reading the suit's details, I was going to remark that there haven't been any successful lawsuits mounted against companies, since you have to prove harm: that your data, stolen from company A, was used to perpetrate a crime.

    This particular approach appears to have failed, seeing how the "court [of Appeals] upheld dismissal of charges of entitlement to relief under Florida law for the claims of negligence per se and breach of the implied covenant of good faith and fair dealing."

    The same court, however, noted that:

    "Plaintiffs allege that they have become victims of identity theft and have suffered monetary damages as a result. This constitutes an injury in fact under the law."

    The ruling also said that despite the length of time between the laptops' theft and the identity thefts, it is plausible the two events were connected. Plaintiffs "have sufficiently alleged a nexus between the data theft and the identity theft and therefore meet the federal pleading standards," said the ruling.

    More specifically, two of the plaintiffs have alleged harm due to AvMed's carelessness when it comes to mobile data protection:

    • Ms. Jauna Curry, whose information was used 10 months after the incident to open a Bank of America account and credit cards (that were used).
    • Mr. William Moore, whose information was used 14 months after the incident to open an E-Trade account that was overdrawn.

    The plaintiffs will still have to show that their financial injuries can be tied back to the AvMed data breach, of course.  And while I personally wonder whether this will be possible -- there are numerous data breaches in any given year, made public and not -- that is something to be established in the courts.  The fact that it appears nearly impossible to prove certainly isn't a valid reason for tossing out a case without its due day in court.

    Proper Data Security at Root of Problem

    The situation has been unfolding for the past three years or so.  While it does take time for cases to wend their way through the judicial system, three years seems like a long time (which, as an aside, is why you want to prevent something like this from happening in the first place).

    And, as we can see from the case being taken to a higher court, the plaintiffs are dead set on seeing things through to the end.  Why so serious?

    The answer may lie in what AvMed did.  Or, rather, what it didn't do.

    • It left the laptop in a conference room which was easily accessible by anyone.
    • The laptop in question was not properly encrypted despite the sensitive data it stored.

    If you store and use sensitive data, encryption software is a must.  And if you as a company are embracing the BYOD trend, then the use of tablet encryption and smartphone encryption is a must, as well as the use of other data security tools like mobile device management (MDM) solutions.

