AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

August 2014 - Posts

  • UK Laptop Encryption: Getting Fined £180,000 For Not Knowing To Turn Encryption On

    One of the classic images from The Benny Hill Show that lingers in many people's minds is that of the blundering police officer.  I always imagined it to be a good ribbing, or possibly intended to spark a bit of outrage in what was a very (seemingly) proper and buttoned-down era.  But when I run across stories like the following, it makes me wonder whether it was the highest form of satire.

    UK Ministry of Justice Fined… Again

    According to many news sites, the UK's MoJ has been fined a significant amount by the Information Commissioner's Office (ICO).  Indeed, the fact that the MoJ has to fork over £180,000 in penalties for a 2013 data breach has been the leading headline of the story, followed by the fact that it was its second data breach in as many years: in 2011, the MoJ was fined £140,000, although the circumstances of that particular data breach were very different from the latest one.

    The 2013 data breach centers on the loss of an external hard disk drive that was used as backup media.  Information on 2,935 prisoners was contained in it, including "details of links to organised crime, health information, history of drug misuse and material about victims and visitors."

    The use of encryption software would have prevented the data breach.  And, following the 2011 data breach, Her Majesty's Prison Service did provide hard drives with encryption:
    in May 2012 the prison service provided new hard drives with the option to encrypt data to all of the 75 prisons across England and Wales
    Excellent!  The system of propagating data security down the line works!  Except…
    the ICO found in its investigation of the back-up hard drive from HMP Erlestoke in Wiltshire that the prison service didn't realise that the encryption option on the new hard drives needed to be turned on to work correctly. (my emphasis)

    In Their Defense, It Can Get Confusing

    If not using encryption to protect information on data storage devices is a blunder, having access to encryption tools and not using them is an even more terrible one.  Unless, it could be argued, one made an effort to use them.  And it seems that, in a weird way, employees at HMPs did: as far as I can tell, the ICO has not indicated that anyone was found with unauthorized storage devices.  The problem has been with the authorized hard drives themselves.

    It is not necessarily ignorant to assume that a hard drive that comes with encryption already arrives encrypted.  For example, Apple's iPads, iPod Touches, and iPhones come pre-encrypted (in fact, you can't turn it off).  One is only required to set a password or a 4-digit PIN to guarantee data security.  Plus, there are numerous external data storage devices that come pre-encrypted.  Again, setting a password is all you need to do.
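    The reason setting a password or PIN is "all you need to do" is that such devices derive (or unlock) their encryption key from the passcode.  A minimal sketch of the general idea using PBKDF2; the iteration count and function name are illustrative assumptions, not any vendor's actual parameters:

```python
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # Stretch a short passcode into a 256-bit key; the iteration count
    # slows down brute-force guessing of short PINs.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 200_000, dklen=32)

salt = os.urandom(16)            # stored alongside the ciphertext; not a secret
key = derive_key("1234", salt)   # even a 4-digit PIN yields a full-length key
```

    The caveat, of course, is that the passcode's guessability, not the key length, bounds an attacker's work, which is why a 4-digit PIN is acceptable only when the hardware also limits guess attempts.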

    The confusion can also extend to encryption software, which is why AlertBoot provides encryption status indicators on endpoints as well as in its management portal, though these are more for monitoring and auditing purposes.

    Regardless, there's a reason why you train people how to use tools.  If the MoJ had truly just distributed the hard drives without giving a second thought to teaching people how to use them, or to conducting a security audit soon afterwards to see whether they were being used correctly, it's only logical that an exasperated ICO is coming down on them with furious might.
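    Such an audit needn't be elaborate.  Properly encrypted data is statistically indistinguishable from random bytes, so sampling raw sectors and measuring their Shannon entropy gives a crude but effective first pass: a drive whose encryption was never switched on scores well below the ~8 bits per byte of real ciphertext.  A minimal sketch, where the 7.9 threshold is an illustrative assumption rather than any product's actual check:

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Bits of entropy per byte: 0.0 for constant data, close to 8.0 for ciphertext."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def looks_encrypted(sample: bytes, threshold: float = 7.9) -> bool:
    # Plaintext documents and filesystems rarely exceed ~6-7 bits/byte;
    # properly encrypted sectors sit very close to 8.
    return shannon_entropy(sample) >= threshold
```

    In practice you would sample many sectors across the disk and allow for blank, all-zero regions, but even a check this crude would have flagged the Erlestoke drive.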

  • Cost of Target Data Breach: $148 Million, CEO's Job, and Depressed Stock

    The story for Target gets worse and worse.  After admitting to a data breach in December of last year, the company has seen some of the worst that a data breach can bring to a company.  Although regularly compared to the TJX data breach of 2005, it seems to me that even TJX didn't have it as bad back then.

    Breach Cost: $148 Million

    The Target data breach has cost the company $148 million, if the company's forecasts are to be believed (previously, other sources put the figure at $200 million and counting):
    "In second quarter 2014, the company incurred gross breach-related expenses of $148 million, partially offset by the recognition of a $38 million insurance receivable," Target said in its earnings report. "Expenses for the quarter include an increase to the accrual for estimated probable losses for what the company believes to be the vast majority of actual and potential breach-related claims, including claims by payment card networks."
    Notice how the language implies these are costs directly related to the data breach itself.  Other costs, such as lost client revenue (sales slid 5% in the Xmas quarter over the previous year while competitors showed growth), falling stock prices (a 19% drop at one point), loss of brand value, etc., are not factored in.  In light of this, outsiders' $200 million figure is probably closer to the mark, if not the lower end of the estimate.
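    For what it's worth, the net figure implied by Target's own numbers is simple arithmetic:

```python
gross_breach_expenses = 148_000_000  # Q2 2014 gross breach-related expenses
insurance_receivable = 38_000_000    # recognized insurance offset
net_q2_cost = gross_breach_expenses - insurance_receivable  # 110,000,000
```

    And that $110 million net is before any of the indirect costs above.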

    In addition, there is the cost associated with the loss of the CEO.  Although the 35-year veteran resigned, many have pointed out that it was only a matter of time before he did so, given the extent of the breach.  Finding a new CEO takes time and money, and a company can founder while the search is ongoing.

    Was the Resignation Necessary?

    While we are on the subject, should it really have cost Gregg Steinhafel his job?  As far as I know, CEOs at retailers do not have IT backgrounds.  How could Steinhafel have prevented or limited the data breach?  The best he could possibly have done was to support the IT department to do whatever is necessary to secure Target's client data.

    Did he?  Or was he instrumental in denying his IT guys the tools and other resources necessary for data security?  Consider the other data breach that Target's is being compared to: TJX.

    One of the stories that struck me about the TJX data breach (it's the first thing I recall on the subject even now, seven years later) is that the company had knowingly implemented weak security as a cost-saving measure, and this had a direct effect on the company being hacked from its stores' parking lots.

    It's a Different Game Now

    As long as we've brought up some details concerning TJX, let's explore a different face of the data breach: the fact that Target has lost sales, and that this seems to be tied to the data breach.  Because competitors saw an increase in sales in the same period, and Target has had a history of outperforming its competitors, it wouldn't be inaccurate to tie the depressed sales to the data breach.

    Consequently, it wouldn't be a far reach to assume that the value of Target's brand has been negatively affected.  Or to assume that the devaluation of the stock price is tied to either or both the lowered sales forecast and brand value / goodwill.

    What's exceptional about all of this is that it's not exceptional at all.  Logic dictates that all of this should happen.  But, this was not what happened to TJX seven years ago.  In their case, they saw increased year-over-year sales revenues.  Their stock price didn't really take a beating; it barely budged.

    All of this went counter to what people were expecting.  At one point, I theorized that TJX was doing fine because (1) the economy was so bad that people needed to shop somewhere, and you really couldn't get lower prices than at TJX (the celebrated retailer Target it was not; the bigger you are, the bigger you fall); and (2) people still had no idea what being a victim of a data breach meant.

    Fast forward seven years, and things are different.

    The Tragedy of the Commons

    There are a number of things about the ramifications of the Target breach that I don't really agree with.  Above all, I think the CEO should have stayed in his position, absent any improprieties regarding data security at the company. (Among other things, I believe that "once bitten, twice shy" holds true when it comes to data breaches, whereas an incoming CEO might think, "eh, it ain't going to happen to me," and thus renew the cycle.)

    For example, consider this scenario: a CEO decides that, for cost reasons, he will not implement laptop encryption on company machines that are used by travelling sales reps who are authorized to store sensitive, personal data on these devices.  This is a hallmark of a CEO who's being shortsighted when it comes to data security.

    It's obviously a bad policy and if a data breach were to occur, the buck should stop at the CEO's desk.  But what if the CEO is very much pro-data security, has done all that he could, but a data breach occurs anyway?  After all, data security is ultimately about risk management.  A systematic risk exists in every data security scenario, one that cannot be eliminated, that can be equated to an act of God under particular circumstances: something that is expected to happen, but that no one knows when, where, how, or to whom.

    If a sinkhole were to open up and swallow a Target store, would you hold the CEO responsible?  After all, it will affect sales figures, which will in turn affect the stock price.  It's a given that survivors will sue the company.  The brand will be tarnished (you should never underestimate the number of people who think that another Target store will suffer the same circumstances due to bad juju).  I get the feeling, though, that the CEO would be fine in this case.

    Of course, you could point towards all the security upgrades Target is currently undergoing and say that it's obvious that the CEO didn't do all that he could do.

    What's also true, however, is that, as far as I know, no other company in the US has undergone the kinds of security upgrades that Target is currently undergoing.  By the end of all this, Target will potentially be one of the best protected retailers in the US, at least for a while.

    Such a project is not easily undertaken without justifiable reason, especially if it costs millions of dollars.  Money at such levels cannot be decoupled from opportunity costs and thus from competitive pressures.  The forces that power the tragedy of the commons are obviously at play when it comes to data security.  At the end of the day, it may require government intervention so that the playing field is leveled and no company feels that it's taking a backward step by giving data security the level of attention it deserves (and others ignore).

  • HIPAA Laptop Encryption: Cedars-Sinai Announces Data Breach Tied To IT Employee

    There is this theory among the HIPAA security crowd that education is as effective as technical solutions, or that it can be even more important.  While not wrong, it depends on what you're looking at.  At the end of the day, it's the technical solutions that will do the heavy lifting when it comes to securing PHI data.  It's also what will increase compliance rates, as the following story illustrates.

    Cedars-Sinai Health System Employee Laptop Theft

    According to information filed with the California Attorney General's office, Cedars-Sinai suffered a data breach when an employee's laptop was stolen from his (or her) home:
    The laptop, which was used by the employee for troubleshooting software used for clinical laboratory reporting, was stolen along with personal items of the employee in a June 23 burglary at the employee’s home. (The employee’s duties included being available outside of normal business hours to troubleshoot software problems as they occurred, which is why the laptop was at the home.)
    As far as I can see, all indications point towards the person being a Cedars-Sinai employee, as opposed to being an outside contractor.  Whether the stolen laptop was the employee's own or a work-issued machine is not readily apparent, although I see suggestions that it might be the latter.  However, the hospital did admit that HIPAA laptop encryption was not used to secure the computer.

    But what I really want to point out is this: data security education can only go so far.

    Even Professionals Make Mistakes

    Why was encryption software not used?  Perhaps the laptop was a personal one, so Cedars-Sinai couldn't touch it.  Or perhaps it was a Cedars-Sinai laptop but it fell through the cracks when it came to encrypting it (the most likely answer).  Or perhaps a commonly made argument was used (not likely in this particular case): because HIPAA doesn't require the use of encryption, it wasn't.  Instead, employees were made to read and sign computer usage policies, and periodic data security sessions were held as an alternative to a technical solution.

    You know, stuff like: employees are not authorized to take PHI, in any form, off the clinic's / hospital's / research center's /what-have-you premises.  The idea is that employees won't do so, especially if they're educated about the potential risks.

    The problem is, education wasn't the problem.  Who could be more cognizant of the potential for a data breach, and the ensuing ramifications, than a guy who works in the hospital's IT department?  And yet, here we are today, reading about it.

    Education has its place, but it tends to work better as a secondary or auxiliary option.  For example, use encryption to secure laptops but make sure that employees understand they shouldn't share passwords (in fact, don't give them a reason to: give everyone their own access IDs) nor display them as notes or Post-Its attached to the device in question.



  • Community Health Systems HIPAA Data Breach Second Largest, Company Has Cyber Insurance

    One of the largest data breach items in the news this week is that of Community Health Systems, a Fortune 500 operator of general acute care hospitals.  With 206 hospitals in 29 states, it's no wonder that the latest HIPAA data breach could be affecting nearly 4.5 million people.  And while the use of encryption software could not have prevented this attack, it looks like a known bug was at the heart of the matter.

    It makes me wonder if CHS will successfully get their cyber-insurance money.

    Chinese Hackers

    According to numerous media outlets, CHS fell victim to Chinese hackers who used a widely known security flaw to steal sensitive medical data belonging to 4.5 million patients.  Community Health Systems is apparently the second-largest chain of hospitals in the US, which not only explains why so many people were affected but also why the hospital chain was a target.

    The information that was stolen includes Social Security numbers, names, addresses, and other sensitive data.

    Heartbleed: A Known Bug

    According to a security consultant who was tipped off about the situation,
    the hackers used the Heartbleed vulnerability to collect user credentials from the memory of a hospital device… and used them to log in through a… VPN. The attackers then extended their access into the company’s network.
    Heartbleed is a flaw that was discovered by a Google engineer.  It was made public in April of this year and caused something of a sensation as it pointed towards a zero-day vulnerability that could affect the whole of the internet.

    Many companies were forced to update their security to stave off potential attacks. (In the case of major companies in the Fortune 500, you can strike off the word "potential."  The attacks were coming, and people knew it).
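    The remediation itself was, at its core, a version check and an upgrade: Heartbleed (CVE-2014-0160) affects OpenSSL 1.0.1 through 1.0.1f and was fixed in 1.0.1g.  A minimal sketch of triaging servers by their reported OpenSSL version (it ignores vendor backports and the short-lived 1.0.2 betas, which complicate real audits):

```python
import re

def heartbleed_vulnerable(version: str) -> bool:
    """True if an OpenSSL version string falls in the affected 1.0.1-1.0.1f range."""
    # Plain "1.0.1" (no letter suffix) predates the fix, so it matches too;
    # 1.0.1g and later, and every other branch (0.9.8, 1.0.0, ...), do not.
    return re.fullmatch(r"1\.0\.1[a-f]?", version) is not None
```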


    A separate report noted that CHS filed a statement with the SEC saying it "won't take a financial hit because of the recent breach of security" since it "carries cyber/privacy liability insurance to protect it against certain losses related to matters of this nature."

    That's great news for the company.  But collecting that money won't be so cut and dried, I imagine.  For one thing, plenty of insurers have gone to court trying to get out of paying on such policies, usually based on a technicality (of course; what else?).  And for good reason, too.  The insurance payouts can be astronomical, and with 4.5 million people affected, you can bet it's not going to be on the cheap side.  Any insurer would be tempted to find a way out of making good on the contract.

    The fact that this attack was (supposedly) carried out using the Heartbleed bug poses some challenges for CHS.  While the bug was made public in April, CHS stated in its SEC statement that "it thinks the breach came in April and June."  While one could argue that the April attack couldn't have been counteracted – implementing a solution across 206 hospitals takes time – making the same argument for the June attack is not as easy.  If it can be proved that other companies had successfully updated their online security to counter Heartbleed, things might get tough for CHS.

    Perhaps the saving grace is that the "hospital device" that acted as a gateway to CHS's database required its manufacturer to provide a patch for the vulnerability.  As long as there wasn't an implementable solution, things would have been out of CHS's control, and it could argue that it had done everything technically possible to secure PHI.

  • HIPAA Laptop Encryption: NYU Langone Had A Laptop PHI Breach In April

    According to reports, NYU Langone Medical Center announced a data breach in June, a little before its July admission to another data breach that affected 8,400 people.  Unlike the latter announcement, though, the June announcement appears to be somewhat outside of NYU's control.  It's a shame that not everyone is getting the message on the importance of medical laptop data encryption, for it's the one solution that would have prevented this data breach.

    Laptop Stolen from Employee Car

    It's a recurring topic, this story of a laptop with sensitive data being stolen from an employee's car.  According to the NYU press release, the breach announced in June (the theft actually took place on April 25; HIPAA requires notification within 60 calendar days, and it looks like NYU came very close to the deadline) arose from a vehicle burglary.
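    The deadline math is plain calendar arithmetic, and it confirms how close NYU cut it: 60 days from the April 25 incident lands on June 24.

```python
from datetime import date, timedelta

breach_date = date(2014, 4, 25)              # the vehicle burglary
deadline = breach_date + timedelta(days=60)  # HIPAA: notify within 60 calendar days
print(deadline)  # 2014-06-24
```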

    In California.  (I don't know whether NYU has a branch out in The Golden State but my guess is that the answer is "no").

    And while "the employee promptly filed a police report with the California police department and notified the Medical Center of the incident," it hasn't been explained why the computer was not protected with HIPAA-compliant encryption.  After all, this is not the first time that NYU has had a data breach involving electronic PHI.

    At first, I thought it was because the device was a personal one belonging to the employee at the center of the data breach, and thus NYU had limited control over what data it carried: "The use and storage of PHI on unencrypted personal devices is strictly prohibited and against Medical Center policy."  Then I realized that this particular statement could have no bearing whatsoever on the case itself, and that the hospital was just giving a general description of its policies.

    The fact that they would have another data breach nearly one month afterwards is certainly a coincidence but one that could very possibly lead the HHS/OCR to closely investigate the situation, as NYU Langone has had more than its fair share of incidents over the past five years.



  • UK Laptop Encryption: ICO Warns Barristers And Solicitors To Secure Information

    The UK's Information Commissioner's Office (ICO) has recently issued a warning to people in the legal profession. According to an August 2014 news release, the ICO has seen a higher-than-expected number of data breaches stemming from the actions of solicitors and barristers.  Normally, the recommendation would be to use data encryption software to secure sensitive files; however, because paper files are involved, easy data security measures are stymied.

    15 Incidents in 3 Months

    The ICO revealed that "15 [data breach] incidents involving members of the legal profession have been reported to the ICO" in the three months preceding August 2014.  It noted that while these numbers don't seem high, legal professionals "carry around large quantities of information in folders or files when taking them to or from court, and may store them at home," increasing the risk of a serious data breach.

    Data Protection Tips from the ICO

    The problem is that these files and folders are of the paper, not computer, variety.  What to do?  The ICO has proposed the following recommendations: 
    • Keep paper records secure. Do not leave files in your car overnight and do lock information away when it is not in use.
    • Consider data minimisation techniques in order to ensure that you are only carrying information that is essential to the task in hand.
    • Where possible, store personal information on an encrypted memory stick or portable device. If the information is properly encrypted it will be virtually impossible to access it, even if the device is lost or stolen.
    • When sending personal information by email consider whether the information needs to be encrypted or password protected. Avoid the pitfalls of auto-complete by double checking to make sure the email address you are sending the information to is correct.
    • Only keep information for as long as is necessary. You must delete or dispose of information securely if you no longer need it.
    • If you are disposing of an old computer, or other device, make sure all of the information held on the device is permanently deleted before disposal.

    What I find interesting is that at least half of these recommendations are actually geared towards electronic data.  I guess, at the end of the day, you can't really give pointers for just the one and not the other.
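    The electronic tips also hide real pitfalls.  The last one, for instance, is harder than it sounds: an ordinary delete only removes the directory entry, leaving the data recoverable.  A minimal overwrite-then-delete sketch for a traditional magnetic disk (the approach is unreliable on SSDs and journaling filesystems, where built-in secure erase, or full-disk encryption from day one, is the better answer):

```python
import os

def shred(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before deleting it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))
            f.flush()
            os.fsync(f.fileno())  # force each pass onto the device
    os.remove(path)
```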

    Plus, what kind of tips could you possibly give for a bunch of sheets collated together or stuffed into a folder, aside from "don't lose them"?
