AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

June 2011 - Posts

  • Disk Encryption Software: Ohio PASSPORT Program Breach

    The Ohio District 5 Area Agency on Aging has announced a data breach that affects 78,000 people in total, if I'm reading things correctly.  The breach was triggered when a laptop computer was stolen.  The implication seems to be that hard drive encryption like AlertBoot was not used to secure protected health information.

    Car Theft

    The laptop computer was stolen from an employee's vehicle on June 3.  The employee was a case manager, and the car was parked at a library in downtown Mansfield.  The computer contained personal health information (PHI) on 43,000 people; contact information on an additional 35,000 "related clients' personal representatives" was also on it.

    Just what type of patient information was lost (PHI can range from medical diagnostics, to patients' names, to their credit card numbers) has not been specified.

    PASSPORT stands for "Pre-Admission Screening Providing Options and Resources Today."  In a nutshell, it provides alternatives to nursing homes and other forms of age-based institutionalization (I hate to put it that way, but that's what it is, essentially).  It is funded via Medicaid, which means that the loss of this laptop has HIPAA / HITECH repercussions.

    Or does it?  The agency director has been quoted as saying that the computer requires two passwords, and hence would be difficult to hack.

    Password or Encryption Software?

    The problem with the director's statement is that it doesn't really reveal anything.  Are the passwords in any way associated with encryption?  Or is it literally two passwords?  If the latter, it could be that there is less security than the director believes there is.

    After all, if the use of passwords alone were considered "security," why is it that safe harbor is not extended under the HITECH Act when patient data is lost or stolen, whereas the use of medical data encryption does afford safe harbor?  (By safe harbor, I mean granting a breached entity the option to not send notification letters.)

    Is it because of the encryption lobby?  Nope.  It's just that encryption today secures any sensitive data that is worth securing, not because of some conspiracy, but because encryption works.
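    To make the password-versus-encryption distinction concrete, here's a toy Python sketch.  It is illustrative only, not real cryptography: the record, the password, and the XOR keystream are all made up for the example.

```python
import hashlib

# Toy sketch, NOT real cryptography.  A software "password gate" only
# decides whether the application shows the data; the bytes themselves
# sit on disk in the clear, readable by pulling the drive or booting
# from other media.  (The record and password here are hypothetical.)
stored_hash = hashlib.sha256(b"hunter2").hexdigest()
data_on_disk = b"Patient: J. Doe, DOB 1950-01-01"

def software_gate(password):
    """Return the record only if the password matches."""
    if hashlib.sha256(password).hexdigest() == stored_hash:
        return data_on_disk
    return None

# An attacker who sidesteps the software simply reads the raw bytes:
assert data_on_disk.startswith(b"Patient")  # no password needed

# With encryption, only ciphertext is stored; without the key the raw
# bytes reveal nothing.  Sketched with a toy XOR keystream for brevity.
key = hashlib.sha256(b"hunter2").digest()
ciphertext = bytes(b ^ key[i % len(key)] for i, b in enumerate(data_on_disk))
assert ciphertext != data_on_disk  # raw bytes no longer expose the record
```

    The point of the sketch: two passwords on top of the gate change nothing about the bytes on the disk, while encryption changes all of them.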

    Related Articles and Sites:

  • Hard Disk Encryption For Non Profits in the UK

    Two charities out of the UK have signed Undertakings with the Information Commissioner's Office (ICO): Asperger’s Children and Carers Together (ACCT) and Wheelbase Motor Project.  In both cases, laptop encryption software like AlertBoot was not used to protect sensitive information, resulting in a data breach when their respective notebook computers were stolen.  Summaries of those incidents follow below, but first....

    Non-Profit Encryption Software: AlertBoot (and Probably Others) Offers Discounts

    There are substantial legal differences between non-profits and their for-profit counterparts.  However, there are certain areas where the law applies equally, too.  For example, the law wouldn't treat employees at a non-profit front for the mob any differently from a for-profit that acts as a front.

    Likewise, the law applies equally to both non-profits and for-profit corporations when it comes to data breaches and data security.  As far as I know, this is true regardless of where you are, be it the UK, the US, Canada, Mexico, whatever.

    I'm still quite apoplectic over the incident that prompted me to write up why Nevada's non-profits need to abide by the Nevada Personal Information Security Law.  In fact, I don't see why I even had to come up with that trail of proof: it's common sense.  Sensitive data is sensitive data, no matter who loses it.  Why would a non-profit think that they don't have to comply with data security regulations just because they're not making a profit?  I mean, do they also think they don't have to follow employee-rights legislation?

    Of course, all of the above being said, it doesn't mean that a non-profit must bear the same costs that a for-profit does.  There are plenty of companies, including AlertBoot, that will extend a discount on their products and services to non-profit organizations.

    (If you are a non-profit and are interested in AlertBoot, contact us and let us know you're a non-profit.)

    Granted, using security products probably means that you're diverting resources into something that feels less worthwhile, especially considering your mission.  However, make no mistake, a data breach affects for-profits' and non-profits' clients alike.  I'd hate to find, for example, that children suffering from Asperger's Syndrome also have to fend against ID theft and fraud because my organization didn't have adequate data protections in place.

    Asperger’s Children and Carers Together

    ACCT reported a breach to the ICO after a laptop holding sensitive information on 80 children was stolen from an employee's home over Christmas.  The computer held children's names, addresses, and dates of birth.

    Per the Undertaking, we know that ACCT has installed encryption software on its computers since the lamentable event.

    Wheelbase Motor Project

    Wheelbase reported a theft from its offices: an unencrypted external hard drive was stolen, impacting 50 people.  The drive was used as a backup and contained details on criminal convictions, racial backgrounds, special education needs, and child protection issues.  The incident took place in February 2011.

    Wheelbase "intends to complete encryption of all back-up devices by 4 March 2011," per its Undertaking agreement.  That's a weird statement to be making, unless they're implying that desktop computers and laptops were already encrypted to begin with.

    If the implication is not true, then there is the other implication that backups will be encrypted while computers will not, which is quite the feebleminded decision.  I'm hoping it's a misquote due to the formatting of the Undertaking itself.

    Regardless, the ICO is requiring in the Undertaking that any portable devices that contain sensitive data are properly secured with encryption.

    Related Articles and Sites:

  • Data Encryption: Staples Business Depot (Canada) Terrible At Wiping Data. Why Are They Responsible?

    An audit report by Canadian privacy commissioner Jennifer Stoddart laments the fact that Staples Business Depot has failed to get a grip on its continuing data breaches.  This is one instance where the use of drive encryption software like AlertBoot doesn't make sense (kind of).

    Not Really Staples's Fault

    When a story involves a giant corporation, customer data, and data breaches, it's usually the corporation that is in the wrong.  In this particular story, Staples is at fault, as one would assume; however, I cannot bring myself to blame them.  Ultimately, they are having data breaches because their customers are being idiotic.

    The official travesty on Staples's part is this:  despite the implementation of new procedures, an audit of resold used computer equipment shows that 1/3 of products for resale contain sensitive data.  It's not quite clear how many of these products originally contained sensitive data, which leads me to wonder: how many products are successfully scrubbed of data?

    For example, are we to assume that 100% of all returned products contain sensitive data, and hence the 1/3 figure means a 66% scrubbing success rate on Staples's part?  Or do 33% of returned products contain sensitive data, meaning that there is a 0% scrubbing success rate?  Most probably, it's a figure in between.
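    The ambiguity is easy to put in numbers.  A quick sketch, where the 1/3 figure comes from the audit and everything else is an assumption:

```python
# Quick arithmetic on the audit figure: 1/3 of items for resale still
# held sensitive data.  The scrubbing success rate depends on an unknown
# the report doesn't give: the fraction of items holding data initially.
def scrub_success_rate(frac_with_data_initially, frac_with_data_after=1/3):
    """Fraction of data-bearing items that were successfully scrubbed."""
    return 1 - frac_with_data_after / frac_with_data_initially

print(scrub_success_rate(1.0))   # every item had data: ~0.67 (the 66% case)
print(scrub_success_rate(1/3))   # only a third had data: 0.0
```

    Any initial fraction between those two bounds yields a success rate somewhere between zero and two-thirds, which is exactly why the audit number alone doesn't tell us much.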

    On the other hand, I'm not sure that I should be asking this question.  The real question is, "why are customers returning stuff to Staples with personal data in them?"

    It's Convenient, But It Shouldn't Be That Way

    It's a weird arrangement.  Why is Staples charged with scrubbing the data?  Probably because it's the most convenient method of ensuring data security.  But it seems to be something of a moral hazard, too.

    Consider a wallet.  Let's say you get a wallet from an on-line retailer.  You use it for a couple of days, placing in it cash, identification, credit cards, etc.  You find out that it's not quite what you were looking for so you return it.  Without removing the cash, ID, and cards.  If stuff gets lost or stolen, whose fault is it?

    Consider another scenario.  You put a used computer up for auction on eBay.  Someone makes a bid and you send the computer off after checking that the money's in your bank (or PayPal) account.  But you don't delete the data on that computer, and soon find that someone is accessing your on-line banking account.  Whose fault is it?

    Of course, the right thing to do in the first case is to return the cash, ID, and cards to the rightful owner.  And, in the second case, the buyer of the computer doesn't obtain the right to do whatever he wants with the data on it.  But, pragmatists will likely observe that in both cases the victims were acting stupidly.

    In Staples's case, you've got a situation where 33% of items for resale still had sensitive data on them.  This means that at least 3 in 10 people don't do anything to scrub their data, or at least don't check whether their data is actually gone after they think it's been deleted.  That's a significant number of people not exercising proper data security.

    I'm sorry to point out that perhaps there is a bigger problem here than Staples's data-scrubbing policies not working.

    Related Articles and Sites:

  • Data Security: The Undiscussed Side Of Manning And WikiLeaks

    This blog generally does not delve into politics, since politics doesn't have anything to do with data encryption software (well, with the possible exception of office politics).  Hence, when we did have any posts regarding Bradley Manning's now infamous WikiLeaks submission, it was from the perspective of how the military could have avoided the situation, and how there's a limit to what they can do.  Whether Manning is a spy, a patriot, a traitor, or just plain cuckoo -- all of that was outside the scope of this blog, and still is, despite the editorial nature of this blog.

    However, I couldn't help but provide a link to this latest piece by The Guardian -- which paints a very sympathetic picture of Manning and a pretty damning one of the military -- and provide some commentary.

    Not Fit for Military Duty

    I'm not going to go into details; you can read it yourself and form your own opinion.  The Guardian's article above, which I'm assuming is forthright, outlines why Manning should not have been in the military.  Based on the details revealed in that article (and accompanying video), the military essentially set itself up for a disaster (and took some small pains to avert one), although I doubt that anyone could have imagined it would have reached such levels.

    Again, not defending Manning for what unfolded, but it seems clear, and not just in retrospect, that he shouldn't have been there: pretty much everyone knew that.  The military brass -- who knows at which level -- decided to take a chance, which brings me to what I want to criticize.

    No Data Security Whatsoever

    Again, assuming that The Guardian is accurate in its article, Forward Operating Base Hammer, where Manning copied the eventually leaked data, didn't have an iota of real data security in place.

    Certainly, they issued personal passwords, and there was physical siloing of data networks, etc.  But sticky notes with passwords everywhere?  People logging on to networks they weren't authorized to access?  Other people witnessing this but saying nothing?

    I can understand that last part, based on my military experience: sometimes you just shut up and live with it because, overall, you're facing bigger problems that require the frail peace and harmony uniting the soldiers, so why risk that over what is a technical violation of the field manual, as long as nobody is getting hurt?

    It holds even truer in a forward operating theater: the enemy is out there, not amongst us.  That's possibly why things were lax inside the base: you might not like the guy, but you trust him (or her) to have your back, and you reciprocate (sometimes you have to, because there's no other option).

    On the other hand, it's this kind of real-world behavior (as opposed to what's supposed to happen per policy) that eventually leads to the defeat of technological security measures, be it a trillion-dollar secure network or a commonplace data disk encryption program.  Hence the existence of policies, and why senior officers are charged with enforcing them (as opposed to getting out in the battlefield with guns blazing).

    It seems to me that, based on principle, there should be more heads rolling than Manning's, WikiLeaks or not.  Any senior officer who knowingly allows physical perimeter security to degrade is personally accountable -- especially if there is a breach.  Think: a spy gets into the base because nobody was doing rounds, nobody was at their post, and the senior officer on duty failed to check up on the soldiers.

    Why is it so different when it comes to virtual security?  Why is no one else in trouble after this incident?

    This is My Rifle.  There are Many Like It, but This One is Mine

    A final parting shot, courtesy of 20/20 hindsight.  According to The Guardian article, the bolt to Manning's rifle was removed "because he was thought to be a danger," two months into his tour in Iraq (a danger to his fellow soldiers, obviously).

    In other words, he was rifle-less.  When this happens, per what I've witnessed, you're relegated to some other duty, like helping out in the mess hall or the laundry room because you're not fighting without a working rifle.  But, by all accounts, this guy is an ubergeek.  His "rifle" is not really his rifle, it's his computer.  Why didn't they remove his computer's CPU and have him scrub pans if they were so concerned?

    The dissonance between the reactions in physical and virtual environments is just astounding.

    Related Articles and Sites:

  • Laptop Encryption Software: NHS North Central London Loses 12 Laptops, Claims Minimal Risk (Updated)

    NHS North Central London has announced that 12 laptop computers were stolen from a storage facility.  Of the computers, one supposedly contained the records for 8.63 million people (unconfirmed).  The computers were not protected with hard disk encryption like AlertBoot, although password-protection was used (which is a diary lock to encryption's bank vault).

    Update (22 June 2011): Per the Sun, this is the biggest data security breach involving the NHS.  Also, a total of 20 computers were initially stolen, but 8 were recovered.  The laptops supposedly cost £10,000 each (that's US$16,000).

    Nothing To Fear?  Sounds Like It...

    NHS North Central London has explained why 8.63 million records were on that laptop:

    London Health Programmes routinely audits a large amount of data to track the care that patients receive. The data did not include patient names.

    This is actually understandable.  I'm sure that people will hear laptop and sensitive data and go into a knee-jerk harangue about portable computers and sensitive data, and how such information shouldn't be saved on it, etc.

    As if desktop computers don't get stolen!  Plenty of them are filched.  And why wouldn't they be?  Today's desktops are tiny.  Plus, laptop sales now outpace desktop sales.  I get the feeling that in 10 years "desktop" computers could be smaller than laptops, if you can find them, that is.  The Mac Mini, for example, already points towards where things are going.

    The rhetoric that sensitive data shouldn't be on a laptop computer is at this point in time, and probably going forward, an antiquated argument.  After all, people have to work on something, and if laptops are what's provided at work, that's what you use.

    Rather, the obvious criticism is that sensitive data shouldn't be stored on an unsecured laptop computer.  Password protection does not count as security.  Instead, encryption software should be used.

    There is one other option, though: deleting the data.  After all, you can't have a breach of any data that is not there.  Which is what NHS North Central London claims they did:

    The data was deleted from the laptop after it was analysed, (sic) and we currently believe there is a very low risk that any data could be recovered from this laptop or that patients could be identified from the loss of this data.

    ...But Can You Prove It?

    If this is true, why do they say there's "a very low risk"?  For a couple of reasons.  First, did they really delete it?  Or does someone just think they remember doing it?  Regardless of what type of data security practices you engage in, being able to prove your data is secure is as important as the security itself.  So, does NHS North Central London actually have proof, or are they pointing towards their policies, which require deletion (and might have been ignored)?

    Second, the data could be recovered unless it was written over.  Generally, data overwrites are done at the end of a computer's life, when one is getting ready to toss the hard disk.  However, there is software out there that will write over individual files when you "empty the recycle bin" on your computer.  If such software was not used, the files could conceivably be recovered.
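    That kind of file shredding can be sketched in a few lines of Python.  This is a simplified illustration, not a guarantee: SSD wear-leveling and filesystem journaling can retain extra copies of the data, which is one more argument for encrypting the disk in the first place.

```python
import os
import secrets

def overwrite_and_delete(path, passes=1):
    """Overwrite a file with random bytes before unlinking it, so that
    ordinary undelete tools recover only noise.  Simplified sketch:
    SSD wear-leveling and filesystem journals may keep extra copies."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)                          # start of the file
            f.write(secrets.token_bytes(size)) # replace contents with noise
            f.flush()
            os.fsync(f.fileno())               # force the write to disk
    os.remove(path)
```

    A plain delete, by contrast, typically only removes the filesystem's pointer to the data; the bytes stay on the platter until something else overwrites them.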

    Of course, with the use of encryption this would be a moot point.

    Related Articles and Sites:

  • Data Encryption Software: Tom's Hardware Also Notes The Need For Longer Passwords

    Another day, another article showing that eight-character passwords are no longer deemed secure.  The beauty of Tom's Hardware is that, aside from noting what should be obvious by now to any reader of general security news, they test things using their own equipment, and go into more depth when it comes to passwords and data encryption similar to AlertBoot (TH uses compression programs that also have the ability to encrypt).

    It's a mighty good read, even if the conclusions arrived at are slightly different from other similar research like this one and this one.

    The one thing that all the different researchers (I use the term loosely) do agree on: the age of eight-character passwords is over.  If I were you, I'd go with the recommendation by the academic and use a 12-character password.
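    The arithmetic behind that recommendation is straightforward.  A back-of-the-envelope comparison, where the 95-symbol printable-ASCII alphabet and the one-billion-guesses-per-second attack rate are my assumptions, not figures from the article:

```python
# Back-of-the-envelope brute-force comparison for 8- vs 12-character
# passwords.  Assumes all 95 printable ASCII symbols and a hypothetical
# attacker testing one billion guesses per second.
ALPHABET = 95
GUESSES_PER_SECOND = 1e9

def years_to_exhaust(length):
    """Years needed to try every password of the given length."""
    return ALPHABET ** length / GUESSES_PER_SECOND / (3600 * 24 * 365)

print(f"8 characters:  {years_to_exhaust(8):,.2f} years")
print(f"12 characters: {years_to_exhaust(12):,.0f} years")
```

    Going from 8 to 12 characters multiplies the search space by 95^4, roughly 81 million times more work for the attacker, which is why the extra four characters matter so much more than swapping in a symbol or two.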

    Related Articles and Sites:
