

AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size that want a scalable and easy-to-deploy solution.  Centrally managed through a web-based console, AlertBoot offers managed services for mobile device management, mobile antivirus, remote wipe & lock, device auditing, and USB drive and hard disk encryption.

December 2011 - Posts

  • Medical Data Security: Will 2012 Finally See the Final Rule To HITECH?

    The Senate Judiciary Subcommittee on Privacy, Technology, and Law recently put a number of questions to the Department of Health and Human Services (HHS) and the Department of Justice (DOJ):

    • What's holding back the Final Rule to the HITECH Act amendment to HIPAA?
    • Why is enforcement of current medical privacy laws so lackadaisical?

    While the Interim Rule has been successful in certain areas -- the use of medical data encryption technology has reportedly increased significantly in medical settings, and covered entities are reporting their data breaches to the HHS -- other areas are lagging as medical organizations wait for a definitive, final ruling.  The lack of enforcement, of course, also dissuades them from fully following the Interim Rule, which, despite its name, is the current law.

    HHS, DOJ Respond

    The HHS and DOJ, for their part, have noted that they are "ramping up" enforcement and pointed to past actions:  in 2010, CVS and Rite Aid settled charges with the government, and Massachusetts General Hospital was fined a record $1 million earlier in the year.

    As to the final HITECH rules, a specific timetable has not been revealed.  However, their need and importance have been underscored by various people who've appeared before the committee.  It was pointed out that the regulations (the final rule) are necessary for progress to be made.  In fact, one person testified that

    The lack of a final HITECH Rule is “a big reason” why the HIPAA Privacy and Security Rules are not being effectively enforced.  In response to a question by Senator Richard Blumenthal (D-CT), Myrold [Privacy Officer for the Hennepin County Medical Center] speculated, “Until we actually get those final rules and people know that they’re going to actually be enforced, you’re probably not going to see a lot more compliance.”

    That makes sense.  Consider the uptake of encryption software like AlertBoot after the last Interim Rule was passed.  Although nothing is final, covered entities have deployed it with some measure of alacrity because:

    • The Breach Notification Rule requires going public with a data breach unless protected health information (PHI) is protected with encryption
    • HIPAA's breach notification harm threshold was eliminated (the harm threshold essentially put the breached entity in charge of deciding whether a breach was significant or not).  My mistake: the harm threshold has not been eliminated yet.  Supposedly, it's the reason we still have an Interim Final Rule, and under it, the "significant risk of harm" standard still applies

    In this particular case, it didn't take a genius to see what would be included in the Final Rule and what wouldn't be, especially considering the development of similar rules or laws, federal as well as state-level, that applied in other areas, such as finance and government.

    Furthermore, a final law implies finality.  People wouldn't be able to make excuses that they were waiting for a definitive law.  Accounting and procurement departments wouldn't be hamstrung trying to guess whether today's investments and expenditures will mean write-offs (and possibly their heads) tomorrow because of changes in legislation.

    Will we finally see the Final Rule in 2012?  Hopefully.  But it goes against history.  My research shows that the original Final Rule for HIPAA's security of electronic health information was eventually published in 2003 with a compliance date of October 16, 2003, a full seven years after the Health Insurance Portability and Accountability Act of 1996.

    Related Articles and Sites:

  • Data Encryption Software: Using Your Bum As A Password?

    Researchers at the Advanced Institute of Industrial Technology in Tokyo have developed a new way to identify people: 360 sensors embedded in a seat measure the pressure profile of a person when he or she sits down.  The system has a 98% identification accuracy rate in the lab.  Besides the obvious application in cars, there are ruminations about using it for security identification in office settings.  Can you imagine unlocking your computer's disk encryption with your butt?

    Developed as Car Anti-Theft Device

    The car-seat identifier uses 360 sensors to measure a person's pressure profile, each sensor reading the pressure on a scale of 0 to 256 (update: yeah, I know it's probably 0 to 255.  I just reports it as I sees it...literally, in this case).  From the figures I've seen, it's a matter of creating a pressure contour line.  In other words, a virtual ass groove (all the articles I've read refer to it as measuring the posterior, the rear, etc.  Let's not beat around the bush: an ass groove is what it is).
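    To make the idea concrete, here is a minimal sketch of how a seat-based matcher might work.  This is purely my own guesswork: the 360-value profile format, the normalization step, and the tolerance threshold are all hypothetical, not the researchers' actual algorithm.

```python
import math

SENSOR_COUNT = 360  # sensors embedded in the seat, per the article

def normalize(profile):
    """Scale a raw pressure profile so total pressure sums to 1.0.

    This makes the comparison less sensitive to overall weight
    (heavy coat on or off) and more about how weight is distributed.
    """
    total = sum(profile)
    return [p / total for p in profile]

def matches(candidate, enrolled, threshold=0.05):
    """Compare two 360-sensor profiles by Euclidean distance.

    Returns True when the normalized profiles fall within the
    (hypothetical) tolerance threshold.
    """
    a, b = normalize(candidate), normalize(enrolled)
    distance = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return distance <= threshold

# Toy example: an identical profile always matches itself.
enrolled = [128] * SENSOR_COUNT
assert matches(enrolled, enrolled)
```

    Normalizing by total pressure is one way to keep the match about the contour rather than the absolute weight -- which is presumably the whole point of a contour-based identifier.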

    I guess using your butt to identify people shouldn't come as a surprise.  I mean, I remember noticing at the beach how everyone's butt prints looked different.  If only I had used that insight to do some research....

    Besides scientific confirmation that 98% of rear projections are unique, I've also learned that

    They say that traditional biometric techniques such as iris scanners and fingerprint readers cause stress to people undergoing identity checks, while the simple act of getting seated carries less psychological baggage. Their other point is that other technologies such as fingerprint scanning can be compromised when sensor surfaces are unclean, or when there is poor lighting as in iris scanning, contaminating results.

    I knew about the iris scanning and the fingerprints being unreliable, but I had no idea that it caused "stress."  Well, except when they don't work as they're supposed to.  I swear, the annoyance and frustration of having to rescan the same finger over and over, at different rates, pressures, etc. can become a little stressful, especially if you're in a hurry.

    Outside the Lab

    This security solution(?) is fraught with problems that generally aren't tested for in a security lab.  Just off the top of my head, I can think of:

    • Weight changes
    • Musculature changes
    • Armed car-jacking
    • Scamming
    • Lending your car

    Weight changes.  If you go on a diet or gain weight, your weight changes.  If you get pregnant, your weight changes.  If you decide to leave your wallet behind, your weight distribution changes.  Granted, all biometric identification schemes require a bit of fuzziness, but make it too big and the security solution is practically worthless.  This is the reason why fingerprints and irises are used as identifiers: not only are they unique, they're generally unchanging.

    Musculature changes.  Let's say you decide to get in shape.  Usually, there will be a weight change.  But, even if there isn't, your musculature can change as well.  Working out your glutes or your back will affect how you sit and how your weight gets distributed.

    Car-jackings.  I'm assuming that the butt identifier is used once, at the beginning of a drive (say, to start the engine).  Using it on a continuous basis -- as in spot checks or continuous, real-time feedback while the car is moving -- could result in life-threatening problems.  For example, your dog decides to jump in your lap while you're driving, changing your seated contours and cutting off your engine.

    If this assumption is correct, all a criminal has to do to steal a car is lie in wait and then threaten the driver with a weapon: force the victim to sit and start the car, then get out.  Steal the victim's cell phone while you're at it (so the cops can't be called as fast), and chuck it out the window a mile down the road (so you can't be tracked).

    Scamming.  This is where you've really got to let your creative juices flow.  I can already imagine a scenario where enterprising criminals install these pressure sensors in a bunch of body massage chairs.  The chairs are placed near the entrances and exits of a mall's parking garage, and their use is offered free of charge.  People get their massage; car thieves get their butt profiles.  A 3-D printer is used to create a mold, and the car is stolen while the victim is still shopping.

    There are kinks, such as having to match driver to car, but nothing insurmountable.

    Lending your car to other people.  Of course, for some, this is not so much a problem as a solution....  On the other hand, what if you have a heart attack in the parking lot, your passenger is right there, and the building next door is a hospital?

    Despite what I've written above, I can see how gluteal biometrics could make your life a little bit more convenient.  Imagine, for example, your car settings (wheel and seat positioning) adjusting themselves based on who you are, not whose car keys you grabbed in the morning.

    As for computers...well, this would require that the seat and the computer be linked somehow.  If the trend of smaller and lighter computers means anything, I don't really see much of a future for this as a password alternative.

    Related Articles and Sites:

  • Encryption and Traveling: EFF's Tips Regarding Data Security And Cross-Border Travel

    The Electronic Frontier Foundation (EFF) has published a timely article for the holidays: "Defending Privacy at the U.S. Border: A Guide for Travelers Carrying Digital Devices".  In it, you'll find a number of recommendations and explanations regarding travel from and to the US, including:

    • Why and how the US government searches devices
    • What issues to consider when protecting data
    • Why backups are important

    And other issues.  Among the tips: the use of encryption software to protect sensitive data.

    Encryption: Useful Even If Cooperating with Border Agents

    EFF starts off the section on encryption by noting that disk encryption software protects your data if your computer is lost or stolen, so "it's a useful precaution even for people who plan to cooperate completely with border agents' requests for assistance in inspecting devices."  Personally, I'd say it's an odd thing to point out, since encryption software was not developed as a way to stymie border agents....  Its purpose, after all, is securing data, full stop, not securing data against border agents specifically.

    Anyhow, the section quickly breaks into a number of issues to consider when using encryption.

    • Account passwords vs. full-disk encryption.  How they differ.  How password-protection can be bypassed.
    • Choosing disk encryption over file encryption. Why whole disk encryption is preferable over protecting individual files.
    • How to choose a secure password.  Or, rather, passphrase.  EFF notes that it built a device to crack passwords in 1999.  It could try 2^56 possibilities in under three days.  This means passwords of nine letters or fewer could be compromised in less than a week.
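    For what it's worth, the arithmetic checks out.  A back-of-the-envelope calculation (my own, assuming a nine-character password drawn only from upper- and lowercase letters):

```python
# EFF's figure: 2^56 possibilities in under three days implies a
# guessing rate of roughly 2.8e11 tries per second on 1999 hardware.
tries = 2 ** 56
seconds = 3 * 24 * 60 * 60
rate = tries / seconds

# A nine-character password over upper- and lowercase letters:
keyspace = 52 ** 9  # ~2.8e15 combinations

# Worst-case time to exhaust that keyspace at the 1999 rate:
hours = keyspace / rate / 3600
print(f"{hours:.1f} hours")  # a few hours -- well under a week
```

    And that's with 1999-era hardware; today's machines are far faster, which is why the EFF (and this blog) push passphrases over passwords.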

    The contents of the encryption section are not too different from the issues and observations found on this blog.  They are, however, consolidated in one nice article.

    The entire article is a very good read -- not just the section on encryption -- and the list of recommendations and tips, while some might consider it commonsense, is a timely and welcome reminder of what to do (and not to do) if you'll be traveling during the holidays.

    Related Articles and Sites:

  • HIPAA Wall Of Shame: More Than Meets The Eye

    Color me surprised: the number of breaches listed at the Department of Health and Human Services website is understated, possibly on a massive scale, according to a recent news article.  As many know, the Breach Notification Rule under the HITECH Act mandates notifying patients when PHI is lost or stolen and is not protected with encryption software.  Hence the increased use of laptop encryption software like AlertBoot in medical settings such as hospitals and clinics.

    The HITECH Act also requires the HHS to publicly post, somewhere on its website, any breaches that involve 500 or more patients.

    First, there are "tens of thousands" of breach reports pending release as the OCR investigates them.  This doesn't really come as a surprise, since many have pointed out (and I've personally experienced) instances where a HIPAA breach makes it to the news but a checkup on the Wall of Shame doesn't show anything.  In fact, the figure of "tens of thousands" isn't surprising because the HHS is supposed to get an annual consolidated report of any breaches involving fewer than 500 people from each covered entity that is affected.

    What is surprising, though, is that they're being "hidden" while being investigated (hidden as in, the HHS won't release copies to journalists, etc.).  What's to investigate?  I mean, is someone going to claim a breach when there wasn't one?

    Second, and this is what surprises me, the OCR's posted breach numbers are low.  Again, from the same article:

    A Nov. 4 public notice on a breach reported by the UCLA Health System states that "some personal information on 16,288 patients" was stolen, but the wall of shame lists the "individuals affected" in the UCLA incident as 2,761.

    UCLA spokeswoman Dale Tate said in an e-mail that the nearly six-times-larger number in its notice "represents the number of individuals who had some information on the hard drive," while the 2,761 figure sent to the OCR "represents the number of people that met the specific criteria" under the federal breach notification rule.

    Under the federal rule, Tate says, "the information for these individuals could possibly cause more than a minimal amount of financial, reputational or other harm." Information on the rest of the individuals, Tate said, did not meet the criteria.

    I wasn't aware that the HIPAA / HITECH guidelines had a set of criteria for protected health information.  As far as I can tell, pretty much any data a medical organization collects from a patient is PHI -- be it their SSN, medical ID, address, phone number, hospital room number, eye color, etc.  There's very little that's not PHI.

    In fact, as I recall it, the rule on what's not PHI is not about "what type of data is it" (ID numbers, account numbers, medical tests, etc.) as much as it's about data protection: encrypted PHI is, for all intents and purposes, not PHI.  Likewise for anonymized / deidentified data.  And destroyed data, of course.

    Otherwise, it's all PHI.  I don't see why UCLA is reporting two sets of numbers.  The last time I heard about such practices, a person went to jail.  In Alcatraz (yeah, I know, it's not the same thing.  I'm just saying...).

    Related Articles and Sites:

  • Data Encryption Software: 3000 Affected By Laptop Theft, Dept. Of Human Services Gateway Center

    The Department of Human Services (DHS) Gateway Center in Springfield, Oregon, is contacting approximately 3,000 people whose private information was breached when a laptop computer was stolen.  The device was protected with encryption software (and password protection), making the notification a bit unusual.

    Staff and Others Affected

    DHS has notified people who were fingerprinted at either of the following locations that they are at risk of identity theft.

    • Willamette Street (Eugene, August 2008 - August 2010)

    • Gateway Center (Springfield, August 2010 - December 2011)

    The people affected include "DHS staff, volunteers, foster parents, adoptive placements, respite providers and in-home care providers."  The type of sensitive information involved was not revealed.

    Again, the laptop was protected (I presume that full disk encryption software was used), so the risk of a data breach is actually very low -- unlike in the claims I explored earlier in the week, where the use of password protection alone led some "breachees" to believe their risk was low.

    Oregon has Data Protection Law, Encryption Safe Harbor

    Oregon has a personal data breach disclosure law on its books.  Because of the DHS's actions above, I thought it was one of the few states that didn't extend safe harbor to digital data breaches even when cryptographic solutions are used to protect data.  It turns out that this is not the case: Oregon provides an exemption from sending these notifications if the data is encrypted, as long as the encryption key is not compromised.

    I'm not sure what to make of it.  Obviously, one explanation is that DHS really sent those notification letters "in an abundance of caution," a statement that's perfectly justifiable for them to use since real data protection was in place.

    Less rosy reasons could be that:

    • The laptop's encryption was hamstrung in some way.  For example, the encryption key or the password was compromised (e.g., written on a Post-It note)

    • An insufficiently strong encryption algorithm was used.  While there are many ways of encrypting data, the only truly valid one is strong encryption.  Depending on the criteria, a particular way of encrypting data could be deemed weak encryption, meaning it can be brute-forced in a short period.  Currently, AES-128 and above, and their equivalents, are considered strong encryption.

    • The laptop was stolen by an insider who has the password and is suspected of having ties to identity thieves.  This is a subset of the first bullet point above.
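    To put the weak-versus-strong distinction in perspective, here's some rough arithmetic of my own comparing DES's famously brute-forced 56-bit keyspace with AES-128's, assuming a hypothetical attacker far faster than anything available today:

```python
rate = 1e12  # hypothetical: one trillion key guesses per second

des_keyspace = 2 ** 56    # DES: brute-forceable, hence "weak" today
aes_keyspace = 2 ** 128   # AES-128: currently considered strong

seconds_per_year = 365 * 24 * 3600

des_years = des_keyspace / rate / seconds_per_year  # a fraction of a year
aes_years = aes_keyspace / rate / seconds_per_year  # astronomically long

print(f"DES:     {des_years:.4f} years")
print(f"AES-128: {aes_years:.2e} years")
```

    In other words, "strong" isn't marketing speak: the gap between a 56-bit and a 128-bit keyspace is the gap between a day of work and many times the age of the universe.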

    Related Articles and Sites:

  • Drive Encryption Software: St Charles Health System Laptop Recovered

    About a week ago, I commented on the theft of a laptop computer from St Charles Health System.  An unencrypted (my assumption) but password-protected (factual) computer was stolen from an employee's car.  As readers of this blog know, crypto solutions like AlertBoot's drive encryption software are the only way to tilt the odds in favor of PHI remaining secure in the event of a computer theft.

    Well, the laptop computer was recovered.  According to news reports,

    The laptop was found in brush by an 8-year-old girl riding horseback near Horse Butte at the end of November. It was returned to the hospital by the family Dec. 16.

    Good news for Charlie.  I guess they can write another 145 letters explaining the situation and relaying the good news to patients.

    I did a search on Google Maps, and it looks like the girl may have found the device near a riding facility, seeing how the story involves an 8-year-old (I'm assuming her horse (pony?) is not her own, or at least not kept at her house; a professional facility must be involved) and a place called Horse Butte.

    Still Believe Computers are Stolen for the Hardware?

    Aside from the fact that a little girl found the computer in the bushes in the middle of nowhere, the conclusion to this saga is interesting for what did happen.

    First, password protection appears to have stopped an irrefutable data breach from occurring (as opposed to assuming it's a low-risk situation because there's no way to know whether data was accessed or not -- not a problem you get with the use of whole disk encryption software).  Well, kind of.

    The problem with not having adequate security is that it allows so many unlikely scenarios to unfold.  For example, what if the laptop was originally stolen by a data thief who made a bit-by-bit copy (called ghosting) of the computer's contents and left it on a public bench; the laptop was then picked up by a less savvy person who tried to gain access to it; and it was left behind when this second "hacker" couldn't get in?  That could explain the forensic results.
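    For the curious, "ghosting" is nothing more exotic than a raw byte-for-byte duplicate.  Here's a file-level sketch of my own (a real attack images the raw disk device itself, which bypasses account passwords entirely; full disk encryption is what renders such a copy useless):

```python
import hashlib
import shutil

def sha256_of(path):
    """Hash a file's full contents."""
    with open(path, "rb") as f:
        return hashlib.sha256(f.read()).hexdigest()

def ghost_copy(source, destination):
    """Make a byte-for-byte copy of a file and verify it by hash.

    The hash comparison shows the copy is indistinguishable from the
    original at the byte level -- the OS's password protection never
    enters the picture.
    """
    shutil.copyfile(source, destination)
    return sha256_of(source) == sha256_of(destination)
```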

    Certainly, it's a remote possibility.  As remote as a hospital computer being found by an 8-year-old gallivanting around on a horse.

    Second, here we have a scenario where, no matter what actually transpired, it's hard to argue the theft was about the hardware: a computer was stolen; an attempt to access it was made; and the device was tossed away when that wasn't possible (or, at least, we believe the data wasn't accessed.  Logs can be modified).

    The entire ordeal is patently bizarre: could this be the equivalent of a joyride?  A computer is stolen because some kid wants to check his Facebook account; he finds he can't get past the password and dumps it because he can't take it home -- and what does he care, it's not his computer.  Would that cover what happened here?

    How does one protect against such a thing?  Data-wise, you use encryption software.  Hardware-wise...can't help you there.  For starters, though, don't keep your laptop in an easy-to-steal place like your car.

    Related Articles and Sites:
