AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

September 2011 - Posts

  • iPad Theft Leads To Data Breach For Eventbrite

    Eventbrite, a ticketing and event management company based in San Francisco, California, has alerted the authorities to a potential data breach and publicized the incident on its blog.  Two iPads with customer information were stolen.  Thankfully, iPads are always encrypted by default (they use something similar to AlertBoot full disk encryption), and there is the ability to perform a remote wipe.

    One of the Most Straightforward Breach Notifications, Ever

    I read a lot of breach notification letters in my job.  I estimate that I've read at least two hundred of them.  Eventbrite's blog entry is possibly the best and most transparent data breach notice I've read to date.

    You can read it for yourself, but to summarize:

    • Two iPads stolen on September 20
    • Data collected via the "Eventbrite At The Door" app
    • Remote password lock and remote data wipe put into place
    • Authorities and credit card companies alerted

    The breached data includes 1) names and email addresses, 2) email addresses and last four credit card digits, and 3) complete credit card numbers for 28 clients (the data was collected via separate methods).

    It was also noted that the "Eventbrite at the Door" application had a bug which led to the improper encryption of the credit card data for the 28 clients.  The bug is currently fixed.

    All in all, it sounds like the company had a firm grasp of what it had to do in the event of a data breach.  Compared to all the other data breaches I've covered over the years, these guys deserve an "A," despite the circumstances surrounding the grade.

    The Only Problems I Have....

    Or maybe they deserve an "A-".  An "A+" is a grade you get if, obviously, you don't have a data breach.  I do see some problematic statements, though.

    First, there is the issue that encryption was not implemented correctly for the 28 transactions.  I'm not sure what exactly they're referring to (is it encryption for data in motion?  Data at rest?), but if that's true for these 28 transactions, it implies that it was true for previous transactions as well.  I won't say it was true for all transactions, since the bug could have been introduced in an update, effectively breaking the process.

    (It also leads me to wonder, why were they prompted to look into the issue in the first place?  I mean, a data breach generally doesn't lead you to check whether there are bugs in your encryption algorithm.  If I were my usual cynical self, I'd be wondering whether the "bug" was not having encryption in the first place; however, such a suggestion contradicts the levels of candor emanating from Eventbrite's blog.)

    Second issue: will the remote security work?  We don't know if the devices were 3G-enabled or wi-fi only.  If the latter, the remote wipe and remote lock can only work while the device is connected to the internet.  Conceivably, the iPads could be kept in a wi-fi dead zone while attempts are made to access them...although, I've got to admit, I find this highly unlikely.  Apple's iPads are pretty highly prized, and I can definitely see the thieves being after the hardware and not the data inside.
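    The connectivity constraint can be modeled with a short sketch (the class and command names below are illustrative, not any vendor's actual API): the wipe command queues server-side immediately, but only takes effect once the device checks in over a network.

```python
# Toy model of remote wipe: a command issued after the theft sits in a
# server-side queue and is applied only when the device next connects.

class Device:
    def __init__(self):
        self.online = False   # e.g., sitting in a wi-fi dead zone
        self.wiped = False
        self._pending = []

    def queue_command(self, cmd: str):
        self._pending.append(cmd)  # server accepts the command right away

    def check_in(self):
        # Queued commands are delivered only when the device is connected.
        if not self.online:
            return
        while self._pending:
            if self._pending.pop(0) == "wipe":
                self.wiped = True


tablet = Device()
tablet.queue_command("wipe")  # issued right after the theft
tablet.check_in()
print(tablet.wiped)  # False: device is offline, wipe hasn't landed
tablet.online = True
tablet.check_in()
print(tablet.wiped)  # True: wipe applies on first connection
```

    The gap between queueing and delivery is exactly the window a thief in a dead zone could exploit.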

    Regardless, I would have preferred the use of local wipe, where entering the wrong password a certain number of times automatically wipes all data.
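    The local-wipe behavior works without any network at all.  Here is a minimal sketch of the idea (a toy model assuming a simple attempt counter, not Apple's actual implementation):

```python
# Sketch of a local-wipe policy: too many failed passcode attempts and
# the device erases its own data, no server round-trip required.

class LocalWipePolicy:
    def __init__(self, passcode: str, max_attempts: int = 10):
        self._passcode = passcode
        self._max_attempts = max_attempts
        self._failed = 0
        self.wiped = False

    def try_unlock(self, attempt: str) -> bool:
        if self.wiped:
            return False  # a wiped device has nothing left to unlock
        if attempt == self._passcode:
            self._failed = 0  # a correct passcode resets the counter
            return True
        self._failed += 1
        if self._failed >= self._max_attempts:
            self._wipe()
        return False

    def _wipe(self):
        # On a real device this would destroy the encryption key,
        # rendering all stored data unreadable.
        self.wiped = True


device = LocalWipePolicy("2580", max_attempts=10)
for _ in range(10):
    device.try_unlock("0000")
print(device.wiped)  # True: the tenth failure triggers the wipe
```

    Because the trigger lives on the device itself, a wi-fi dead zone offers the thief no protection against it.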

    Apple's iPads Use Encryption by Default

    These are direct quotes from Apple's "iPad in Business: Security Overview" paper.  While the security present in the iPad is not perfect, it's pretty apparent that security was seriously considered during the design of the device.

    "...if the device falls into the wrong hands, users and IT administrators can initiate a remote wipe command to help ensure that private information is erased."

    Remote wipe
    iPad supports remote wipe. If a device is lost or stolen, the administrator or device owner can issue a remote wipe command that removes all data and deactivates the device. If the device is configured with an Exchange account, the administrator can initiate a remote wipe command using the Exchange Management Console (Exchange Server 2007) or the Exchange ActiveSync Mobile Administration Web tool (Exchange Server 2003 or 2007). Users of Exchange Server 2007 can also initiate remote wipe commands directly using Outlook Web Access.

    Local wipe
    Devices can also be configured to automatically initiate a local wipe after several failed passcode attempts. This is a key deterrent against brute force attempts to gain access to the device. By default, iPad will automatically wipe the device after 10 failed passcode attempts. As with other passcode policies, the maximum number of failed attempts can be established via a configuration profile or enforced over the air via Exchange ActiveSync policies.

    iPad offers 256-bit AES encoding hardware-based encryption to protect all data on the device. Encryption is always enabled and cannot be disabled by users.


  • Medical Data Encryption Software: Tricare/SAIC Backup Tape Theft Affects 4.9 Million

    The use of medical data encryption software like AlertBoot keeps data breach notifications at bay -- like the one made public by Tricare, the US military's health plan.  According to press releases, a total of 4.9 million current and former service members were affected by this latest data breach.

    Update (04 OCT 2011): Slashdot has a thread on backup tape encryption and SAIC.  They mention a lot of stuff I've covered and more.
    Update (13 OCT 2011): Tricare is trying to position the breach as an FTC issue, and not a HIPAA issue, according to reports.
    Update (03 NOV 2011): The Tricare breach is on the HHS's "Wall of Shame" at a totally-dominating #1, with over 5 million affected.

    Backup Tape Stolen, Impermissible Action

    A SAIC (Science Applications International Corp) spokesman confirmed that a backup tape was stolen from a SAIC employee's car.  The backup tape -- which contained PII/PHI from 1992 through September 7, 2011 -- was being transferred to an off-site storage area in the San Antonio area.

    The PHI/PII included "Social Security numbers, addresses and phone numbers, and some personal health data such as clinical notes, laboratory tests and prescriptions."  Financial data was not included.

    According to media reports, the breach was reported by SAIC on September 14 and not made public until two weeks later.  The explanation given was that Tricare had to figure out the extent of the data breach, and I more than understand the apparent "delay."  (A delay of two weeks is actually quite fast; HIPAA/HITECH gives up to 60 calendar days to notify affected individuals.)

    What I don't understand, however, is the note that "according to the San Antonio Police Department report, the tapes were burglarized about 8 a.m. The incident was not reported to police until nearly 4 p.m. the following day."  Anyone care to explain to me why there was a delay there?

    Encryption Used (Here and There)

    This case is an ideal example of why some people prefer to use disk encryption instead of file encryption when it comes to protecting their computer data (despite the fact that the breached medium is a data tape).  Reports note that Tricare was making an effort to ensure patient data was protected with encryption software (my emphasis):

    "Some personal information was encrypted prior to being backed up on the tapes," the SAIC spokesman says. "However, the operating system used by the government facility to perform the backup onto the tape was not capable of encrypting data in a manner that was compliant with a particular federal standard. The government facility was seeking a compliant encryption solution that would work with the operating system when the backup tapes were taken."

    The first statement shows us the problem with file encryption.  While file encryption does a great job of protecting a file's contents, there is the logistical problem of actually having to select any and all files that require protection.  Plus, if files are encrypted using different software or using different passwords, the logistical nightmare of keeping track of everything increases geometrically.

    A better method to protect files, then, is to use disk encryption -- a solution that encrypts the storage medium itself.  This way, there is no question on whether a file was encrypted or not: with the disk encrypted, you know all files are encrypted as well.
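    The selection problem is easy to see in a short sketch (the encrypt step below is just a placeholder standing in for a real cipher from a vetted library; the point is coverage, not cryptography):

```python
# Illustrates the logistical gap of file-level encryption: only files
# someone remembered to select get protected.  Anything missed stays
# in plaintext, which is exactly how backup-style breaches happen.

from pathlib import Path
import tempfile

def file_level_protect(root: Path, selected: set[str]) -> list[str]:
    """'Encrypt' only the explicitly selected files; return what was missed."""
    missed = []
    for f in root.rglob("*"):
        if f.is_file():
            if f.name in selected:
                pass  # real code would encrypt f in place here
            else:
                missed.append(f.name)  # left unprotected -- a breach risk
    return missed

with tempfile.TemporaryDirectory() as d:
    root = Path(d)
    for name in ["patients.csv", "notes.txt", "backup_copy.csv"]:
        (root / name).write_text("sensitive")
    # The admin selected two files and forgot the stray backup copy.
    missed = file_level_protect(root, {"patients.csv", "notes.txt"})
    print(missed)  # ['backup_copy.csv']
```

    With full disk encryption the "selected" set is effectively everything on the medium, so there is nothing to forget.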

    Unfortunately, disk encryption, as the name implies, only works on disks -- external hard drives, internal HDDs, USB memory sticks, etc.  Backup tapes, due to their nature, can't be secured using disk encryption.

    That's not to say that backup tape encryption software doesn't exist.  It does; however, there is an obstacle that medical venues need to surmount.

    FIPS 140-2 and 128-bit Encryption for HITECH

    The federal standard compliance I've emphasized in the above quote probably refers to the fact that encryption used to protect PHI under HIPAA/HITECH has to be FIPS 140-2 validated.  Now, neither HITECH nor HIPAA actually requires encryption.  However, the HHS (which is the final authority on HIPAA and HITECH-amended HIPAA issues) has deferred the job of defining "encryption" (for safe harbor purposes) to the National Institute of Standards and Technology (NIST).

    Per NIST's own publications, nothing weaker than 128-bit symmetrical encryption algorithms can be used, and encryption software must be validated to NIST's own FIPS 140-2 standard.  I don't know how many backup tape encryption software vendors can claim they do so, but if Tricare's travails are any indication, it looks like it's not easy to select one (whether it's because they can't find any or because they've found too many...your guess is as good as mine).
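    The floor described above boils down to two checks.  A minimal sketch (the function name is mine, and FIPS 140-2 validation, being a property of the tested cryptographic module, is modeled as a simple boolean taken from the vendor's certificate):

```python
# Sketch of the NIST floor for HIPAA/HITECH safe harbor: the symmetric
# key must be at least 128 bits AND the module must be FIPS 140-2
# validated.  Failing either check means no safe harbor.

MIN_KEY_BITS = 128  # NIST minimum for symmetric encryption algorithms

def meets_hitech_floor(key_bits: int, fips_140_2_validated: bool) -> bool:
    """Return True only if the configuration clears both hurdles."""
    return key_bits >= MIN_KEY_BITS and fips_140_2_validated

print(meets_hitech_floor(256, True))   # True: e.g., AES-256 in a validated module
print(meets_hitech_floor(56, True))    # False: e.g., a DES key is too short
print(meets_hitech_floor(256, False))  # False: module not FIPS 140-2 validated
```

    Note the third case: a strong algorithm in an unvalidated module still fails the standard, which may be exactly the bind Tricare's facility found itself in.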

    I should point out, though, that an encryption suite that doesn't live up to NIST's expectations is better than not using anything at all.  Hardcore security experts might disagree, but my opinion is that, until you can find the right solution, any temporary solution is better than nothing.  Between breach odds of 0.5 (encryption that's not up to snuff) and 1 (no encryption at all), I'll take the 0.5.


  • Texas Data Breach Law Amended To Include All US Residents

    It looks like it's time to make an update to my Texas data breach law post from a couple of years ago.  According to many sources, the Texas legislature amended Business and Commerce Code Section 521.053.  This section originally required the notification of data breaches to Texas residents (and safe harbor was extended with the use of drive encryption such as AlertBoot).  Now, due to the amendment, it applies to breaches affecting anyone in the US, possibly the world.

    H.B. 300 Makes Amendments

    The amendment in question can be found in HB 300.  According to a copy stored at the Texas legislature site, the words "resident of this state" are purposefully struck out and replaced by the word "individual."

    Among other things, this change means that companies operating from one of the four US states without data breach notification laws (Alabama, Kentucky, New Mexico, and South Dakota) could be in breach of this Texas law.  As attorney Mark Rasch notes:

    ...if you "conduct business" in Texas, under a new Texas law, not only must you notify Texas residents (if any) that their data has been breached, but you have to notify residents in states that have no breach disclosure laws—or face the wrath of the Lone Star state.

    This means that Texas law would apply to the relationship between a retailer in Tuscaloosa and a consumer in Birmingham, AL, a retailer in Louisville and a consumer in Lexington, KY, a retailer in Albuquerque and a consumer in Santa Fe, NM, or a retailer in Sioux Falls and a consumer in Rapid City, SD.

    See what he did in that last paragraph?  The stores where the breaches take place and the affected clients are all in the four states without breach notification laws, and they have no direct connection to the Lone Star State except that the stores do business in Texas in one way or another.

    In fact, Rasch goes on to point out that the new law is worded broadly enough that its effects could be global:

    Under a strict reading of this statute, if the computers at Nestlé in Vevey Switzerland are hacked and the hackers obtain personal information about residents of South Korea, Nestlé—which sells candy bars in Dallas—must notify the residents of Seoul under Texas law. It is the "conduct of business" within Texas that gives rise to the jurisdiction

    No Harm Threshold Trigger

    Coverage of the same Texas amendment notes that "resident of this state" was not the only thing that was rubbed out in the new version of the Business & Commerce Code, Section 521.053.

    Apparently, the harm threshold that existed before was also excised:

    A number of state data breach notification laws only require notification where illegal use of the breached personal information has occurred, or is reasonably likely to occur and that creates a risk of harm to a person.  However, under Texas’ law, notification is required only upon acquisition, without regard to a risk of harm.

    A review of my previous entry on the Texas notification law confirms this change.  Two years ago, I noted that Texas residents needed to be notified only "if it's reasonable to assume that clients' sensitive personal information was acquired by an unauthorized person."

    As is often pointed out, such a threshold is like putting the fox in charge of the chicken coop.  After all, wouldn't it be in the interest of the breached entity to claim that the risk is not there?  It's good to see that Texas law has caught up with the failure in logic that a harm threshold represents.

    Use of Encryption Software Still Encouraged

    Some things haven't changed, though.  There are no amendments eliminating safe harbor provided by the use of strong encryption software.  In fact, if you read the rest of the bill, it looks like there is extra emphasis placed on the use of encryption, although it relates to health data protection issues.


  • Drive Encryption Software: Correction on Fairview, North Memorial Laptop Theft

    It looks like I might have jumped the gun on yesterday's article.  I've gone back to see if I could find anything else about the Fairview and North Memorial laptop theft, and nowhere do I see a mention of the stolen computer being someone's personal laptop.

    Many of the original facts remain in place, such as disk encryption software not being used on the stolen laptop, despite company policies.

    However, now all signs point towards a company laptop being at the center of the breach.  According to one report,

    But as a result of human error, the missing laptop was not encrypted, meaning the files are potentially at risk of being accessed, Fairview said.

    The human error involved "bypassing a step before laptops are issued to an (Accretive) employee," Dahl said. "Accretive has assured us they fixed whatever process breakdown there was."


  • Medical Laptop Encryption Software: 16,800 Fairview, North Memorial Patients Affected By Laptop Theft (Updated)

    14,000 Fairview patients and 2,800 North Memorial patients in Minneapolis are being alerted that their protected health information was breached due to the actions of an employee at a subcontractor.  The laptop was not protected with drive encryption software like AlertBoot despite obligations to the contrary.

    Update (28 SEP 2011): I've just gone back to the article and have found that references to the laptop being a personal one have been deleted (the article shows an update on 10:19 AM September 28).  Furthermore, information that was not available yesterday, such as an explanation of what Accretive Health does, has been added.  I'm not sure what this means.  I think I might have jumped the gun on this one.

    Accretive Health is Blamed

    According to several sites, a laptop belonging to a subcontractor's employee was stolen during a vehicle break-in.  This incident resulted in the breach of Fairview "patient names, addresses, dates of birth, account balances, dates of services, diagnostic information, and SSNs".  North Memorial informed patients that SSNs were not included.

    Hospital officials blamed the incident on a subcontractor, Accretive Health.  It's pretty rare to hear the name of the third party that caused a medical data breach, so this makes me wonder if the subcontractor did something to anger the company (besides causing the data breach, I mean) or if this is a new strategy taken by HIPAA-covered entities that are tired of bearing the brunt of negative PR for breaches they did not cause.

    Apparently, the information from Fairview and North Memorial was supposed to be protected with encryption software, as stated in company procedures (it's not specified which company: the hospitals or Accretive Health).  The stolen laptop was not.  It seems quite reasonable that it wasn't because the device in question was a personal one.

    The question then becomes, how did this data end up on this personal computer?  Did the unnamed employee take in his personal computer to work (one of those Bring Your Own Computer to work deals)?  Or did he copy the information to a USB stick (hopefully using encryption software, although very unlikely due to the turn of events) which was later used to copy the information to the laptop?

    Employees Are the Ultimate Failure Point

    Encryption is not a panacea.  I often note this, not because encryption software can't do its job, but because a lot of people seem to think that encryption is some kind of magic bullet against all the things that can go wrong with patient data:

    "Are you protecting patient data?"
    "Oh, don't worry.  We use encryption on all of our laptops."

    Let me tell you something: if this is the response you're getting, perhaps you should be worried.  As the above case shows, there are plenty of ways that encryption won't come into play.  Even if all computers at a company are making use of cryptographic defenses, there are plenty of ways for information to slip through the safety nets:

    • Information is sent via email
    • Files are copied to external storage devices and media
    • Non-company laptops are used
    • Hackers break into your database

    A proper approach to security not only involves technological safeguards but other approaches as well, such as employee training (why and how information is secured) and the proper policies (such as penalties -- including termination of employment -- and rewards).

    You must also remember that, as long as you're employing people, the risk of a data breach will never reach zero.


  • BBC's Modern Sherlock Holmes Adaptation Teaches Data Security Lessons

    I finally got to watch the last episode of the first season of the new Sherlock Holmes series by the BBC.  What does this have to do with data encryption like AlertBoot endpoint security?  Everything.

    Episode 3, The Great Game, is a mishmash of classic Sherlock Holmes stories.  Right off the bat I recognized "The Adventure of the Bruce-Partington Plans."  Its combination with "The Naval Treaty" did not come as a surprise because, as a child, I used to get those two stories confused.  And, of course, the words "five" and "pips" instantly recalled to mind "The Five Orange Pips."

    However, what really caught my attention was the portion of the drama that was partly based on Doyle's "The Adventure of the Bruce-Partington Plans."  It's not really the story that roused my interest per se.  Instead, what awakened my brain from its pre-planned stupor (is there any other way to watch TV?) was the fact that top secret plans were saved to a USB flash disk.  That's pretty pathetic.

    Art Imitating Life?

    In the episode, Andrew West is found lying on train tracks, dead.  Supposedly, his USB flash drive, which contained the Bruce-Partington Program for some type of missile defense system, is missing!

    This already kicks my brain into overdrive.  What?  You've got this super-important defense system and a person's carrying it around in a USB memory stick?

    This is problematic.  Was the thing at least protected with strong encryption software?  (The answer appears to be "no.")

    Furthermore, there is footage that shows how Mr. West went to a bar with these super-secret missile defense system plans, got drunk, and waved the USB key around, showing it off.  Jeepers, what kind of former MI6 agent is this, dropping state secrets after a couple of pints?  Maybe that's why he's formerly of MI6....

    On the other hand, this is not the first time such things have happened.  And I'm talking about real life, not some cockamamie made-up story.

    There is this story, where a USB flash drive was lost at a pub, affecting over 26,000 people.

    Then there is this other story where 12 million residents were affected when a USB key was found in a parking lot.  The parking lot of a pub.

    How about this story, where a military laptop was left in a pub by a drunk sailor (with the rank of captain, no less)?

    Not all such stories have a sad ending.  In this story, an Army laptop is stolen at a burger joint.  Thankfully, the thing was protected with disk encryption software (like any portable devices with sensitive information ought to be).  Maybe non-pub visitors have a better grasp of data security?

    What Not to Do

    There are a number of lessons that can be gleaned from this Sherlock Holmes episode.  First off, don't go to a place of inebriation with your work, especially if said work happens to be classified top secret.

    Second, do not walk around with state secrets.  Especially stored on a USB drive.  If you must, at least ensure that the thing is encrypted.  Certainly, the likes of a Professor Moriarty would find ways to get around it (the sand-bag decryption method is always a popular way of doing an end-run on strong disk encryption).  Then again, that's why the first sentence of this paragraph recommends not walking around with state secrets.

    Third, if possible, keep the number of people who can access data low.  According to the story, West was involved with the project in a minor capacity.  If that's the case, why is he allowed to walk around with secrets in a USB drive?

    Disk Encryption Works.  But There Are Limits

    Let us assume that West's device had been protected with USB memory stick encryption.  Would it be in the interest of the government to find and retrieve the device?  Or could they just say, "Meh, it was encrypted.  Let it go"?  In other words, would Holmes and Watson get to proclaim that the game's afoot?

    The answer is yes.  Although encryption is virtually impossible to break if proper care is taken (whether it was or wasn't in West's case is another line of questioning that I won't go into for the moment), it should be noted that there are certain ways to get around such impregnable data fortresses.

    For example, threaten someone's life: your password or else.  Threaten West.  If he cares not for his own, then his fiancee's.  Most people have a breaking point.

    Then there is the sand-bagging decryption process.  That's when you repeatedly hit someone over the head with a sand bag until they can't take the abuse any more and spit out the password.  Obviously, other forms of torture would work as well.

    Considering that it was unknown how West had ended up next to the railway with his head bashed in, wouldn't it be a real fear that someone had used the sand bag method?  Just because encryption was used doesn't mean Holmes and Watson wouldn't need to work on the case.

    Astute readers might ask, "well, what was your haranguing about strong encryption all about?"

    The point is that there are many ways you can have a data breach other than Professor Moriarty and his goons coming after you.  You could, for example, lose it at a bar or a McDonald's.  Heck, your idiot of a brother-in-law-to-be could steal it from you.

    If you deal with sensitive information, make sure it's protected.  Use encryption.  If you carry around sensitive information, definitely make sure it's encrypted.
