
AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

May 2012 - Posts

  • Smartphone Security: Locating Stolen Smart Devices Hit-Or-Miss In Berkeley Police Scandal

    How many police officers does it take to find a stolen iPhone that's pinpointing its own location?  Apparently, more than ten, if a couple of articles are to be believed.  It just goes to show that a mix of different approaches to security works best, and that you'd best have a contingency plan in place in case your primary risk-attenuation solution ends up not working.  I mean, there are various ways of achieving smartphone security: why not use them all?

    Police Chief's Son Loses Phone from Gym Locker

    Ten officers after one measly phone, albeit an oft-desired one.  If you think something's wrong with this picture, it's because it is: the stolen iPhone belonged to the son of the Berkeley Police Chief.  After learning of the phone's disappearance, the chief sent the officers -- who were part of the department's drug task force -- to track it down.

    Obviously, the chief is getting a lot of heat for this ill-advised move.  Let's ignore the civics lessons this episode teaches us, though, and concentrate on the technical details. 

    The chief's son found that his iPhone was stolen from his "unlocked gym locker."  The device, however, had "Find my iPhone" activated prior to the theft.  "Find my iPhone" is an iOS app provided by Apple (it comes built-in; however, it has to be activated before the device is stolen) that allows one to pinpoint the location of a device, among other things.  The son alerted his father who in turn got police officers to look for the missing device.

    The signal showed that the phone was on the move, and perhaps this contributed towards the officers not finding the smartphone.  Officers knocked on doors to see if they could find the phone but ultimately returned empty-handed.

    It is being reported that there may have been security reasons for getting all those detectives searching for the phone, as the device "may have contained personal-information of police department employees."

    Oh, Boy, Where to Begin On This One

    There are a number of things about this story that jump out at me, some of them facepalm-inducing.

    1. Phone tracking doesn't always work.  There are numerous stories and articles of people recovering their phones after activating a smartphone's tracking software.  But there are also the ones where a phone is tracked but unrecoverable.  Phone tracking, just like any technology, is not 100% effective.  A list of potential problems:

      • The phone is turned off, or the SIM card was swapped, leaving no way to track it.
      • The phone is in a building, but phone tracking data is two-dimensional.  What are you gonna do, get warrants for 50 apartments?  Good luck with that.
      • GPS and WiFi location are not as accurate as you think.  Of the two, GPS is far more precise and accurate, and for civilian applications its precision is on the order of 20 meters (66 ft).

    2. Unlocked locker.  Really?  A phone (sorry, a mini-computer that masquerades as a voice communications device) was stolen from an unlocked gym locker?  Wow.  What is the world coming to.  Even if you have proper security deployed on your devices, always remember to keep them in secure locations.  It's about minimizing risk.

    3. Your kid's phone has police department info on it?  'Nough said.  I mean, what kind of excuse is that for getting a bunch of cops to work overtime?  The first rule of data security is to never, ever put sensitive data where it doesn't belong.
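    To put the numbers in point 1 into perspective, here's a back-of-the-envelope sketch in Python.  The 20-meter error radius is the civilian GPS figure above; the apartment footprint and floor count are purely illustrative assumptions:

    ```python
    import math

    # Assumed figures: a 20 m horizontal error radius (civilian GPS),
    # an 80 m^2 apartment footprint, and a 4-story building -- the fix
    # says nothing about altitude, so every floor is a candidate.
    error_radius_m = 20
    apartment_area_m2 = 80
    floors = 4

    uncertainty_area = math.pi * error_radius_m ** 2        # ~1,257 m^2
    units_per_floor = uncertainty_area / apartment_area_m2  # ~15.7 units
    candidate_units = units_per_floor * floors

    print(f"plausible units to search: {candidate_units:.0f}")
    ```

    With those made-up but plausible numbers, a single fix implicates roughly sixty homes, which is why "get warrants for 50 apartments" isn't a joke.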

    Smart phones and tablets can be vectors for data breaches.  You want to prevent those breaches from taking place.  Because there is no silver bullet that will work all the time, you need to have security in layers, one acting as a backup to the other.  Having a phone tracker is one solution, but also ensure that (a) you keep your devices in secure or safe places, (b) you never carry more sensitive data than you need to, and (c) encryption is turned on for your device and backed by a password.
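    The layered approach can be sketched as a simple checklist audit.  This is purely illustrative -- the layer names and the device dictionary below are my own assumptions, not any real MDM API:

    ```python
    # Each layer is an independent control; a device is only as exposed
    # as the layers it skips. Names are illustrative, not a real API.
    LAYERS = {
        "tracker_enabled": "remote tracking (e.g. Find my iPhone)",
        "secure_location": "physical security (a *locked* locker)",
        "minimal_data": "no more sensitive data than needed",
        "encrypted_with_passcode": "encryption backed by a passcode",
    }

    def missing_layers(device: dict) -> list:
        """Return a description of every layer this device lacks."""
        return [desc for key, desc in LAYERS.items() if not device.get(key)]

    # Roughly the Berkeley case: tracking was on, everything else was not.
    stolen_phone = {"tracker_enabled": True}
    for gap in missing_layers(stolen_phone):
        print("missing:", gap)
    ```

    The point of the sketch: one layer working (the tracker) still leaves three gaps, any of which can turn a theft into a breach.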

    Whatever advice we have on this site for protecting laptops goes double for phones and tablets.


  • Hospital Disk Encryption: Upper Valley Medical Center Hard Drive Missing

    It's being reported that the "hard drive from a computer" has been stolen from Upper Valley Medical Center.  Thankfully, patient information was not present on the missing device, but it's grounds for reviewing the question: why is it, again, that HIPAA-covered entities are not required to use drive encryption software like AlertBoot?

    Nicked at Night

    The site reports that a man (featured in a security cam video) is a suspect in the theft of the missing hard drive, which was "stolen shortly after 10 pm Wednesday from the patient admitting area off the main lobby at the hospital."

    Based on the context of the story, it sounds like the thief popped open a desktop computer, unplugged the internal hard disk, and made his way out of the hospital, as opposed to stealing an external hard drive.

    Now, theft of computer equipment from hospitals is nothing new.  Neither is the loss of hard drives.  But such thefts are usually the result of someone purloining something that's just lying around.  This is the first time I've actually heard of someone sauntering into a hospital, taking the time to disconnect wires, and sauntering back out with the goods in his...pockets (I'm guessing here because the video of the suspect shows him walking empty-handed.  I have a 5.5" internal drive next to me, and it's about the size of a NOOK Simple Touch).

    Was this stolen drive protected with hospital data encryption?  It's a moot point, really, because there wasn't any patient data on it (or, at least, that's what the folks at Upper Valley think.  It wouldn't be the first time that a deeper inspection of backups reveals that PHI was taken).

    But, this again makes me wonder: why is the use of encryption software not required under HIPAA guidelines?


  • Data Encryption Software: Boston Children's Hospital Laptop Stolen In Buenos Aires

    Argentina: the land of tango, "hand of God" Maradona, and, now, the location where an employee at the Boston Children's Hospital lost a laptop with information on over 2,000 people.  The laptop computer in question was not protected with laptop encryption software like AlertBoot, triggering a medical data breach.

    South American Conference

    Why was the laptop in Argentina?  It's a long way from Boston, MA.  As it turns out, an employee went to a conference in Buenos Aires and took the laptop.  So far, so good.

    But, here's the critical follow-up question: knowing that your computer is being taken out of the country, why were the contents of that laptop not protected with encryption software?  Password protection was used, according to several articles, but password protection is not really protection, as I often note.

    Well, it turns out that perhaps the data breach was unexpected -- although, who cannot claim the same for any data breach? -- because, according to reports,

    The file, which did not include financial data or Social Security numbers [but did include names, birth dates, diagnoses, and treatment information for 2,159 patients], was not saved to the hard drive but was on the laptop in an e-mail attachment when it was stolen.

    "Was not saved to the hard drive, but was on the laptop?"  Obviously, someone has a weak grasp on what's going on.  If it's on the laptop, it's saved to the hard drive, period; I'll come back to this later.  Regardless, it's this passage that leads me to believe that the breach was "unexpected," and why the laptop was not encrypted.

    Consider how the breach was triggered: a file with medical data that was sent as an attachment to an email message.  It's implied that, otherwise, there is nothing that would have triggered a breach of HIPAA and HITECH regulations.  This indicates that the laptop normally does not hold sensitive patient information.  If so, it explains why the employee was allowed to take the device out of the country.  It also explains why strong disk encryption was not used: there was no reason to.  At least, there weren't any obvious ones.

    In summary: no sensitive data = no need for data security tools = data breach surprise!

    The usual rule of thumb when it comes to data security is to analyze your data security situation, find out your data security needs, and then implement them.  So, not encrypting the laptop is, under the circumstances, understandable.  Assumptions were made, the situation was analyzed, and it didn't look like encryption was necessary.

    But, that's only when things go according to plan.  Hence, the other school of thought when it comes to data security: follow the same steps listed above, but also overcompensate when it makes sense to do so.  For example, there is a growing body of professionals who'll deploy full disk encryption on any employee laptops, regardless of what the laptop is used for or who it is used by.

    The reason?  With files shared over the network, and with ad hoc employee role changes, it's impossible to figure out who has what, when, and where.  Any assumptions you make at the onset about the contents of a device have to be thrown out the window.  What does that leave you with?  Disk encryption for the entire computer.

    Incidentally, this is why companies ought to be insisting that employees actively use device encryption on their personal smartphones and tablets.

    Was Not Saved to the Hard Drive...?

    Circling back to my criticism above: when you're announcing a data breach, anything on a laptop is saved to the hard drive.  Otherwise, you wouldn't go public with the breach because there wouldn't be a breach.

    Perhaps an attempt was being made to differentiate between the physical act of actively saving a file to a hard drive and automatically downloading an attachment, but not opening it.  However, to claim that the attachment was not saved to the hard drive of the laptop is extremely misleading.  For example, let's say the employee at the center of the Children's Hospital data breach used Microsoft Outlook to manage email -- not an unreasonable assumption.  Any emails with attachments found in Outlook's inbox are, by definition, saved to the hard drive.

    And saved to the laptop.  For goodness sake, the hard drive is where things on the laptop are saved.  If you haven't already grasped this fact, I can see why you wouldn't use proper data security.
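    The point is easy to demonstrate.  The sketch below (Python standard library only; the message contents are made up) plays both sides: it "delivers" a message with an attachment to a mail client's on-disk cache, then recovers the attachment from nothing but the file on the drive -- nobody ever clicked "save":

    ```python
    import email
    import tempfile
    from email.message import EmailMessage
    from email.policy import default
    from pathlib import Path

    # Compose a message with an attachment, the way a sender would.
    msg = EmailMessage(policy=default)
    msg["Subject"] = "patient list"
    msg.set_content("See attached.")
    payload = b"name,dob,diagnosis\r\n..."  # stand-in for the sensitive file
    msg.add_attachment(payload, maintype="application",
                       subtype="octet-stream", filename="patients.csv")

    # A mail client that "downloads" the message writes these bytes to
    # disk (an Outlook .pst/.ost, a maildir, a webmail cache...):
    cache_file = Path(tempfile.mkdtemp()) / "inbox.eml"
    cache_file.write_bytes(msg.as_bytes())

    # Whoever walks off with the drive can parse the cached bytes and
    # pull the attachment back out -- the user never "saved" it anywhere.
    cached = email.message_from_bytes(cache_file.read_bytes(), policy=default)
    recovered = [p.get_content() for p in cached.iter_attachments()]
    assert recovered == [payload]  # the attachment *is* on the hard drive
    ```

    An unopened attachment is still bytes in a file; only encryption makes those bytes useless to a thief.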


  • Laptop Encryption Software: A Strange Case Of Laptops Stolen Because Of The Data

    Whenever I run across stories of lost or stolen laptops that caused data breaches, it's quite evident that the portable devices were taken because of what they are: hardware commodities that can be sold easily and anonymously.  The advantages of having full disk encryption like AlertBoot are pretty evident under such circumstances, especially if the computers hold sensitive data.

    There are instances, though, where the use of FDE is even more welcome: when laptops are specifically targeted because of who they belong to.  Such stories are rarely found in the media.  However, their rarity doesn't mean it doesn't happen.

    South Africa Lawyers' Laptops Stolen

    According to reports, laptop computers belonging to advocates (basically, it sounds like, lawyers who appear in court) in Cape Town, South Africa, were stolen on two consecutive days over a weekend.

    One office was broken into on Saturday, in late April.  In addition to the laptop computer, memory sticks and backup DVDs were stolen.  The second office was burglarized on the following Sunday.  A laptop was stolen as well as a "toga."  From the context of the story, this toga appears to be part of an advocate's dress for court, and not what one would wear while working the floor at Caesars Palace or at a theme party.

    How can it be concluded that the laptops were stolen because of who they belonged to?  There were thirty-five advocates' offices in the office complex, and while the burglarized offices were in the same building, they were on different floors.  However, the advocates who were affected by the thefts -- MM Pienaar and De Bruyn -- were working on the same case.

    It's not mentioned whether encryption software was used to secure the data.  However, it may be a moot point: cryptographic solutions are used to prevent others from accessing one's data.  In this case, it wouldn't be preposterous to think that the aim was to prevent one from accessing one's own data.  The two advocates lost "everything they were working [on]," according to reports, although De Bruyn had backups, so not "everything" was lost.

    Backups - The Flip Side of the Coin?

    It goes without saying, but backups are an important part of your data security regimen.  Backups may not protect you from a data breach -- in fact, stories like these sometimes make me question whether backups aren't one of the leading causes of data breaches -- but they're the only thing that can guarantee the recovery of your data if your laptop's stolen or lost.


  • Information Protection: Central London Community Healthcare NHS Trust Fined £90K, Fights Back

    The UK's Information Commissioner's Office has assessed a monetary penalty of £90,000 on Central London Community Healthcare NHS Trust for erroneously sending 45 faxes with sensitive medical information.  It's one of those issues that cannot be prevented in any way, shape, or form with a technical solution: not VPN, not data encryption software, not DLP....  Guess the human as the weak link in the chain strikes again.

    Recipient Finally Contacts Trust

    The data mishap continued for three months, until the mistaken recipient of the faxes contacted the NHS and alerted them to the data breach.  In that time, 45 faxes with information on 59 different people were sent via the facsimile machine, including "their diagnoses and information about their domestic situations and resuscitation instructions," according to reports.

    The person who received the faxes had, at least, been shredding them.

    The ICO stated that,

    "The fact that this information was sent to the wrong recipient for three months without anyone noticing makes this case all the more worrying."

    and the NHS Trust was fined because

    The ICO said that the trust didn't have enough checks in place to make sure that sensitive faxes went to the right people and it wasn't training its staff adequately on data protection.

    On the face of it, this sounds ridiculous.  £90,000 because someone wasn't calling up the other side and confirming the fax's receipt?  Turns out, though, that this is exactly what the NHS was doing.  From the ICO's monetary penalty write-up:

    On or about 28 March 2011, an administrator at the Pembridge Palliative Care Unit (the "Unit") received a verbal request from St John’s Hospice (the "Hospice") to send their inpatient lists to an additional fax number to ensure that service provision was unaffected during the leave of absence of one of the out of hours doctors.  The administrator then created a template/fax coversheet listing both numbers, and printed a number of copies for use when the inpatient lists were faxed to the Hospice.   
    A fax protocol had been agreed between the Hospice and the Unit whereby the administrator would telephone the Hospice to confirm whether the inpatient lists had been received and the Unit would confirm receipt.  However, the administrator did not update the fax protocol with the second number or obtain approval from his manager.
    The administrator at the Unit then sent the inpatient lists to the second fax number in addition to the agreed fax number provided by the Hospice.  After each transmission the administrator telephoned the Hospice as agreed and on each occasion the Hospice confirmed they had received the fax.  However, unbeknown to the administrator the Hospice was only confirming receipt of the inpatient list sent to the fax number contained in the fax protocol and not the second fax number.  As a result, the administrator continued to send the inpatient lists to the second fax number.

    Does the £90,000 penalty sound fair?  Granted, the incident was on-going for three months, and chances are that it wouldn't have stopped had the erroneous recipient not called up the NHS to alert them of their mistake.  But, it seems like such a small mistake that anyone could make.

    And, yet, I'm left with an odd taste in my mouth.  Assuming the above "administrator" is the same person throughout the ICO's write-up, what does it matter that the fax protocol was never updated?  The person who added the second fax number to the coversheets is the same one who ended up faxing them.

    Would it have killed him to ask "did you get the faxes sent to both numbers?"

    Trust Challenges Penalty

    Central London Community Healthcare NHS Trust is not taking the penalty handed out by the ICO lying down:

    But despite accepting that the breach was "hugely regrettable", the trust is making a legal challenge against the ICO's penalty.

    "We deeply regret that the Information Commissioner has decided to impose a fine and so we have instructed our lawyers to commence an appeal against this," a spokesman for the trust said.

    "We consider that the commissioner has acted incorrectly as a matter of law and so we have no alternative but to bring an appeal." []

    The challenge is a rare one.  I dimly remember one previous challenge -- although I can't find a record of it, so I could be imagining things -- so this is either the first or second such challenge since the ICO gained the power to issue monetary fines to violators of the Data Protection Act.

    The challenge is also curious.  Many complain that the fines assessed on such public bodies are ridiculous because (1) they're an indirect tax on the people, possibly on those who were affected by the data breach, and (2) they don't directly affect the people in charge at the NHS, meaning they're not much of a deterrent.

    On the face of it, that makes sense.  But here we have an NHS Trust that, quite contrary to being blasé about the fine, is willing to fight it.  Makes one wonder: why?  What's the motivation?



  • Drive Encryption Software: Our Lady Of The Lake Medical Center Laptop Loss Affects 17,000

    Our Lady of the Lake Medical Center is notifying former ICU patients that a laptop with "limited health information" is missing.  It appears that the device was not protected with the likes of medical computer drive encryption like AlertBoot.

    Part of Patient Outcomes Project

    Sometime between March 16, 2012 and March 20, 2012, a laptop computer went missing from a physician's office.  An extensive search turned up nothing.

    The laptop computer stored the names, ages, dates of admission to and discharge from the ICU, and treatment outcomes for 17,130 people who visited the Intensive Care Unit between 2000 and 2008.  This information was collected as part of a "quality and patient outcomes" project.

    As such, financial information was not part of this data set, and neither were SSNs, addresses, or dates of birth.

    This is a HIPAA Security Violation

    Or is it?  Whenever medical data that isn't protected with encryption software goes missing, my knee-jerk reaction is to say that it's a HIPAA violation.  HIPAA states that medical data needs to be protected from unauthorized access, and, in the above case, it's quite apparent that this requirement was not being met.

    But, HIPAA doesn't require the use of encryption.  Indeed, it only requires that adequate security be in place to protect data.  If the Our Lady of the Lake laptop was placed in a locked environment and never taken outside this perimeter, it could be considered adequate protection.

    "Could," because there is a degree of uncertainty there: after an investigation, it might be concluded that it was, in fact, not adequate protection.  On the other hand, had this same laptop also been protected with hospital laptop encryption software and never taken outside its security perimeter, you can bet that it's not a HIPAA violation.  Take it outside of the security perimeter, and it's still not a HIPAA violation: encryption ensures patient data security.

    Thus, encryption is just about the only reason why the loss of a laptop full of medical data can go unreported to the authorities and the public in general.  Such safe harbor is reflected in the HITECH Act's Breach Notification Rule -- the HHS, charged with upholding and implementing HIPAA and HITECH, has admitted as much.

    The security afforded by encryption is far better than whatever physical protections one could procure, and yet instances like Our Lady of the Lake's crop up week after week.

