
AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.


AlertBoot Endpoint Security


April 2014 - Posts

  • Business Associates And HIPAA: Boston Medical Center Cuts Ties With Transcription Service Provider

    Boston Medical Center has cut ties with a vendor, a transcription company that it had been doing business with for 10 years, after the latter had a data breach, according to The Boston Globe.  Over 15,000 patients were affected when their information was posted to a website operated and used by MDF Transcription Services (and its subcontractors).

    The records in question were not secured with a password, which admittedly sounds bad; but with details of any kind being sketchy, it's hard to gauge how problematic this actually was.  (For example, were these records posted on MDF's public website for anyone to see?  If so, the lack of a password is the least of the worries.)

    Whatever the details, the breach must have been egregious enough to deserve the termination of a business arrangement that spanned a decade.

    Purely Medical Information Involved

    The Globe article notes that the "records contained patients' names, addresses, and medical information, including what drugs they were taking, but did not include Social Security numbers or financial information."  Furthermore, there's no evidence that unauthorized people looked at these "exposed" personal records (I assume that MDF had some kind of log that kept track of who accessed which files and when).

    Of course, the fact that SSNs, credit card numbers, and other information that is routinely stolen and used for fraudulent ends was not breached is something of a relief.  However, there are legitimate reasons why purely medical information should not be breached, and HIPAA rightly requires that sensitive medical information be properly protected from unauthorized access.

    Still, does it really warrant severing ties with a company that must have been doing things right?  After all, a 10-year-old relationship points towards MDF having done a good job, at least when it comes to transcribing information (or whatever it is that MDF did for Boston Medical Center).

    The severity of the outcome surprises me.

    HIPAA Business Associates Should Be Wary

    When we are contacted by a company that identifies itself as a HIPAA business associate, they generally are concerned with what the Office for Civil Rights (HHS, OCR) could possibly do to them (fines, sanctions, etc.) as well as the negative public relations impact of having a data breach associated with their business.

    Based on the Boston Medical Center case, it looks like BAs have even more concrete reasons for complying with the HIPAA Security Rule.  Losing a sizable and prestigious client is one of those things that no business wants to face.

    Related Articles and Sites:
  • HIPAA Laptop Encryption: Second Coordinated Health Data Breach In 30 Days

    Why is PHI encryption recommended by the HHS, Office for Civil Rights, HIPAA experts, and just people in general?  It's because encryption software can act as a safety net for unforeseen data breaches, as the following story shows.

    Coordinated Health, a network of hospitals that has seventeen locations all over Pennsylvania, has announced a second data breach in one month.  In the first instance, which was announced towards the end of March 2014, they were victims of an office burglary.  A free pass could be given to Coordinated, though, seeing how "someone pried open a cabinet" to steal money and patient information (although the latter has to make you think that perhaps petty cash was not at the root of the illegal caper).

    However, this second PHI breach won't elicit such sympathy, seeing how an unencrypted laptop computer was stolen from an employee's car, affecting over 700 people.  With so many documented cases of laptops (or any object of value, really) having been stolen from cars, it's a wonder that we're still reading of such data breach vectors.

    Email Attachment Cause of HIPAA Breach

    Actually, perhaps I've misspoken about showing sympathy to Coordinated Health.  If you read the explanation of what occurred, you'll see that the cause of the HIPAA breach is ultimately tied to "an email message with an attached file of 733 patient files."

    Assuming there was no other information that would violate HIPAA Security Rules, it makes sense that one wouldn't find HIPAA compliant disk encryption software on the laptop: the computer in question was not supposed to hold PHI and so most encryption solutions would have been unnecessary.  Perhaps the use of VPN would have been warranted if the laptop was serving as an endpoint for connecting to a central server, but the lack of PHI on the device itself means that HIPAA risks were significantly lowered, if not non-existent.  And, at the end of the day, that's what HIPAA is looking for: lowering risks to a manageable level.  It certainly does not require 100% protection of sensitive data.  (It would be impossible to reach the 100% mark, to be honest).

    On the other hand, covered entities face, and have always faced, problems when it comes to controlling employee actions.  Computer usage policies and data security policies are drafted to delineate what is, and what is not, allowed, but people break these policies all the time, often unknowingly, sometimes purposefully.  Knowing this, does it really come as a surprise that an employee's laptop computer contained, surprise!, PHI?

    Risk analysis is great and all, but at some point you've got to wake up and smell the coffee: maybe your risk analysis is leaving certain important parameters out.  Laptops that have even the remotest chance of storing PHI should be encrypted.

    Related Articles and Sites:


  • HIPAA Desktop Encryption: Finally A Sign On Encrypting Non-Laptop Computers

    Desktop computer encryption under HIPAA: is it really necessary?  Most people have argued over the years that the answer is "no," not only because the use of encryption software under HIPAA rules happens to be addressable (as opposed to required), but because nobody really expects desktops to go missing.

    Reasons generally given as a clarification to "nobody expects it" include that desktops are hard to steal and impossible to misplace.  I've even heard this particular piece of logic: if desktop computers were meant to be encrypted, wouldn't the Department of Health and Human Services (or its Office for Civil Rights, OCR) have brought action against the many covered entities who've had their desktops stolen over the years?  After all, the HHS "Wall of Shame" lists plenty of instances where a desktop computer is at the heart of a data breach affecting more than 500 people.

    While I haven't found such an OCR settlement that centers around a desktop computer, I've found a hint in one case that desktop encryption under HIPAA cannot be dismissed offhandedly.

    Concentra Health Resolution Agreement

    Honestly, "proof" of any sort is unnecessary.  The rules make it clear: as an addressable issue, encryption on computers and other data storage devices must be looked into.  If encryption is an inappropriate solution (for whatever reason), then an alternate form of security that is equivalent to encryption must be used.  (Clarified in this manner, it's almost disingenuous to say that encryption is not a requirement.  When encryption is already available, why would you use something that is equivalent to encryption?)

    HIPAA rules do not say anything about device size, or their portability, or their street value, or their popularity as commodities in the black market, etc.  Is encryption an appropriate solution for securing ePHI?  This is the question that needs to be answered.  Everything else just flows from there.

    But, some people are not getting the message.  It would be nice if OCR just came out and said it, but since they're not doing that yet, it's up to people to search official documents to divine what the maintainers of HIPAA "want."

    Look at the resolution agreement for the Concentra Health data breach of 2011 as an example.  You'll see under Section V, Corrective Action Obligations:

    B. Encryption Status Update Requirements
    1. Within 120 days of the Effective Date, at one year following the Effective Date, and at the conclusion of the one year period thereafter, Concentra shall provide an update to HHS regarding its encryption status, which shall include:
    a.    The percentage of all Concentra devices and equipment (laptops, desktops, medical equipment, tablets, and other storage devices) that are encrypted at that point in time.
    b.    Evidence that all new devices and equipment (laptops, desktops, medical equipment, tablets, and other storage devices) have been encrypted.
    c.    An explanation for the percentage of devices and equipment that are not encrypted.
    d.    A breakdown of the percentage of encrypted devices and equipment for each specific Concentra facility and worksite.

    Apparently, OCR has learned over the years that people don't read so well when things are not spelled out for them, and has decided to be more specific by adding the types of devices and equipment that one should be encrypting (again, if appropriate).

    Notice how "desktops" is in there.  People who are not even considering encryption software on desktop computers need to pause and seriously consider whether encryption on desktops really is inappropriate.

    What Exactly is "Appropriate?"

    Which brings up an excellent question: what is "appropriate" when it comes to encryption?  Obviously, an appropriate encryption solution is one that follows NIST's guidelines (and preferably is certified by the institute itself; who knows what untested weaknesses exist in a solution that is NIST compliant but not NIST certified?  The latter at least carries a seal of approval).  However, that doesn't answer when it is appropriate (or inappropriate) to use encryption.

    The use of encryption is inappropriate if there isn't a solution for the type of hardware you're using.  For example, perhaps the covered entity uses Windows 98 and none of the current encryption solutions in the market support such an old operating system.  Using encryption would be inappropriate (or impossible) in this scenario.

    There are also other instances where encryption may not be appropriate and technical specs don't come into play.  Perhaps the computer is being used in a hospital's operating room, and having a password in place is ill-advised.  There's also the fact that one generally cannot find unauthorized personnel in ORs, so perhaps the risk of a computer or other medical device being stolen is extremely low.  I'm sure you can come up with other instances where the appropriateness of encryption comes into question.

    I guess this is what you really should be considering: if you think that you can live with the consequences of having a HIPAA data breach on a particular device, and defend that position, then don't encrypt it.  For example, if you think that, in essence, saying "we don't need to encrypt desktop computers because they're harder to steal than laptop computers" is an adequate explanation to OCR, then go ahead and don't install encryption software on your desktop computers.

    Just make sure you document the reason why you decided encryption was not a good idea.

    Related Articles and Sites:


  • Smartphone Security: Thefts Doubled In 2013 And Other Stats

    According to Consumer Reports, 3.1 million smartphones were stolen in 2013, a significant increase from the 1.6 million smartphone thefts the previous year.  Despite the tremendous growth in purloined smart devices, the same report found that most people are not following even the most basic of steps to protect data on mobile devices, such as turning on smartphone encryption.

    There are a number of things that smartphone users can do to ensure that nothing more goes awry once they're the victims of a theft.

    Number of Missing Smart Phones Waaaay Up

    In addition to the 3.1 million smartphones that were stolen, an additional 1.4 million were lost, for a total of 4.5 million missing devices.  Some may attribute this growth to an increase in the number of people who are using smartphones...and they would be right.

    According to a February 2014 post, approximately 66% of US consumers own smartphones, an increase from 44% in 2011.  Some rough calculations, and assuming a linear rate of growth, indicate that the growth of smart phone users from 2013 to 2014 is around 12%.

    The increase in lost phones from 1.2 million to 1.4 million is about 17%, which is roughly in line with the growth of smartphone users.  However, the growth rate for stolen phones is nearly 94%.  Obvious conclusion: thieves are out to get ya.
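    The back-of-envelope numbers above can be reproduced in a few lines.  All the inputs are the article's own figures; the linear-interpolation estimate for 2013 ownership is my assumption, not something the report states.

```python
# Back-of-envelope growth calculations using the figures cited above.
# The linear-interpolation estimate for 2013 ownership is an assumption.

def pct_growth(old, new):
    """Percentage growth going from old to new."""
    return (new - old) / old * 100

# Smartphone ownership: 44% of US consumers in 2011, 66% in early 2014.
per_year = (66 - 44) / 3            # ~7.3 percentage points gained per year
owners_2013 = 66 - per_year         # ~58.7% estimated ownership in 2013

user_growth = pct_growth(owners_2013, 66)   # ~12.5% -> the "around 12%"
lost_growth = pct_growth(1.2, 1.4)          # ~16.7% -> tracks user growth
stolen_growth = pct_growth(1.6, 3.1)        # ~93.8% -> far outpaces it

print(round(user_growth, 1), round(lost_growth, 1), round(stolen_growth, 1))
```

    Stolen-phone growth outpacing user growth by a factor of seven or so is what supports the "thieves are out to get ya" conclusion.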

    No Security Measures: 34%

    In addition to the above stats, the Consumer Reports findings show that:
    • 34% did not use any security measures whatsoever
    • 36% used a 4-digit PIN
    • 11% used a PIN that's "longer than 4 digits, a password, or unlock pattern"
    • 7% used some other security feature, like encryption

    Seeing how mobile device encryption is useless without a password, I take it that the 7% figure also involves some form of password.  In other words, a total of 54% of smartphone users are securing their devices with the most basic of data protections for mobile devices.
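    As a quick sanity check on these percentages (the numbers are the article's own; the observation that they don't sum to 100% is mine):

```python
# Consumer Reports security-measure percentages quoted above.
no_security = 34                   # no security measures whatsoever
pin_4digit = 36                    # 4-digit PIN
longer_pin_password_pattern = 11   # longer PIN, password, or unlock pattern
other_eg_encryption = 7            # some other feature, like encryption

# The "54%" in the text is the sum of the three password-style categories.
basic_protection = pin_4digit + longer_pin_password_pattern + other_eg_encryption
print(basic_protection)  # 54

# The four categories only sum to 88%, so roughly 12% of respondents
# presumably fall outside these reported buckets.
print(no_security + basic_protection)  # 88
```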

    Unfortunately, only the 7% of users who use encryption are fully protected: if encryption is not enabled on a smartphone (or a tablet, for that matter), reading its contents only requires connecting it to a computer.

    Still, the numbers show an improvement.  When I read of a similar survey a year or so ago, I'm pretty sure the stats showed that about 25% or so were using passwords on their devices.  Now, the number has doubled, meaning that people are definitely more aware of the potential problems of not securing their mobile devices.

    As many smart phone TV ads note, these devices aren't only part of your life, they hold your life.  It's in every person's interest to be part of that group that's using encryption to protect their device's content.

    Related Articles and Sites:
  • Data Breach Cost: South Carolina Earmarking $27 Million For 2-Year-Old Hacking Incident

    South Carolina is in the news again for a data breach that occurred in 2012.  If you'll recall, that was the year when SC admitted that its tax collection department had suffered a data breach, affecting 6.4 million people.  Two years later, the effects of the data breach are still being felt.

    $20.7 Million Earmarked for Next Fiscal Year

    The one silver lining on government data breaches, if you can call it that, is that a lot of information is made public, giving us a better understanding of what an organization has to go through if it experiences a data mishap.  For example, figuring out the total cost of a data breach has always been like peering through frosted glass: some figures are revealed, but the whole picture remains hidden, either because it's really none of our business (how much a company is investing in security software) or because no one's interested anymore (what costs did a company incur in the third year after the data breach?).

    A lot of articles covering the South Carolina budget for next year are noting that at least $27 million is being earmarked for information security and credit monitoring.  Of this figure, about $6.5 million will be used for a third year of credit monitoring services.

    Of the remaining $20.7 million, $5.7 million is for a 12-person information security division and 3-person privacy office; $6.1 million is for maintaining and expanding the division's services; and $8.7 million is for upgrades to current data security.  The bulk of the money, as you can see, is being used for security operations that the state should have had before the 2012 data debacle.  Under the circumstances, it's impossible to say that, three years in, the ongoing cost of a data breach is $27 million.
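    The quoted line items can be reconciled against the headline figure.  All numbers below are the article's rounded values (in millions of dollars); the itemized sum comes to $20.5 million against the quoted "remaining $20.7 million," presumably a rounding artifact in the source.

```python
# Budget figures quoted above, in millions of dollars (rounded in the source).
credit_monitoring = 6.5  # third year of credit monitoring services

line_items = {
    "infosec division (12 people) + privacy office (3 people)": 5.7,
    "maintaining and expanding the division's services": 6.1,
    "upgrades to current data security": 8.7,
}

ongoing_security = sum(line_items.values())   # ~20.5 (quoted as "about $20.7M")
total = ongoing_security + credit_monitoring  # ~27.0 -> the "at least $27M"
print(round(ongoing_security, 1), round(total, 1))
```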

    The only ongoing cost appears to be $6 million, which is not exactly small change (and also a dubious expense; security journalist Brian Krebs describes his first-hand experience with such services.  Conclusion?  Not really worth it unless you get it for free).

    Other Goodies

    In addition to the extra money being spent to ensure that proper data security is in place, the South Carolina legislature is also making it mandatory for state agencies to implement information technology policies, as well as to shore up other weaknesses, such as not knowing whether IT security is actually being implemented.

    Of course, such things don't necessarily require state-level legislation.  It's a matter of getting the right tools for the job.  For example, the AlertBoot managed encryption solution uses a cloud-based console, making it easy to check up on endpoint deployments and installations from any web browser with an internet connection.

    Related Articles and Sites:


  • HIPAA Business Associate: Not Having A Written Agreement Is Grounds For Reporting A Data Breach

    When it comes to preventing HIPAA data breaches, one of the best ways of doing so is via the use of PHI encryption software.  However, there are so many aspects to the HIPAA Security Rule that sometimes it gets confusing.  For example, what happens if you violate one HIPAA rule while you have encryption in place?  Under most scenarios, you should be protected under the safe harbor clause.

    But the Berea College Health Services (BCHS) case shows that it may not be so simple.

    The Non Data Breach

    The site has unearthed a relatively interesting data security violation.  Berea College in Kentucky has notified patients of BCHS that they were involved in a HIPAA breach.  Apparently, a billing contractor had gotten a hold of and used BCHS patient information, as intended.  This triggered a data breach, however, because there wasn't a written business associate agreement between the two:
    Although this contractor had access to medical records, including names, addresses, dates of births, insurance numbers, social security numbers, and diagnosis and treatment information, BCHS has no reason to believe that any patient information has been misused or disclosed inappropriately. We did not have a written agreement in place because BCHS failed to request it. The contractor has advised us that patient health information was used and disclosed only for BCHS billing and for no other purpose, and we have been assured that the contractor has returned to BCHS or destroyed any patient information that she might have accessed. Nevertheless, we are obligated to notify you of this issue.
    There is no reason to believe that there was any foul play involving PHI.  Indeed, if the notification letter is to be believed, the only transgression is the lack of a formal agreement.  I also noticed that the failure to encrypt PHI data went unmentioned, leading me to believe that everything was taken care of in that area.

    Lack of Agreement Trumps Safe Harbor?

    The HHS Office for Civil Rights has made it clear over the years: encrypt your data and you're protected (although there are certain caveats.  For example, the encryption that was used must be something that NIST has approved or is likely to have approved...although that last one is never a sure thing, making the former the only sure-fire option).

    Does the situation with BCHS mean that data encryption does not provide as much safe harbor as people are led to believe?  Or perhaps BCHS was being a little too cautious?  After all, there's nothing forbidding a covered entity from issuing a letter of apology even if they don't have to.

    My own conclusion is this: at the most fundamental level, BCHS has run into one of those caveats regarding encryption and safe harbor.

    You see, even if the data was sent to the business associate in encrypted form, and was stored in an encrypted format while she was working with the data, she accessed it.  She had to if she was going to work with the information.  But without a formal agreement, she was technically an unauthorized third party and shouldn't have had access to the information.

    In other words, encryption was breached.  Encryption safe harbor is a moot point if a hacker were to somehow gain access to encrypted data.  While BCHS is not dealing with a hacker, the lack of a formal agreement means that they were operating under a similar situation.

    The moral of this story?  Make sure all your tees are crossed and eyes are dotted, literally as well as figuratively.

    Related Articles and Sites:

