

AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

March 2014 - Posts

  • HIPAA Security: Don't Decrypt Data Before You Destroy It?

    HIPAA experts know that there are only two ways to obtain safe harbor for PHI: encrypt it or destroy it.  Seeing how it's hard to work with destroyed data, most opt to use PHI encryption software to protect their patients' sensitive information from unauthorized access.

    However, the rules also clearly state that any data that is being thrown out must be destroyed.  This makes sense for paper-based documents and other physical manifestations of information, like x-rays.  It also makes sense for digital information, but the reasoning behind it is not so apparent for encrypted data.  After all, lost or missing data is protected by safe harbor rules if encryption software is used to protect it, indicating that the encrypted information is considered perfectly safe.  Why must it also be destroyed?

    One in a Million: So You're Telling me There's a Chance

    The movie Dumb and Dumber has a number of notable quotes and scenes, many of them terrible, but one's always held a special place in my heart: when Jim Carrey's character asks his crush what his chances are, she tells him it's like "one in a million," and Carrey replies, a small smile forming at his lips, "So you're telling me there's a chance."

    And that, in a nutshell, is why you're supposed to destroy any data you're going to throw away.  This includes encrypted PHI data because there is always the chance that (a) someone will somehow figure out the password to the encrypted data or (b) someone will run across the encryption key.  The chances of it are remote, of course.  But not impossible.

    Destroy Your Data the Right Way

    Making computerized data inaccessible is both surprisingly hard and surprisingly easy.  Anyone who's had to deal with dead hard drives knows that computer storage is sensitive to bumps, humidity, electric shocks, magnetic fields, etc.  On the other hand, just because you're unable to use your device doesn't mean that the information is inaccessible.  There are plenty of businesses built around recovering information, and they're successful (and profitable) for a reason.

    Methods for destroying digital data are myriad.  One of the more popular methods is physically destroying it.  For example, you can punch a hole through a hard drive's magnetic platters, or even better, three or more holes through them.  There's the "sledgehammer" approach to it, which doesn't require an explanation, I think.  You can also melt it – the internet is surprisingly full of raconteurs who've used thermite to do so.

    There are also non-physical methods, like degaussing the data (i.e., running storage media through a gigantic magnet) or copying junk data to it.
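    The "copying junk data" approach can be sketched in a few lines of code: overwrite the file's bytes with random data before deleting it.  Below is a minimal, hypothetical sketch (the function name and pass count are my own choices, not any standard tool's):

```python
import os

def shred_file(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes, then delete it.

    Caveat: on SSDs and journaling filesystems the overwrite may land on
    different physical cells, leaving the originals recoverable -- one
    more reason to keep the drive encrypted until the day it's destroyed.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents with random junk
            f.flush()
            os.fsync(f.fileno())       # push the junk to the physical disk
    os.remove(path)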

    Here's a tip: no matter what approach you take, destroy your storage device while it's encrypted.  Why?  Well, for starters, you can think of it as insurance.  In the event that something goes awry, you'll still have the encryption as a security backstop (which is a win-win if you are a HIPAA covered entity).

    For example, what if you outsource your data destruction and the company does a poor job?  Or what if one of their employees decides he'll pass the data on for a price, like in this story?


  • Cost of a Data Breach: MCCCD Data Breach Could Cost Up To $17.1 Million

    The Maricopa County Community College District (MCCCD) data breach has, in some respects, been one of the more controversial data breaches of 2013.  The district didn't notify people affected by the breach until seven months after they found out about the intrusion.  Furthermore, they only found out about it when the FBI contacted them.

    But there is one notable thing about their actions: they've given the public a full accounting of how the data breach costs break down, a degree of transparency I haven't seen before.

    Whole Shebang May Cost $17.1 Million

    In early March, MCCCD provided "an updated account of the costs," revealing that the information security fiasco is expected to cost $17.1 million.  The breakdown:
    • $2.25 million to Oracle to repair the computer system.
    • $2 million to Bishop Fox, a security-consulting firm that has an office in Phoenix.
    • $2.6 million to Eagle Creek, a Minnesota-based application-development company that worked with Bishop Fox to analyze and identify security problems in the system.
    • $2.7 million to Wilson Elser, a Chicago-based law firm.
    • $7 million to Kroll Advisory Services, a firm hired by Wilson Elser to send letters to the millions of people affected and offer credit monitoring and remediation.
    • $600,000 for additional, smaller contracts.

    This is $10.1 million more than the previously reported figure.  Plus, there are reports that identity theft can be traced back to the exposed data, meaning there will be legal grounds for a lawsuit.  Of course, not all two million people who were affected by the breach can make that claim.  But, when you consider that the courts have been summarily dismissing data breach lawsuits for lack of "cognizable harm," this does not bode well for MCCCD.

    Plus, remember how I mentioned the FBI had contacted MCCCD about the data breach?  The FBI had been monitoring underground personal information bazaars when they ran across the MCCCD data.  I don't know how this will play out, but anyone whose information was found by the FBI could make the claim that they face a real risk of having their IDs stolen...or that they were irrefutably stolen.

    Overall, it looks like not even MCCCD knows what the total cost of the breach will be (understandably), and the $17.1 million figure might be the current high mark until the next big revelation.

    Credit Monitoring – Only 3% Signed Up

    Another reason why the total cost cannot be determined is that MCCCD has no idea how many people will sign up for the free credit monitoring services they are providing:
    Part of the contract with Kroll Advisory Services was based on the estimated number of people who would sign up for credit monitoring. As of Feb. 21, about 80,500 people had done so — only about 3 percent of the people who received letters.

    The window for free credit monitoring ends Dec. 31, and the district will then know how many people signed up.
    Why the low sign-up rate?  Perhaps it's because people know that credit monitoring doesn't really do any good.  Or, perhaps people are procrastinating.  They do have another nine months to sign up, after all.  Or perhaps they're already signed up with another credit monitoring service from a previous data breach, and thus don't feel the need to sign up for another one.

    Regardless, three percent sounds awfully low, especially considering how the data breach first saw the light of day.  I think that we can expect sign-up rates to increase as the year progresses.

    Containing a Breach - A Losing Game

    It's been pointed out that MCCCD will have to raise tuition rates after having to go through the above ordeal, not only to deal with it but also for improvements and updates that will prevent the recurrence of similar data breaches.  In cases like these, an ounce of prevention really is worth a pound of cure.  Instead of spending $17 million to deal with the aftermath of a data breach, imagine how much it could have done for preventing such a data breach from happening in the first place.

    Indeed, when you consider some of the expenditures – approximately $10 million for credit monitoring, lawyers, and other smaller costs – it appears that preventive measures could have cost MCCCD $7 million; possibly less, seeing how there's more room for negotiation and other maneuvers when you're not scrambling to contain a problem.
  • HIPAA Laptop Encryption: Gig Harbor Psychologist Must Undergo Mental Health Evaluation Over Suspended License

    One of the most bizarre data breach stories I've heard over the years has reared its head again.  According to news reports, disgraced psychologist Dr. Sunil Kakar will have to undergo a mental health evaluation if he wants to resume practicing his trade.  If you'll recall, the entire calamity could have been easily prevented via the use of HIPAA-compliant laptop encryption.

    As part of my research into the story, I've come across a number of facts that I hadn't been privy to before.

    Suspension and Mental Health Evaluation

    When I originally covered the story, it was noted that the doctor had been suspended from practicing psychology because of the data breach.  I found it hard to believe at first, but then it was noted how over 600 patients would have to be assigned to new therapists and revisit their agonizing ordeal all over again.  In light of this, it made sense to punish this doctor, who allowed a prostitute to run away from his car with his laptop computer.  (Once more, I note how the use of PHI data encryption would have relegated this incident to a mere sexual peccadillo, which occurs quite often across the world.)

    However, it turns out that perhaps there is another reason why the doctor's license was suspended.  At the least, we know for a fact that it's one of the reasons why it continues to be suspended:
    The state said Kakar also remains suspended because he failed to take part in a required substance abuse monitoring program ordered after an April 2011 incident for which he was charged with unprofessional conduct.
    I've heard from personal friends that medical doctors can automatically lose their licenses for DUI/DWI charges; at minimum, they face severe disciplinary actions.  I've never looked into it, but if this is true, then it makes it harder to understand how Dr. Kakar even has the option of having his license reinstated.

    Especially after I found out the below.

    Laptop with PHI was Used as Temporary Payment

    According to reports, the doctor had made it easy for his lady companion to take flight with the laptop full of patient data:
    The allegations against Kakar stem from an incident that took place February 4 when the 46-year-old newly single doctor left his personal laptop with a hooker as collateral while he went to withdraw money from an ATM.

    By the time Kakar returned to his car, both the woman and his laptop were gone.
    I watch a lot of procedural TV dramas (and I've heard stories), and let me tell you, this is not something you do.  You just don't.  Plus, what is this thing about "collateral"?  This implies that she was free to take the laptop if, say, the doctor had run low on funds.

    Does this sound like the actions of someone who's concerned about his patients' privacy?

    Regulations Other than HIPAA

    There's also one thing of note in this story: HIPAA/HITECH has, as far as I can tell, had no direct bearing on this case.  Yes, it was a HIPAA breach, and the Department of Health and Human Services has posted the incident on their "Wall of Shame," seeing how more than 500 people were affected.

    However, one should note that all actions against the doctor, at least those that show up in the media, appear to have been brought forward at the state, not federal, level.  One should remember that, while HIPAA is a very important regulation to comply with, laws also exist at the state level and these are a force to be dealt with as well.
  • Email Encryption: Doing It The Right Way (And The Wrong Way)

    More and more data is moving to the cloud.  It shouldn't come as a surprise, then, that online hacking is currently the top reason for data breaches (depending on which metric you're using), followed quite closely by yesteryear's number one contender – the loss of laptops and other data storage devices that weren't safeguarded with full disk encryption or similar protection.

    There are many approaches to securing data, but protecting email may be one of the trickiest due to the nature of the medium.

    Securing Email – A Sisyphean Endeavor?

    Email powers modern business and will do so for many years to come; reports of its death, like Mark Twain's at one point, have been greatly exaggerated.  The continued use of this communication medium makes the use of information security tools imperative.

    The reason why email is hard – some say impossible – to completely secure is that, first of all, you can't go about it alone.  Correspondence requires a second party, and if that second party is not interested in securing his electronic communiques, there's not really much you can do.

    (Incidentally, this is not true for certain work or business environments.  For example, if you happen to be a contractor or subcontractor to a HIPAA covered entity – i.e., you're a HIPAA business associate – you could be asked to secure your email and other data that contains protected health information.)

    Even when focusing on things that are supposedly within your control, it's impossible to completely secure email because of the way people work with it.  Take me, for example.  I am able to access my email using a number of methods: (1) via the web, using a browser that connects to my company's email server or by downloading it to (2) my laptop, which runs Outlook; (3) my iPad, via the configuration of the Mail app; and (4) my Android smartphone and its Mail app.  Obviously, BYOD has made inroads in our company.

    The most secure way of accessing email (dubious as the claim may be) would be to work with #1 regardless of the device.  Why is this the most secure?  Because the email message always remains on the server (which we assume is properly protected by professionals who know what they're doing).  The other three methods require that email be downloaded to each respective device – which is just another way of saying that the email is being copied – and thus mean there is more than one thing to secure: one server vs. "n" objects (in my case, three).  Imagine what this would mean if all employees had the same number of devices.  With 25 employees, you'd have to secure email on 76 endpoints: the central server plus 25 x 3 devices.  That can't be easy for the guy who's managing IT security.
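    The endpoint arithmetic is worth making explicit, since it grows linearly with headcount and with the number of devices each person carries.  A one-line sketch (function name is my own):

```python
def endpoints_to_secure(employees: int, devices_per_employee: int, servers: int = 1) -> int:
    """Count every place a copy of company email can live."""
    return servers + employees * devices_per_employee

# 25 employees, each with a laptop, tablet, and smartphone, plus one mail server:
print(endpoints_to_secure(25, 3))  # 76
```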

    Thankfully, there are a number of ways to prevent email on devices from becoming a data breach trigger.

    Use Full Disk Encryption on Devices

    The easiest and most pragmatic approach to securing data on portable devices is to use smart phone encryption and laptop encryption.  Both methods use full disk encryption to secure the entirety of a device, meaning that emails as well as other data are protected from unwanted access.

    In addition, for smartphones and tablets, a mobile device management (MDM) solution allows finer, granular control over other security aspects of the device, such as setting up VPNs, password policies, Wi-Fi provisioning, remote data wipe and deletion, and other security policies.

    The same goes for laptops.  Full disk encryption (FDE) prevents access to the computer, ensuring that email and other sensitive documents remain unread by unauthorized eyes.  With a solution like AlertBoot FDE, remote data wiping is also possible (although such technology cannot be relied on by itself, because it requires the missing laptop to be online for the data erasure to work.  It's always better to use encryption with a strong password, and to treat remote deletion as a backup that builds on FDE).  Other security features, like password policies, are also available.

    Use Email Encryption on Servers

    For the actual servers holding and powering a company's email system, there are basically two approaches to securing the data: server encryption and email encryption.  When it comes to email, the latter is a better approach.

    Server encryption is essentially the use of FDE on a server, although more than likely, it's a technology known as volume encryption.  This widely used approach is not an effective way to secure emails on a server.  While it does provide some protection, the game changes if a hacker gains access to the OS, especially at the root level.  At that point, it's as if the hacker were a system administrator accessing the server; indeed, from the computer's point of view, there is no difference.

    AlertBoot Email Encryption works using a different approach.  Instead of encrypting a server, each mailbox is encrypted with its own independent key.  If a hacker (or even a rogue or nosy sys admin) were to try to access the email server, he would find a formidable obstacle when attempting to read the email.  The attempt to crack encryption would yield the fruits of one mailbox only, further limiting a data breach in the event a hacker gets lucky.
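    The key-isolation principle can be illustrated with a toy sketch.  To be clear: this is not AlertBoot's actual implementation, and the SHA-256/XOR keystream below is for illustration only, never production use (real systems use vetted ciphers such as AES-GCM).  The point it demonstrates is structural: one random key per mailbox, so compromising a key exposes exactly one mailbox.

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    """Toy SHA-256 counter-mode keystream (illustration only, NOT real crypto)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, nonce: bytes, plaintext: bytes) -> bytes:
    """XOR the plaintext against a key-specific keystream."""
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# One independent key per mailbox: cracking (or stealing) one key
# exposes exactly one mailbox, not the whole mail server.
mailboxes = {"alice": b"Q1 board minutes", "bob": b"password reset link"}
keys = {user: secrets.token_bytes(32) for user in mailboxes}
nonce = secrets.token_bytes(16)
stored = {user: encrypt(keys[user], nonce, msg) for user, msg in mailboxes.items()}
```

    A sys admin (or hacker) browsing `stored` sees only scrambled bytes; alice's key decrypts alice's mail and nothing else.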

    Combining FDE for individual endpoints with email encryption for the email servers, modern businesses can ensure that the risk of an email data breach is minimized as much as possible.
  • Disk Encryption Against Insider Attacks: Greenleaf Book Group Data Breach Notifications Sent

    Does it make sense to use computer encryption software on desktop computers?  Or laptop computers that are never taken out of locked facilities?  The answer is a resounding "yes" for both as the below story involving the Greenleaf Book Group shows.

    Janitor Steals Computers

    Within the data breach community, there is a type of data breach known as "insider attacks."  These are situations where the attack happens within an organization's supposed security perimeter, as opposed to attempts to hack into a network or company laptops being lost at the airport.

    Insider attacks are largely attributed to employees who are dissatisfied with their employer (as in this story where a Microsoft employee – now ex-employee – was charged with leaking trade secrets).  However, the label can be applied to any instances where a data breach is caused by somebody who had the right to be within the security perimeter, even if they are not a proper employee.  Janitors, for example, who are contracted to provide their services.

    In Greenleaf Book Group's case, a letter filed with the Maryland AG Office relates how the publisher became a victim of an insider data breach when a janitor stole five desktop and laptop computers.  The computers were "password protected (but unencrypted)" and had files and emails that included names, credit card information, email addresses, and (in some cases) a mailing address.

    They admit to a total of 6 Maryland residents being affected but only because the letter is meant for the MD Attorney General's Office.  Who knows how many more were affected?  More importantly, why was data encryption software not used to protect the data?

    False Sense of Security

    More often than not, companies do not properly protect their desktop computers because they're under the impression that such technological behemoths are not "portable" enough.  On a relative measure, this is true.  You can slip a smartphone into your back pocket, making it the "poster device" of device portability.  Desktop computers, in comparison, are not especially designed for portability.  Indeed, they're not designed for portability at all.

    But that doesn't mean that desktop computers are not portable.  Many of today's desktop computers are about the size, and sport the heft, of a college textbook.  Mid-towers from the late 90's are bigger but not much heavier, and I can say from personal experience that I can carry two of them easily, one under each arm.

    If you're a janitor with a cart, who knows how many you can lift before placing a smattering of balled-up sheets of paper on top to camouflage your deceitful activities?

    The risk of such a thing happening is low enough that, at a certain level, it makes sense that encryption was not deployed.  And yet, if you're storing credit card numbers (which is something of a no-no under PCI-DSS rules) on your computers, many would consider it neglectful not to have adequate protection regardless of what the risk happens to be.

    False Sense of Security II

    Let us consider this potential scenario: a janitor – who happens to be a nuclear physicist in his old country – decides he can make some cash by stealing sensitive information from improperly secured computers.  He accesses the unencrypted computers at the offices he cleans, searches for potential credit card numbers, and downloads them to a USB flashdrive that is the size of a US quarter (i.e., the 25-cent coin bearing the profile of G. Washington).

    You say, ah! but the computers were password protected!  And I say, ah! but the nuclear physicist probably knows that password-protection means bupkis because it's not encryption!  Despite the name, depending on the system that is used, bypassing password protection could be as easy as starting up the computer using a free-to-download software that is burned to a DVD disc.

    Without encryption software – which employs system access control as well as data access control – it would be hard to tell whether a desktop or laptop computer was breached or not.



  • BYOD Encryption: Personal Flashdrive Breached IRS Employees' Info, 20K Affected

    BYOD encryption is not just for ensuring smartphone security and tablet protection.  Flashdrives are devices that employees bring from home, too, and have been at the center of some remarkable data breaches.  And, it doesn't look like they'll be going away soon.

    According to a short news blurb, the IRS has experienced a little data breach that affects 20,000 people who work (or have worked) out of Pennsylvania, New Jersey, and Delaware.  The Internal Revenue Service Commissioner has admitted that employees (going back to 2007, possibly earlier) were affected when an unencrypted thumb drive was plugged into an unsecured home network.

    The article raises more questions than it answers.  For example, how did the IRS find out about the data breach?  And how was the employee behind the data breach able to carry it out?

    USB Encryption

    There are certain offices in the US where, if you look closely, you'll find the coppery profile of Abraham Lincoln superglued over computers' USB ports.  Pennies, or any other metal for that matter, are effective and cheap physical barriers that prevent the use of USB ports (and consequently the use of USB devices).  This pseudo-temporary solution became less practical as the years passed and USB cables came to power pretty much all computer accessories.

    What to do?  How can one prevent the exchange of data via the USB port while allowing its use for other purposes, such as powering a seat warmer or charging a smartphone?  The answer could very well be the use of USB encryption software.

    For example, file or folder encryption could be used to secure a USB flashdrive, ensuring that the device is only usable within the office.  Connect it to a home computer and the files show up as unformatted, unstructured data that cannot be read.
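    Why does an encrypted file "show up as unformatted, unstructured data" on a machine without the key?  Because good ciphertext is statistically indistinguishable from random noise.  A quick sketch makes the contrast visible (using random bytes as a stand-in for real ciphertext; the function name is my own):

```python
import collections
import math
import os

def entropy_bits_per_byte(data: bytes) -> float:
    """Empirical Shannon entropy; 8.0 means indistinguishable from random noise."""
    counts = collections.Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

plaintext = b"Patient: John Doe, card 4111-1111-1111-1111\n" * 100
ciphertext_like = os.urandom(len(plaintext))  # stand-in for properly encrypted output

print(round(entropy_bits_per_byte(plaintext), 2))        # low: structured, readable text
print(round(entropy_bits_per_byte(ciphertext_like), 2))  # near 8: pure-looking noise
```

    A home computer handed the second kind of data has nothing to latch onto: no file headers, no text, no structure.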


