AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

AlertBoot Endpoint Security

October 2014 - Posts

  • HIPAA Disk Encryption: Laptop Theft Affects 3400 In Georgia

    If an organization announces a data breach but does not reveal whether it used data security software – like AlertBoot's managed HIPAA laptop encryption [ http://www.alertboot.com/ ; HIPAA disk encryption ] solution – is there a way to tell whether it was indeed using it?  You often can, if the organization is covered by HIPAA.

    Georgia Department of Behavioral Health and Developmental Disabilities

    According to phiprivacy.net, the Georgia Department of Behavioral Health and Developmental Disabilities (DBHDD) has alerted nearly 3,400 people that their information was breached when an employee of the department lost his laptop at a conference.  Well, "lost" isn't the right word.  It was stolen.  From the employee's car.

    Directly as a result of the theft, DBHDD has sent letters to the affected patients and followed other steps mandated by HHS (my emphasis):

    Because the laptop contains information of more than 500 individuals, the Health Insurance Portability and Accountability Act (HIPAA) requires that DBHDD notify the media about the incident. We have followed the reporting procedures mandated by the U.S. Department of Health and Human Services. The media notice, letter and this website provide information on how to contact DBHDD and federally-approved companies that offer free credit reports and free fraud alerts on those credit reports.

    It's not mentioned whether encryption software [http://www.alertboot.com/disk_encryption/mobile_security_byod_mdm.aspx ; cloud managed encryption ] was used, although the notice does acknowledge that "there are security measures in place on the laptop which will wipe the data and prevent access to the PHI if an unauthorized user attempts to access the internet."

    Now, this could refer to either (a) disk encryption software [http://www.alertboot.com/disk_encryption/disk_encryption_product_tour.aspx ; disk encryption ] whose key can be erased remotely, like AlertBoot's, or (b) something that's not an encryption solution but manages to erase the information nonetheless.
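    Option (a) – crypto-erase, where the data on the drive stays put but the small encryption key is destroyed – can be illustrated with a toy sketch.  Everything below is illustration only: the XOR keystream is not a real cipher, and the values are made up.

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    """Stretch the key into a pseudorandom stream (toy construction only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    """Encrypt/decrypt by XOR against the keystream (same operation both ways)."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

disk_key = secrets.token_bytes(32)                      # 32 bytes guard the whole disk
on_disk = xor_crypt(b"patient records...", disk_key)    # only ciphertext ever hits the platter

# Remote wipe: destroy the tiny key instead of overwriting gigabytes of ciphertext.
disk_key = None
# Without the key, the ciphertext on the stolen laptop is just noise.
```

    Real products use vetted ciphers (AES-XTS and the like) and protect the key in hardware; the point of the sketch is only that erasing a 32-byte key is equivalent to erasing the entire disk.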

    If I were a betting man – and I am – I would say that the notice is referring to the latter for the following reasons.

    HIPAA, GA Data Breach Notification Law, and Encryption

    A strong indication that encryption was not used lies in the department's actions as detailed in the above quoted blurb.  Under HIPAA, the use of strong encryption provides safe harbor from all those things that HHS mandates.

    Which is not a bad deal, seeing how going public with a data breach that has a real possibility of identity theft and other crimes leads to lawsuits; appropriating funds to rectify the mess, including contacting breach victims and setting up answering services; loss of face in the community at large as well as nationwide; and an investigation by the HHS, which could lead to monetary penalties (up to $1.5 million per violation), annual security reports to the HHS for up to 20 years, and a full-blown inquiry into the policies and practices of the breached organization.  That inquiry alone can take years to complete.

    In short, if you'd used encryption, you wouldn't be going through the Breach Notification Rule's mandates.  So if you are going through them, the odds are extremely good that you didn't use encryption.

    In addition, Georgia is one of those states where the state's own data breach notification law provides safe harbor for encrypted data.  Had the DBHDD breach taken place in New York, one of the few states where there is no such protection, the department might still have had to report the breach (I'm not sure whether that law provides an out for data governed by HIPAA), which would provide an alternate reason for announcing the theft of a laptop with PHI.

    Otherwise, there's no real reason to do so: legally, you're not required to, and because the protection afforded by encryption is real (not just some theoretical concept on a professor's whiteboard), you're not doing wrong by your patients.

    Related Articles and Sites:
    http://www.phiprivacy.net/georgia-department-of-behavioral-health-and-developmental-disabilities-notifies-almost-3400-of-breach/
     
  • Managing Smartphone Encryption: Rehashing Myths

    The folks over at vice.com have commented on the latest smartphone security debacle – namely, the turning on of smartphone disk encryption by default – and the complaints from law enforcement over this decision.  Befitting the nature of the site, vice.com notes why law enforcement is wrong to raise the alarm.  And how some well-meaning people have bought into the arguments because they don't know any better.

    Encryption Wars Redux

    What's probably most frustrating to people who are opposing the government's stance is that we've all been here before.  The encryption wars of the 1990s, where the government tried to rein in the use of cryptographic tools, covered the same arguments that are being made today, and led to the logical conclusion that backdoors should be anathema to everyone – including the government.

    The government's requirement that a backdoor be installed on security solutions for law enforcement is beyond the pale because it can't be guaranteed that only the government will be able to use it.

    Think about it.  Think about all the data breaches we've seen and heard of where hackers from Russia, or some Baltic state, or China, or wherever compromised the security of banking giants (who purportedly use the latest technology in security and hire the brightest), or the security of some government agency (including the military), or even the leading tech companies like Google.

    Granted, in these cases, it wasn't really a backdoor that was manipulated – there aren't any backdoors, as far as I know – but bugs, security holes, and other weaknesses.  From a technical standpoint, however, there is no difference between these weaknesses and a backdoor, although there is a difference in terms of policy or intent: a backdoor is put there on purpose.

    In other words, a backdoor is a weakness you plant on purpose.  That's it; nothing more, nothing less.  And while the government can promise to only use it in accordance with the law, what it cannot do is promise that everyone else who finds this backdoor will stick by that promise.

    Or, as the authors at vice.com put it more eloquently:
    So the next time a law enforcement official demands that Apple and Google put backdoors back into their products, remember what they're really demanding: that everyone's security be sacrificed in order to make their jobs marginally easier. Given that decreased security is only one of several problems raised by the prospect of cryptography regulation, you should ask yourself: Is that trade worth making?

    It's like that Refrigerator Joke

    This latest fight over encryption reminds me of that observation, that a person will open the fridge, late at night, looking for something to munch on.  He (or she – but usually he) finds nothing to his liking and closes the refrigerator door.  He then comes back 5 minutes later and opens it again, eyeing the contents again, then closes the door; and then comes back again… despite the fact that nothing has changed.

    Likewise with encryption and the argument for a backdoor.  Nothing has fundamentally changed in terms of the argument against encryption (and hence the need for a backdoor), while the arguments for the use of encryption have increased dramatically.

    Related Articles and Sites:
    https://news.vice.com/article/what-default-phone-encryption-really-means-for-law-enforcement
     
  • Why You Shouldn’t Be Afraid of the Cloud

    Code Spaces lived every business’s nightmare when a hacker deleted customer data, causing irreversible damage to the company.
     
    But that doesn’t have to happen to you. Just as you install a home alarm system to prevent burglaries, you can employ security measures — such as two-factor authentication — to keep your data out of reach. You can still take advantage of the cloud’s services without sacrificing data safety.
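    As an aside, one common form of two-factor authentication is the time-based one-time password (TOTP, RFC 6238), where the server and the user's phone share a secret and each independently derives a short-lived code.  A minimal sketch using only the Python standard library – the secret below is the RFC's published test value, not anything real:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32, for_time=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time password (HMAC-SHA1)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if for_time is None else for_time) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC test secret "12345678901234567890", base32-encoded:
secret = "GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"
print(totp(secret, for_time=59))  # -> 287082 per the RFC 6238 test vectors
```

    The design point: even if a password leaks, an attacker still needs the device holding the shared secret, and each code expires after 30 seconds.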
     
    In this article, Tim Maliyil discusses why cloud-based software is still safe and describes how you can protect your company’s data from hackers.

    http://www.entrepreneur.com/article/238061

     
  • iPhone Encryption Management: What Does Apple's Encryption Policy Change Affect?

    Apple and Google recently made headlines for changing their policy on smartphone encryption, namely making device encryption the default on their products, and designing their systems so that neither company holds the encryption keys.  Purportedly, this will keep the government out of your personal life when it comes to rifling through your smartphone and tablet computer – and has been welcomed by the segment of the general populace interested in privacy issues.

    Of course, the truth is that this is nowhere close to what people think it means, as Andrew Zonenberg writes in the article "Why Apple's iPhone encryption won't stop NSA (or any other intelligence agency)".  Although not meant to be technical in nature, the article does bring up some concepts traditionally used in the digital data security field, which may require a little heavy reading on the side (0-day exploits; Alice, Bob, and Eve; UID keys; etc.).

    The gist of the article is this:  there are many ways to collect data that are associated with your smartphone.  The use of device encryption covers security issues specific to one particular risk.  Incidentally, this is why I think the government going crazy about "helping the terrorists" is over the top.

    Device Encryption: Preventing Access When Smartphone is Lost

    When evaluating what Apple's new stance means, it helps to understand what device encryption (aka, disk encryption) is and does.  Device encryption, as the name implies, encrypts the device – specifically, the data storage portion of the device.  It does not mean that files are encrypted, or that data is encrypted.  It means the disk is encrypted.  You should let that sink in for a moment.

    The implications of this are huge: if you send an email from a disk-encrypted computer, the email is not encrypted.  If there's an attachment to that email, it's not encrypted.  If you copy a file to a USB device, that file is not encrypted.  If you make a call from a smartphone that has disk encryption, the call is not encrypted.  Neither are texts, tweets, SMS messages, pictures, video clips, etc.

    None of that is encrypted; at least, none of it is encrypted by the disk encryption.  It's the device's storage disk that is encrypted with disk encryption – and thus anything placed inside of it.  A real-life analogy would be placing documents in a safe and locking it with a key.  The documents, coins, family photos, cash, jewelry, etc. do not change in any way whatsoever when placed inside a safe, and are only protected as long as they are inside the safe.  The same is true for disk encryption.  Zonenberg put it more succinctly (Eve is the "bad" person in the following):
    There is only one situation where disk encryption is potentially useful: if Alice or Bob's phone falls into Eve's hands while locked and she wishes to extract information from it. In this narrow case, disk encryption does make it substantially more difficult, or even impossible, for Eve to recover the cleartext of the encrypted data.
    Ultimately, what all of this means is that your calls, text messages, things backed up to the cloud, and other data are still well within the reach of spooks and others.

    But you're very safe from intruding eyes if someone were to steal your smartphone, or if you were to lose it -- which happens a lot.
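    The safe analogy can be made concrete with a toy model of transparent disk encryption: writes land on "disk" as ciphertext, reads decrypt on the fly, and anything copied out is plaintext again.  (Illustration only – the XOR scheme is not a real cipher, and all names are made up.)

```python
import hashlib

class ToyEncryptedDisk:
    """Toy transparent disk encryption: data at rest is ciphertext,
    data read out (to email, USB, the network) is plaintext again."""

    def __init__(self, key: bytes):
        self._key = key
        self._sectors = {}

    def _stream(self, name: str, length: int) -> bytes:
        # Derive a per-file keystream from the disk key (toy construction).
        out, i = b"", 0
        while len(out) < length:
            out += hashlib.sha256(self._key + name.encode() + bytes([i])).digest()
            i += 1
        return out[:length]

    def write(self, name: str, data: bytes) -> None:
        # Encrypt on the way in; only ciphertext is ever stored.
        self._sectors[name] = bytes(
            a ^ b for a, b in zip(data, self._stream(name, len(data))))

    def read(self, name: str) -> bytes:
        # Decrypt transparently on the way out.
        ct = self._sectors[name]
        return bytes(a ^ b for a, b in zip(ct, self._stream(name, len(ct))))

disk = ToyEncryptedDisk(key=b"unlocked-at-boot")
disk.write("notes.txt", b"PHI: patient details")

at_rest = disk._sectors["notes.txt"]   # what a thief sees on the raw stolen drive
usb_copy = disk.read("notes.txt")      # what an email attachment or USB copy contains
# at_rest is gibberish; usb_copy is the original plaintext, no longer protected.
```

    This is exactly the narrow case Zonenberg describes: the ciphertext only matters when the thief has the raw drive and not the key.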

    Related Articles and Sites:
    http://siliconexposed.blogspot.com/2014/10/why-apples-iphone-encryption-wont-stop.html
     
  • HIPAA Encryption: Cedars-Sinai Breach Bigger Than Believed Earlier

    The LA Times writes that a data breach at Cedars-Sinai Medical Center is larger than previously reported.  According to latimes.com, a stolen computer that was not protected with medical laptop encryption software contained information on more than 33,000 patients; in August, the figure reported was "more than 500," which implied a number closer to 500 than to 33,000.

    Details that were unavailable before have been revealed as well.

    Oversight

    Apart from the admission that the breach affected tens of thousands of people, it was revealed that the laptop in question was not protected with a solution like AlertBoot's managed encryption software because of an oversight:
    The laptop was password-protected, but did not have additional encryption software that would have further protected the sensitive data. The software was mistakenly not reinstalled after a change to the computer's operating system…
    This, of course, is something well within the realm of possibility.  Indeed, in an earlier blog post I had mused that an oversight was the most probable reason why encryption was not used.

    There are only two ways of catching such a problem: (1) as a user, you notice that the encryption login prompt that used to be there is not there any longer or (2) as an administrator, you're running your weekly / monthly / semi-annual / whatever audits and notice that the number of encrypted machines doesn't match up to the number of laptops out there.

    There is another way, though: "Hospital staff are in the process of confirming that all employee laptops are properly encrypted".  This brute-force method, despite the outlay of time and energy, is probably the most surefire method of ferreting out any laptops that are not properly encrypted.
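    The reconciliation in (2) is simple enough to sketch.  The device lists below are hypothetical; in practice they would come from an asset inventory and the encryption console's status report, respectively:

```python
# Hypothetical data sources: an asset inventory and an encryption-status report.
inventory = {"LT-0012", "LT-0047", "LT-0093", "LT-0104"}
encrypted = {"LT-0012", "LT-0093", "LT-0104"}

unencrypted = sorted(inventory - encrypted)   # laptops with no encryption record
untracked = sorted(encrypted - inventory)     # encrypted machines missing from inventory

for host in unencrypted:
    print(f"AUDIT: {host} has no encryption record -- investigate")
for host in untracked:
    print(f"AUDIT: {host} is encrypted but not in inventory -- update records")
```

    A simple count mismatch flags that there is a problem; the set difference tells you which machine to chase down.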

    Not Stolen for Personal Information

    As is usual in cases like these, Cedars-Sinai released the following observation:
    Cedars-Sinai said it has no indication that the stolen laptop was used to access the medical records. After the theft, the hospital blocked the laptop's access to its computer network.
    "We believe that the laptop was stolen as a piece of personal property, not for any information it contained," the hospital said.
    Let me give you a brief explanation as to why the above reassurance is problematic: if I steal a car because I'm going to sell it to a chop shop, that doesn't prevent me from rummaging through the glove compartment and trunk to see if there's anything of value.  Likewise, the fact that I stole a laptop because it's a laptop doesn't mean I'm not going to boot it up and see if I can find a secondary source of illegal profits, especially if I know that medical data is worth $10 per record on the black market.

    As the saying goes in computer security circles, encryption is not the be-all, end-all of data security.  However, it goes without saying that it's an effective solution for a lot of ills.

    Related Articles and Sites:
    http://www.latimes.com/business/la-fi-cedars-data-breach-20141002-story.html
     
  • HIPAA Encryption: American Family Care of Birmingham Laptops Stolen

    American Family Care of Birmingham, according to phiprivacy.net, has announced that a pair of laptops stolen over the summer contained sensitive patient information.  The fact that the breach has made it to the media at large suggests two things: (1) HIPAA encryption software like AlertBoot was not used on the laptops and (2) the breach involved more than 500 people.  These conclusions are based on HIPAA and HITECH requirements covering medical covered entities, which provide safe harbor for lost but encrypted data, and a requirement that the media be contacted if more than 500 people are affected.

    Laptops Stolen from Vehicle

    While many details are lacking, one thing is clear: the laptops were stolen from an employee's vehicle, which is exasperating.  While I don't keep a quantified tally of such stories, I'm under the impression that I've heard of this particular kind of HIPAA data breach at least once every quarter since 2007.  When you take into consideration that not all breaches are made public (even if they should be, according to the law), or that I'll miss a story because my news filters don't catch it, once a quarter probably doesn't even begin to approach the actual number of data breaches caused by laptop thefts from cars.

    I must rant; I can't even begin to understand why such stories still exist: laptop disk encryption is cheap and plentiful, and does a great job of safeguarding data from being accessed by unauthorized parties.  As an added incentive, HIPAA covered entities are given safe harbor from – let's admit it – pretty onerous data security and reporting duties if encryption is used.  Plus, any idiot can see that a car's trunk, passenger seat, back seat, etc. is not exactly part of a company's data security perimeter.

    There's Questionable Data Security Advice Out There

    Of course, there could be a logical explanation why American Family Care didn't use laptop encryption on these two machines: perhaps they weren't supposed to have any sensitive data on them.  Look at this part of the breach announcement (my emphasis):

    The company also stated no evidence points to the information being accessed, but it was discovered in August that the laptops might contain certain patient information…

    In other words, they're not really sure (or they don't want to admit to it).  Why are they not sure?  Who knows.  What I do know is this: rarely do people follow data and computer policies to a "T."  It's not that people are dishonest, or idiots, or lazy (although they can be).  Oftentimes it's because the policies contradict one another, or because they impede the carrying out of duties, or because they're so long and complex that they cannot be followed.

    From a security and policy standpoint, one should use the approach that realistically minimizes a particular risk (as opposed to one that works in theory).  When it comes to laptops in the workplace, contrary to certain advice out there, I think that all laptops should be encrypted.

    Why?  Because there's no way to guarantee that a laptop computer will not store sensitive data on it.  It could be downloaded from the internet, or from a USB stick, or from an email attachment.  It could be done on purpose, or by accident, or just "temporarily" parked when the unfortunate event descends.  Just encrypt the laptop and you're covered, no matter how the data ended up on the machine, or how the machine ended up in an unauthorized individual's hands.

    Of course, this doesn't mean that you just "set it and forget it" – you're not handling a rotisserie chicken here.  There are a number of security procedures, processes, and actions you have to commit to.  But in the event that things go wrong, you've still got a way out that pays extremely high dividends: protection of the stolen data; compliance with regulations; no need to alert anyone – and consequently, no frivolous lawsuits; no investigation by the regulatory body; and more.

    Related Articles and Sites:
    http://www.bizjournals.com/birmingham/morning_call/2014/09/american-family-care-alerts-customers-of-stolen.html
    http://www.phiprivacy.net/american-family-care-alerts-customers-of-stolen-laptops-containing-patient-information/
     