
AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

June 2008 - Posts

  • Full Disk Encryption And Internal Security Breaches (Update To May State Street Incident)

    The site pogowasright.org has a link to a letter of notification to the Maryland Attorney General from Exeter Trust Company.  According to the letter, State Street—who announced in late May that it had lost “computer equipment”—was a subcustodian to Exeter, and the trust is now revealing details on what happened.  It makes me wonder if disk encryption solutions like AlertBoot would have helped in this case.

     

    According to the notification letter, an employee of a third-party vendor hired by IBT stole a computer tower that contained four million internal e-mails with sensitive details like names, Social Security numbers, and checking account numbers.  Since one assumes these e-mails would have to be gone through one by one, it’s no wonder that it took State Street four months to figure out who to call.

     

    A computer tower, eh?  I haven’t heard that term in ten years.  If they’re referring to what I think they are, we’re talking about something two feet or more in height.  Not an easy thing to lug around.  Stealing one of those would definitely require an insider’s help.  Which goes to show that form factor is not to be relied upon for security.

     

    The question, as far as I’m concerned, is “would full disk encryption have helped in this case?”  I’m going to assume that file encryption would have been a less than ideal solution, just because of the massive number of e‑mails to be protected.  I just know someone somewhere would have forgotten to encrypt at least one e-mail.  It’s just the nature of things.

     

    Why am I even questioning the effectiveness of hard drive encryption?  Quite obviously, because an insider was involved.  Hmm… I probably should note that this employee is technically not an insider.  Yes, he was physically inside the venue where the computer was located; that much is clear.  However, whether he was working on the computer, or next to the computer—we don’t have that detail.  After all, he wasn’t an employee at State Street.

     

    And therein lies the problem.  If he was working next to the computer, full disk encryption would have protected the e-mails and any other data stored on that computer.  One assumes that the username and password for decrypting the information wouldn’t have been taped to the bottom of the keyboard, for example.  If the thieving employee had worked on the computer, then that means he had access to the data to begin with, and full disk encryption would have offered limited protection.  I say “limited” because encryption solutions like AlertBoot—which use a centrally‑managed console to control the encryption of computers and access by employees—would have given State Street a chance to disable access to the encrypted drives after the theft.  Potentially, this could have prevented the thief from getting to the data, if conditions were right.
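
    To make the “disable access after the theft” idea concrete, here is a minimal sketch of how a centrally managed setup can work in general.  This is not AlertBoot’s actual mechanism or API; the device IDs, statuses, and function name are all hypothetical.

    ```python
    # Hypothetical sketch: a pre-boot client releases the disk's decryption key
    # only if the central console still lists the device as active.  A device
    # flagged as stolen is refused even when the thief has valid credentials.
    DEVICE_STATUS = {
        "FILE-SERVER-042": "reported_stolen",   # flagged by the admin after the theft (made-up ID)
        "OFFICE-LAPTOP-007": "active",
    }

    def release_decryption_key(device_id: str, credentials_ok: bool) -> bool:
        """Release the key only if the user authenticated and the device is still active."""
        if not credentials_ok:
            return False
        return DEVICE_STATUS.get(device_id, "unknown") == "active"

    print(release_decryption_key("FILE-SERVER-042", credentials_ok=True))    # False
    print(release_decryption_key("OFFICE-LAPTOP-007", credentials_ok=True))  # True
    ```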

     

    What’s really tragic about this story is that State Street caught the perp and managed to recover most of the equipment that was stolen.  But not the server that contained the sensitive data.

     
  • Hard Drive Encryption Missing From Seven Stolen UK Hospital Laptops. A Defiant Act?

    The Telegraph is reporting that two breaches involving NHS data have occurred in the UK.  Based on the details provided by the article, it looks like one was a case of egregious behavior (labeled as “defiant” by the writer), and the other less egregious—but just as non-compliant as the first one.  These wouldn’t have been problems—in a pragmatic sense as well as a compliance/legal sense—if full disk encryption solutions like AlertBoot had been used.

     

    The first case involves the loss of a laptop from a car.  Amazed that this is still happening.  Not thefts from a car; that’s going to happen until the second coming of Jeebus, although by that time it will be more like breaking into a rocket or something.  Nope, amazed that people are still taking unencrypted laptops and leaving them in their cars.  Not good news for the 11,000 patients whose information got stolen.  Apparently, the compromised data involves names, addresses, NHS numbers, and personal medical histories.

     

    In the second case, six laptops were stolen from locked file cabinets at St. George’s Hospital.  The laptops were a temporary measure, according to an internal e-mail that was sent to St. George’s staff.  I ask, a temporary measure for what?  Did they have to replace six desktop computers at the same time?  (Because that would be weird.  The article makes it sound as if they were all in the same office.)  If so, and this is the more important question, would those desktop computers have been safe from the thieves?  After all, they showed the determination to break into locked drawers and file cabinets.  My guess is that desktop computers wouldn’t have been safe either.

     

    Those following this blog will recall that I’ve often preferred people using laptop computers, and locking them up at the end of the day, over desktop computers that are left in plain sight.  I’d like to emphasize that I’ve also noted that the lock‑up has to be done in some seriously secure containers…like a safe.  Or a safe‑like file cabinet.  You know, those designed not to be broken into?  A simple file cabinet will not do (which makes me wonder about the security of plain, but sensitive, paper documents).  On the other hand, the fact that hospital staff were locking them up does show that they were being security conscious.

     

    If they really knew their stuff, though, they would have encrypted the laptops—which is required: “Department of Health rules say.... Mobile storage devices including laptops must be fully encrypted.”  Those are the rules.  And, they’re stricter than what HIPAA requires.  As far as I know, under HIPAA laptop encryption is an option, not a requirement.

     

    Of course, hard drive encryption wouldn’t have prevented the thefts.  The point is to protect the data if (some would argue “when”) the computers get stolen.  Nothing gets the job done like disk encryption when it comes to hard drives, be they in computers or external.

     
  • Full Disk Encryption For The State Of Kansas Government Computers

    The audit arm of the Kansas state government has released a report on the disposal of surplus computer equipment, and whether Kansas is effectively dealing with any sensitive data contained on those computers.  The conclusion is that they’re not.  Certain practices have been recommended to the state for ensuring that sensitive data is disposed of correctly in the future.  It seems to me that they could have included full disk encryption among those recommendations.

     

    The government’s process for getting rid of surplus computers seems straightforward: they send the machines to the aptly‑named Surplus Property agency (probably a government entity as well).  This agency, I assume, arranges for the sale of equipment to the general public.  However, it seems that the people forwarding equipment to the agency assumed Surplus Property was in charge of erasing the data as well.

     

    Of course, this doesn’t make sense when you think about it.  Certain branches of the government don’t—and shouldn’t—have access to the data of another.  I’d assume that employees of the Surplus Property agency have no business accessing data on government staff payroll details, for example.  Clearly, deleting data is the responsibility of each respective government agency, no matter how sensitive (or not) the data happens to be.

     

    The report also shows that poor data deletion practices contributed to the problem.  For example, people are under the impression that deleting files and emptying the trash (icon), reformatting the hard drive, or reinstalling the OS will erase data.  As the report correctly points out, these practices do not.

     

    What the layman does not realize is that data remains on a disk until it is overwritten.  The only way to get rid of data is to replace it with other data.  This is why one of the recommended practices listed by the auditors is overwriting data.  And because a single pass is not always trusted to remove the original data, the usual practice is to overwrite it at least three times.  The other way of ensuring data breaches don’t happen is by physically destroying the disk or degaussing (demagnetizing) it, a process that ensures people cannot access the data.  It’s different from replacing/rewriting the data, but it does its job of preventing information from leaking.
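
    As a rough illustration of the overwrite approach, here is a minimal sketch that rewrites a single file with random bytes several times before deleting it.  It is a toy, not a certified wiping tool: it ignores SSD wear‑leveling, filesystem journaling, and bad‑sector remapping, and the file path is made up.

    ```python
    import os

    def overwrite_file(path: str, passes: int = 3) -> None:
        """Overwrite a file's contents with random data `passes` times, then delete it."""
        size = os.path.getsize(path)
        with open(path, "r+b") as f:
            for _ in range(passes):
                f.seek(0)
                f.write(os.urandom(size))   # replace the old bytes with random ones
                f.flush()
                os.fsync(f.fileno())        # force each pass to disk before the next
        os.remove(path)                     # remove the directory entry last

    # overwrite_file("surplus_pc/payroll_export.csv")  # hypothetical file on a retired machine
    ```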

     

    However, there is one more way of ensuring that data doesn’t leak: full disk encryption.  Products like AlertBoot full disk encryption are already designed to protect data and prevent data breaches, so they certainly have a place when it comes to data security for computers being retired.  And let’s not forget that hard drive encryption is employed at the beginning of the computer life cycle, which also covers the perpetual possibility of computers with sensitive data getting stolen along the way.  By using disk encryption, Kansas would be able to ensure data security during and after a computer’s useful life.

     
  • Hard Drive Encryption A Quest For Quest Diagnostics

    Quest Diagnostics has filed a report with the Maryland Attorney General that three MD residents may have had their personal information compromised due to the theft of a laptop that featured password protection.  The personal information breached for the three people includes names, addresses, and Social Security numbers.  I assume that full disk encryption like AlertBoot was not used in this instance.

     

    Quest Diagnostics.  The name is not unfamiliar to me.  There was one down the street from where I lived in Cambridge, right next to La Luna (good coffee, pastries).  I never knew what they did, though, because the only thing I had to go on was their logo, a half sun and…I don’t know what.  I figured that the word “diagnostics,” the slightly techy‑looking logo, and the proximity to MIT meant that they were probably in the electro‑silico‑mechanics business.  (I also thought they might be a software company because the name reminded me of King’s Quest.  That’s how old I am.)

     

    If they had been in that business, Quest would probably have realized that password protecting the contents of a laptop would have meant bupkis.  However, Quest is not in the computer scene.  They’re in the health diagnostics scene—think blood samples, DNA testing, STDs, etc.—meaning that…they still should know that password protection means bupkis.
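
    For the record, the reason an OS login password alone “means bupkis” is that it only guards the operating system, not the bytes on the drive: pull the drive (or image it) and unencrypted data is readable directly.  Here is a minimal sketch of that idea; the disk image name is made up and the pattern is a simple Social Security number shape, so treat it as an illustration rather than a forensics tool.

    ```python
    import re

    SSN_PATTERN = re.compile(rb"\b\d{3}-\d{2}-\d{4}\b")   # strings shaped like Social Security numbers

    def scan_raw_image(image_path: str, chunk_size: int = 1 << 20):
        """Scan an unencrypted disk image for SSN-looking strings, chunk by chunk."""
        hits = []
        with open(image_path, "rb") as img:
            while chunk := img.read(chunk_size):
                # Matches spanning chunk boundaries are ignored for brevity.
                hits.extend(m.group().decode() for m in SSN_PATTERN.finditer(chunk))
        return hits

    # print(scan_raw_image("stolen_laptop_drive.img"))  # hypothetical image of the stolen drive
    ```

    With full disk encryption in place, the same scan would turn up nothing but ciphertext.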

     

    I mean, I’d imagine this company would have to comply with HIPAA regulations.  Sure, HIPAA is not the easiest thing to go through, but it seems that their directives pretty much come down to this: lock everything; bolt down everything; don’t let outsiders see any of the patient data—even if it’s glancing at your screen in passing; don’t let anyone on the inside see the data either, unless they must; constantly keep looking over your shoulder to make sure no one’s looking over your shoulder.  Oh, and throw data encryption into the mix if you think you can’t pull it off.

     

    I’ve always wondered, personally, why encryption was not made a requirement.  It certainly works wonders in the event something gets stolen.  Plus, it’s easier to comply with than constantly looking over your shoulder—literally as well as figuratively.  I guess it’s not a requirement because you have those rare instances where disk encryption or file encryption is not an option.  However, wouldn’t patients be better served by making data encryption a requirement? And allowing exclusions in the event the base technology cannot support it?

     

    If the rash of stolen laptops with sensitive data means anything, I’d imagine so.  Of course, there are a lot of people who claim that laptops shouldn’t be used to begin with when it comes to sensitive data.  I disagree.  If you have the right type of protection, there’s no reason why you shouldn’t use a laptop for storing sensitive data.  In fact, I’d be more willing to trust a laptop with full disk encryption using RSA than a mainframe being guarded by a rent‑a‑cop.

     
  • Data Breach Drama: Lost Tapes, Full Disk Encryption, And The University Of Utah

    Looks like I jumped the gun on the happy ending for the University of Utah Hospitals and Clinics.  If readers recollect, Perpetual Storage Inc. lost backup tapes that contained information on 2.2 million patients treated at the U of U.  Word is out on the street that the U of U may be sued; it’s just a matter of time.  On the other hand, the company that actually lost the tapes has already been served.  I’ve already blogged how AlertBoot encryption solutions could have been helpful in this situation. (Comments left behind at Computerworld show that encryption is not always simple for backup tapes—that is, encryption may not be an option, period.)

     

    This certainly is surprising.  Initially, I figured  a lawsuit would be filed, what with the security vendor’s employee picking up the tapes in a private car, and leaving it parked outside his home overnight (with the tapes in the car).  I wondered if the hospital would also be sued.  However, that thought vanished when I read that the information on the tapes had been encrypted.  If the university hospital had encrypted the data, technically those records are safe, regardless of their physical location.  Perpetual Storage, on the other hand, couldn’t be taken off the hook.

     

    Now, however, I keep reading that the university has declined to answer whether the data on the tapes was encrypted or not—and there’s a looming lawsuit—so I’m wondering if the data being encrypted was just a rumor.  As far as I know, The Daily Utah Chronicle—where I learned of the tapes’ encrypted status—hasn’t published a retraction on the encryption issue.

     

    What’s really interesting about this case is that it has caused a lot of inflamed rhetoric out in comment‑land.  In any forum where comments can be left behind, people are debating the issue (look up the news at the Salt Lake Tribune or at Computerworld, for example).  I guess when lawyers decide to go after a not‑so‑deep‑pocketed hospital (some have called it a charity hospital—and others haven’t been so kind), and where a data security breach occurred because of an outside vendor’s unprofessionalism, words will be exchanged.

     

    Here’s the thing.  I’m not a lawyer, but as far as I know, the owners of the data are the custodians of the data—and hence, are responsible for the data’s safety.  It doesn’t matter that someone else caused a blunder.  This is why when the Gap’s third‑party vendor lost a laptop with information on 800,000 job applicants last year, it was the Gap that sent out the press release and had to take it up the rear.  I don’t think the name of the vendor was ever released.  The Gap had made it a condition that the vendor use solutions like full disk encryption to protect the data, but the vendor didn’t live up to the terms of the agreement.  And yet, the Gap had to take responsibility.

     

    Likewise, it seems that the university will have to bear responsibility for the vendor’s screw up.  But, is there a case here?  I guess we’ll find out sooner or later.  If the data on those tapes were encrypted, then the lawsuit against the university will be meritless.

     
  • Full Disk Encryption Better Than Data Breach Notification? A Lawyer, I Think, Implies Yes

    And I agree.  Encryption solutions and other data protection solutions are an infinitely better way of protecting consumer data than letting people know they should be on the look out for financial fraud.  However, I do find issues with the points raised by Bart Lazar in a column he wrote for computerworld.com.  Granted, I’m not a lawyer, and I think he is, so I’m at a disadvantage when it comes to such matters.  However, there are certain statements that I find confusing at best.  I’m not sure if it’s a matter of editing or what, but the gist of Lazar’s argument seems to be:

     “There is no credible evidence that demonstrates that the supposed benefit to consumers outweighs the administrative burden and expense caused to companies.” [There is a second part to the quote, which follows below. -S.L.]

    As I understand it, this is most definitely true.  My guess is it will remain true.  My beef with this statement, though, can be summed up as “so what?”

     

    As a consumer, I want to know that I’m at risk because Company X fell victim to a data breach.  Not everyone may share the concern, but I feel that way.  After all, if the incident does result in criminals pulling off fraud using my information, it’s not Company X that gets dinged—it’s me.  I have to rectify my credit history.  I have to deal with faceless people on the phone, eating up my time.  I have to deal with credit companies demanding money on “my” purchases.  In other words, there are a lot of hassles that I have to deal with.  I’d like to know that I should be on the lookout to prevent such hassles; if I can’t prevent them, at least I know what to expect.

     

    Company X has administrative burdens and extra expenses because they had a data breach that occurred due to Company X not having the proper controls?

     

    So what?  Cry me a river.

     

    How would you even calculate the break‑even point where benefits to consumers do outweigh the burden to a company? Well, the quote above really means this:

     

    (Notification is a YES) if (Benefit to consumers) > (Burden to companies)

     

    That in turn means this:

     

    (Notification is a NO) if (Benefit to consumers) < (Burden to companies)

     

    The thing is, “Benefit to consumers” will always remain an unknown quantity, so the above is senseless rumination.

     

    [More importantly, the spirit of the statement implies that Company X can keep quiet about a breach while potentially thousands or hundreds of thousands of people suffer.  Let’s say that it costs Company X $20 million to get in touch with all affected: 20 million people.  However, Company X projects (God knows how) that only 10,000 people will benefit from the alerts, at $1000 per person, for a total of $10 million.  Benefits don’t outweigh company costs—no breach report.  However, the 10,000 people have a real problem on their hands, and they have no idea of how or when it happened.

     

    I know (well, technically, I assume) this is not what Mr. Lazar was trying to say, but this is how it sounds to me…]
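
    To put that hypothetical in plain arithmetic, here is a toy version of the “notify only if benefit outweighs burden” rule, using the made‑up numbers from the example above (all figures hypothetical):

    ```python
    NOTIFICATION_COST = 20_000_000        # cost of contacting all 20 million affected people
    HELPED_PEOPLE = 10_000                # projected to actually benefit from the alert
    BENEFIT_PER_PERSON = 1_000

    consumer_benefit = HELPED_PEOPLE * BENEFIT_PER_PERSON   # $10,000,000
    notify = consumer_benefit > NOTIFICATION_COST

    print(notify)  # False: under this rule, the 10,000 people with a real problem never hear about it
    ```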

     

    To his credit, Lazar is not suggesting that companies do nothing.  The quote above is immediately followed by this:

     

    “Because the alleged benefits are illusory, a company's time and resources would be better spent on proactive efforts to prevent data breaches.”

     

    Ah, yes.  Now it makes sense.  Let’s be pragmatic.  Efficient.  Be proactive.  The thing is, a company that decides to do something—anything—regarding data breaches after experiencing a data breach is not being proactive.  It’s being reactive.

    Overall, I think that perhaps Mr. Lazar is confusing why the states have those laws in place.  Despite the fact that data security solutions, like AlertBoot full disk encryption, exist and have existed for a long time, a lot of companies do not have them in place, nor are they interested in using them.  The reasons are myriad: cost; no time; no problems will happen here; what we have is good enough; employees don’t like it; etc.  In short, not too many people are being proactive about data security.  The data breach laws are there to prod companies to be proactive about it.  (Yeah.  It sounds oxymoronic.  If you have to prod, it’s not being proactive, right?)

    Companies do not want to publicize bad news, and people know it’s easier to prevent incidents that could result in bad news than to clean up the mess once it’s happened.  California et al. don’t have the laws in place as a direct solution to identity theft and data breaches, whatever the wording on the bills may happen to be.  The laws are a roundabout way to encourage companies to employ and implement data security solutions and practices.  After all, if there were no legal requirement to announce data breaches, why would companies feel compelled to protect sensitive consumer data?  Consumers wouldn’t know who’s responsible in the event of a data breach anyway, which means companies wouldn’t have to fear reprisals.

    The only way Lazar’s argument above would work is if the US passed a law making it mandatory to encrypt all data.  Call it a hunch, but I don’t think that’s going to happen.

     