A laptop computer belonging to an employee of the Oklahoma Housing Finance Agency has been stolen in a house burglary. The breach affects 225,000 Oklahomans, a little over 6% of the state's population. The agency was in the process of protecting its computers with drive encryption software; however, the stolen laptop had not yet been encrypted at the time of the theft. (This article, however, pegs the number at 90,000.)
The information on the stolen laptop included names, SSNs, tax identification numbers, dates of birth, and addresses of people in the Section 8 Housing Voucher Program. In other words, people who can least afford to have fraud conducted against them.
Was the employee allowed to have all that information? The answer is "yes." The employee worked in the field--although I doubt he could have helped nearly a quarter of a million people at any given time. Without the security that laptop encryption affords mobile computing devices, it would have been smart to at least reduce the amount of data on that laptop (perhaps by downloading only the files necessary for the week).
According to newsok.com, there were two layers of passwords used to protect the data.
One of them, it seems, is the usual Windows prompt you get when you initially boot up a computer. I've already covered how these can be easily bypassed.
The other password is tied to the file itself. Unfortunately, this, too, can be bypassed, although it requires the right software: a hex editor. (I'm not sure whether I should call it "easily" bypassed, although it is pretty easy to do.) As long as the contents of the file are not encrypted, the information will show up in such a program.
A hex editor shows you everything in a file: the contents, yes, but also the rest of the stuff you normally don't see. Included in that is the password. After all, the password has to travel with the file if the file is going to grant access when the password is typed in correctly--that whole process of comparing what you type against what's stored, looking for a match.
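The comparison described above can be sketched as a toy program. This is not any real file format--just a minimal illustration, in Python, of a naive scheme that stores the password in plaintext alongside the contents, which is exactly why a hex editor (or any tool that reads the raw bytes) defeats it:

```python
import struct

def save_protected(path, password, contents):
    # Naive "protection": the header stores the password length, then the
    # password itself in plaintext, so it can be compared at open time.
    with open(path, "wb") as f:
        pw = password.encode()
        f.write(struct.pack("B", len(pw)))
        f.write(pw)
        f.write(contents.encode())

def peek_password(path):
    # What a hex editor effectively shows: the raw bytes, password included.
    with open(path, "rb") as f:
        n = struct.unpack("B", f.read(1))[0]
        return f.read(n).decode()

save_protected("report.dat", "hunter2", "SSN: 000-00-0000")
print(peek_password("report.dat"))  # -> hunter2
```

No guessing, no brute force: the "protection" travels with the file, in the clear.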
But don't take my word that password protection is a weak (almost non-existent) form of security: just look at the OHFA's actions. If password protection protected data as well as encryption does, why was the agency in the process of encrypting all of its computers?
An international survey conducted by Websense reveals that 30% of respondents think that CEOs and board members at companies where a data breach occurred should be jailed. Now, I wouldn't find this too surprising, except that it was a survey done on security professionals at the 2009 e-Crime Congress. That's got me scratching my head.
The use of firewalls, data redaction, data encryption, and other data loss prevention and information security measures can radically reduce data security breaches. However, it's also generally agreed that these measures can only do so much.
For example, how are these technologies going to prevent the "grand poobah" database administrator from copying data to a USB disk that will subsequently be sold to the competition? They can't. And if said admin is also in charge of the logs, he can erase any trace of his activities. (The trick is to have someone else in charge of the logs, but there are other issues as well: how do you differentiate the illegal activity from normal operations?)
Obviously, there's very little one can do in instances like the above. It's the classic case of who's going to police the police. I can tell you, it's not going to be the CEOs--they generally don't have the necessary skills.
And that brings me back to my head-scratching. Will jailing CEOs for data breaches really make a difference? Isn't that similar to updating firewalls after hackers get through or installing full disk encryption like AlertBoot on laptops after computers get stolen or lost? (Which is what's happening currently.)
Maybe what they meant is that CEOs should feel the pressure to really take a good look at their company's data security measures. But that can be achieved via other methods: 62% of the survey respondents opined that companies should be fined (not sure if there's any overlap with the jail-the-CEO crowd) for breaches. Make the fine big enough, and CEOs are bound to take notice.
Also, this is just a guess, but I figure the CEOs wouldn't really change their priorities even if they face the potential for jail time. The reason? Most people--CEOs included--pay scant attention to data security not because they don't have a personal stake, but because they believe it won't happen to them.
It's like jaywalkers: the threat of being run over is not enough to make them cross at designated areas, because they don't think they'll ever become roadkill.
Will jail time for CEOs get their attention? Sure. Will it prompt them to assign priority to data security over the bottom line? Doubtful.
Related Articles and Sites:
http://www.prwire.com.au/pr/12553/jail-for-data-loss-ceos-says-e-crime-congress-survey
http://security.cbronline.com/news/data_breach_ceos_should_face_jail_survey_300409
http://www.cxotoday.com/India/News/No_Mercy_for_CEOs_of_Defaulting_Companies/551-101533-909.html
Saint John Regional Hospital in Canada has announced that an outside contractor, Cook Medical, has lost a laptop computer with some (literally!) sensitive data. It is claimed that "extensive security" was used, but I beg to differ. If they had used hard drive encryption I would be willing to entertain such claims. It turns out the security in place was the use of two passwords. That's computer security in the sense that a uniformed mannequin is security.
According to dailygleaner.com and timestranscript.com, the laptop was stolen from a Cook Medical employee back in January. Saint John Regional Hospital, however, only received notice this month.
Letters were sent to the three patients whose names and birthdates were stored on the laptop. Credit monitoring was offered. The RCMP is investigating, but has not turned up anything so far.
With only three affected, it makes one wonder why that information was on that laptop. The only Cook Medical I was able to turn up is the developer of health care devices. Makes you wonder how names and birthdates of patients figure into the design of devices...
I've already pointed out that the use of encryption software like AlertBoot endpoint security systems would have indicated extensive security. It's not total and complete (what if the owner of the laptop kept the password taped to the bottom of the device?), but it's certainly better than what I'm reading here. According to timestranscript.com, Gary Foley, vice-president of professional services for Regional Health Authority B, pointed out that:
"...the laptop was equipped with two security passwords, which...made it extremely unlikely that any information on the computer could be accessed."
I'd drop the word "extremely" from that sentence. Consider the word "extreme." A "security" measure that takes less than 10 minutes to disable, with little need for technical knowledge...does this sound like a data breach would be an "extremely unlikely" scenario?
Now compare it to encryption, where there is a high need for technical knowledge in order to bypass it and, even when having it, may require no less than a century to gain access to the data. Which sounds like extensive security?
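That "no less than a century" figure is easy to sanity-check with back-of-the-envelope arithmetic. The attack rate below is an assumption on my part--a generously fast 10^12 key guesses per second against a 128-bit key:

```python
# Rough brute-force time for a 128-bit key, assuming (generously)
# a trillion key guesses per second.
keyspace = 2 ** 128
guesses_per_second = 10 ** 12
seconds_per_year = 60 * 60 * 24 * 365

years = keyspace / guesses_per_second / seconds_per_year
print(f"{years:.2e} years")  # on the order of 10^19 years
```

Even if you shave many orders of magnitude off that estimate, "a century" remains a wild understatement--which is rather the point.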
Health Minister Mike Murphy said despite efforts to improve security, some breaches are bound to occur.
"We have 19,000 employees in the Department of Health and there are going to be privacy breaches from time to time," he said. [dailygleaner.com article]
I can't argue with that. Even if the rate of breaches were a low, low 0.01% per employee per year (meaning a 99.99% chance that any given employee doesn't cause a breach--obviously, the number is not grounded in real life), with 19,000 employees you'd have almost two breaches annually.
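The arithmetic behind that "almost two breaches annually" (the 0.01% rate is, again, a made-up illustration):

```python
# Expected breaches per year: hypothetical per-employee breach rate
# multiplied by headcount.
employees = 19_000
rate = 0.0001  # 0.01% per employee per year

expected_breaches = employees * rate
print(expected_breaches)  # 1.9
```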
Plus, consider how many contractors and outside vendors the Department of Health must be working with, and the number of "employees" actually increases, even if the above hypothetical rate stays the same.
So, Mr. Murphy is right--he's being pragmatic and pointing out the obvious. (Kind of unusual when you consider he's a politician.)
On the other hand, there is a difference between being pragmatic and being a defeatist. Just because you know it's going to happen doesn't mean you can't do anything about it.
For example, you could work to further decrease the odds of a breach. Instead of relying on questionable security measures like double, triple, or quadruple passwords, why not engage the use of encryption?
Related Articles and Sites:
http://telegraphjournal.canadaeast.com/front/article/650641
A health worker at Bradford Teaching Hospitals NHS Foundation Trust has lost a USB memory stick with patient details for 2,650 surgical patients and 3,000 patients on a waiting list. There is no mention whether data encryption software like AlertBoot was used to secure the contents, although circumstances hint that it wasn't used. The missing information may include patients' names, addresses, dates of birth, hospital numbers, and medical treatment.
Needless to say, saving sensitive and confidential information onto an unsecured USB device is against NHS policies. I know this is true because I must have covered at least 10 cases of NHS breaches this year alone.
You know, I'm amazed that so many hospitals still point out the above policy. I realize the hospital has to do some damage control, and divert anger from themselves to an employee (at least, I think that's the idea). But really, is this the best excuse they can come up with?
(By the way, I suspect the NHS knows that the policy is useless--their announcement to sign up for 100,000 encrypted USB flash devices pretty much proves it. I expect "against policies" announcements to decrease in the future...)
The USB device was lost at the Leeds Metropolitan University library. Based on what I'm reading here, it almost sounds like the health worker left it plugged into a USB port:
"The Trust has worked with the university to try to locate the device and students have been identified who logged on to the computer from which it was lost. All but one of these students has been contacted but the device has not been traced."[thetelegraphandargus.co.uk. My emphasis]
Another favorite quote of mine:
"There is no ... evidence of the memory stick being stolen, or of the information being used or disclosed."
As if an identity thief sends out a clarion call before engaging in criminal acts.
There are several ways that this data breach could have been averted.

Have people follow the data security policy. Yeah, right. But some--not many--do. Well, there'll be more people following it now, since the aforementioned worker has essentially been fired.

Deny administrative staff access to sensitive medical data. I mean, why do they have access to it to begin with? I can understand names, addresses, etc. But medical treatment? As far as I can tell, the staff member was a "health worker" not because she was a nurse, but because she worked for a health trust.

Use encryption to secure data. If all else fails, using encryption software to protect sensitive data is a great way of greatly minimizing the risks of a data breach.
Of course, there's no guarantee that a data breach cannot happen because encryption is in place. For example, if the USB disk was lost while connected to a computer...well, that's problematic.
Chances are the password was provided to gain access to the disk: it's the only way to gain access to read and write to the device. And if someone other than the owner happens on that still-connected memory disk, the potential perpetrator could, say, copy off the contents of the USB disk to the computer, and copy the files back to his own personal USB disk.
(On the other hand, if someone unplugs it from the computer, the contents will remain safe, since the password needs to be supplied the next time the USB stick is connected to a computer.)
Depending on whether file encryption or disk encryption was used, a breach may be contained or not. (Click here to learn about the difference between file encryption and disk encryption.)
Related Articles and Sites:
http://www.yorkshirepost.co.uk/news/Confidential-hospital-records-lost-by.5216361.jp
Earlier this month, the Treasury Department's Inspector General reported that the IRS had made slow progress in securing its computers. There were many factors at play, among them poor management. But it seems to me that a bigger factor was the sheer size of what the IRS was trying to protect. AlertBoot's own managed encryption service could have helped, although I'm not sure it would have made a dent in the overall progress.
The problem facing the IRS is gargantuan, to say the least. They are trying to secure 98,000 computers, both desktops and laptops, in 670 facilities nationwide. Just securing 98,000 computers is a gigantic endeavor in itself, never mind having those computers dispersed all over the place.
Now, if the IRS was just trying to install hard disk encryption on their computers (one of the requirements the IRS faced), it would have been a piece of cake with encryption as a service.
Centrally-managed encryption by AlertBoot, for example, uses the Internet to distribute the encryption installers. All the enduser/government employee has to do is download it and go through the registration process to receive a username and password, a 5-minute process (not dissimilar to signing up for Gmail).
Since the end user is ultimately in charge of the installation, the (comparatively) diminutive IT staff isn't stretched beyond its duties, nor is there a situation where IT staff have to travel to 600-plus locations (or have computers sent to the IT department from 600-plus locations--a whole different ballgame).
Also, since IT staff are not actually implementing the encryption software themselves, a powerful set of security audit reports allows administrators to keep track of which computers are protected--and, more importantly, which ones are not. The latter list is used to follow up with non-compliant employees.
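A minimal sketch of the kind of compliance report described above. The machine names and statuses here are hypothetical--nothing in this article describes AlertBoot's actual reporting interface:

```python
# Hypothetical fleet status, as an audit report might summarize it:
# machine name -> encryption deployment status.
fleet = {
    "IRS-DESKTOP-0001": "encrypted",
    "IRS-LAPTOP-0042": "pending",
    "IRS-LAPTOP-0107": "encrypted",
    "IRS-DESKTOP-0311": "not started",
}

# The report that matters: machines NOT yet protected.
non_compliant = sorted(name for name, status in fleet.items()
                       if status != "encrypted")
print("Follow up with:", ", ".join(non_compliant))
```

The point is that the follow-up list is generated centrally, rather than an IT staffer physically checking 98,000 machines across 670 facilities.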
However, the IRS had other stuff to do as well.
For example, the IRS ultimately decided to implement 254 security settings. Per computer. I figure that by "settings" they're referring not only to actual settings, like screen savers that lock out a user and firewall configurations, but also to other adjustments, like the use of encryption, disabling USB ports, and myriad other modifications for better data security.
Plus, they also had to test each and every piece of software that was ever created for them--we're talking specialized software only used by the IRS.
I'm not sure how long the project was supposed to take, but I imagine that, with the above conditions, whatever deadline they set probably would not have been enough.
Related Articles and Sites:
http://fcw.com/articles/2009/04/06/web-irs-security-settings.aspx
http://www.treas.gov/tigta/auditreports/2009reports/200920055fr.htm
http://www.webcpa.com/news/31244-1.html
http://www.nextgov.com/nextgov/ng_20090406_2265.php
The Chateau Office Building in Woodland Hills, California (billed as a landmark building by the media) experienced a "brazen" theft: approximately 60 businesses at the complex were burglarized in one night. The thieves only targeted computers, and the case shows why the use of hard drive encryption software like AlertBoot is so important to secure computer data, even if other security measures are in place.

To say that the thieves were targeting data would be more accurate, as some of those affected have concluded. The evidence? Computers and files were taken from the businesses, while other electronic equipment was left in place [cbs2.com]. And then there's this:

"In one office, a pile of hard drives had been stacked in a corner, ready to be hauled out." [latimes.com]
That last one, it seems to me, is pretty conclusive. Only people targeting data would take the time to take disks out of computers.
The office building did have security, including a guard making the rounds, as well as a security camera system. The latter was disabled, and the guard was called away on some type of emergency.
It is believed the thieves had a master key that gave them access to all offices.
It looks like businesses are still trying to size up the damages, but initial accounts don't sound good. According to the latimes.com: credit card numbers for 7,000 clients were stolen (at one business alone!); tax documents of 800 clients (probably including SSNs, possibly bank account info) are gone; and an attorney's computer contained "all kinds of stuff" (names, credit cards, e-mails, etc.).
Based on the stories covering this breach, it looks like the damages will be extensive.
The latimes.com has quoted a victim who stated that password-protection was used on his computers, so "it's not going to be easy to get [to that sensitive information]." I'd disagree with this assessment.
It is common knowledge among computer and security professionals and enthusiasts that password-protection does not live up to its name. The easiest way to get around it? Hook up the hard drive with password-protection to another computer. Getting the wires is probably the hardest thing: driving 10 miles to the nearest computer store and plunking down $25.
Of course, password-protection would work in those cases where the thieves don't know much about computers (and even then, they could just google the process).
My question is this (and it's a rhetorical one): does the above story sound like the work of some guys who don't know much about computers?
At this point, I'd say the only business owners who can sleep well at night will be those who had used full disk encryption to secure the contents of their hard drives, or those who had used file encryption software to protect just the important files.
And the use of encryption software would have been especially useful for other reasons, too.
California is a state where the use of encryption gives businesses a respite from having to alert those affected by a data breach. I mean, you lose a bunch of computers, and now you have to mail out letters to 7,000 customers? And lose those customers?
Oftentimes, the use of encryption is not just about protecting your customers' data--it's about protecting your business as well.
Related Articles and Sites:
http://www.contracostatimes.com/california/ci_12233315?nclick_check=1
http://www.sfgate.com/cgi-bin/article.cgi?f=/n/a/2009/04/26/state/n183933D66.DTL
http://www.databreaches.net/?p=3291