Aneurin Bevan Health Board (ABHB), a Welsh health board, has become the first NHS organization to be fined under the Data Protection Act. Based on the number of breaches that the NHS has been reporting over the years, it's surprising that this hasn't happened sooner. For example, plenty of USB memory sticks, external hard drives, and laptop computers that were not protected with full disk encryption like AlertBoot have been lost or stolen over the years.
In the ABHB case, the cause of the breach defies belief: a misspelled patient name, and a secretary who apparently just took a stab at guessing what the name might be.
ABHB was fined £70,000 after sending a patient's health report to the wrong person. The incident took place in March of last year, which raises the question: why is ABHB being fined only now?
According to various sources, the breach's Rube Goldberg machine-like series of events began with a doctor (a consultant, in some reports) emailing a letter to a secretary for formatting. In the letter, the patient's name appeared both misspelled and spelled correctly. The letter, however, did not contain enough information for the secretary to identify the patient.
At this point, one would imagine the secretary emailing the doctor and asking him/her to identify the patient. But, no, the report was sent to another patient with a similar name.
According to guardian.co.uk,
Stephen Eckersley, the ICO's head of enforcement, said the mistake could have been prevented if the information had been checked before being sent out.
Even more worrisome,
An investigation by the ICO found neither member of staff had received training in data protection and there were inadequate checks in place within the board to ensure personal information was only sent to the correct recipient. These poor practices were also used by other clinical and secretarial staff across the organisation. [bbc.co.uk, my emphasis]
A spokesman for ABHB had this to say:
"We have 14,000 staff and have hundreds of thousands of contacts with patients each year, with systems in place to discharge these patient contacts confidentially," said the spokesman....
This was a genuine and unintended individual error, which was self-reported by the organisation to the information commissioner, because of the importance the health board places on information governance and in line with the commissioner's own guidance. [bbc.co.uk]
While I don't doubt that ABHB places a lot of emphasis on patient data security, and that it has systems in place...well, it doesn't do one much good if they're upended by something so simple as not checking on who the patient is, does it?
Consider, for example, a letter that addresses both a "Mr. Brown" and a "Mr. Browne." Are you just going to gloss over the difference in spelling? Which one is the misspelled name? The right move would be to get back to the original person who wrote the missive.
And the road to data breaches is paved with "genuine and unintended individual errors."
For example, what is the claim that is generally made when a laptop computer with sensitive data goes missing? That it was an unintended error. A one-time mistake. Won't happen again. They had systems to ensure "certain things don't happen."
But did they use medical data encryption software, which pretty much guarantees that "certain things don't happen"? No, of course not. And yet, an organization finds itself "disappointed" to be fined under the law.
While it might be a poor comparison, consider this: if you purposefully run over someone with your car, that is murder. If you run over someone without intending to kill that person, it's manslaughter. Whatever the intention may have been, both are punished because real harm was done.
I don't see why it should be any different for data breaches.
Related Articles and Sites:
http://www.ico.gov.uk/news/latest_news/2012/ico-issues-first-penalty-to-the-nhs-following-serious-data-breach-30042012.aspx
http://www.guardian.co.uk/government-computing-network/2012/apr/30/nhs-data-breach-fine-ico?newsfeed=true
http://www.bbc.co.uk/news/uk-wales-south-east-wales-17894006
According to a letter sent to clients of Desert AIDS Project (DAP), the theft of an office computer has triggered a data breach. It has not been revealed whether the computer in question was protected with drive encryption like AlertBoot. But, a "strong password" was used, so there's that.
Desert AIDS Project reported to clients and the State of California that a thief broke into DAP offices on April 12, 2012 and stole a receptionist's computer.
The computer did not contain medical details or certain personally identifying information (SSNs, driver's license numbers, credit or debit card numbers, health insurance numbers, or other account numbers). However, there was a spreadsheet that contained client names, staff names, client status (active, discharged, etc.), internal client identification numbers, and dates of birth.
The letter goes on to note that the "spreadsheet itself does not include DAP's name" but that "other documents stored on the stolen computer may reveal its connection to DAP."
Not to be sarcastic, but so does the fact that the thief took it from the office, doesn't it? I mean, it's not as if the computer was stolen from a car parked in a shopping mall garage. The connection to DAP is pretty obvious.
The use of a strong password, unfortunately, is meaningless. A strong password tends to be long, random, and is composed of upper and lower case letters, numbers, and special characters. The password ASF23$GaSDFSAfaSdfsad@TR3r23332rgERVfwfWwGwhLKu,MNwWQF/./.<ewqf would be considered to be a very strong password.
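As a rough sketch (standard back-of-the-envelope math, not any particular product's password policy), the strength of a truly random password can be quantified as its length times log2 of the alphabet size:

```python
import math

# Strength of a truly random password: bits of entropy = length * log2(alphabet)
def entropy_bits(length: int, alphabet_size: int) -> float:
    return length * math.log2(alphabet_size)

# Roughly 94 printable ASCII characters; a random 64-character password
# like the example above:
bits = entropy_bits(64, 94)
assert bits > 256  # well past the strength of a 256-bit key -- if truly random
```

Of course, as explained next, all of that entropy is beside the point if the password only guards the login screen.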
The problem is that if this password is not securing a computer protected with disk encryption, then getting around it is pretty easy. You just pop out the hard drive and connect it to another computer.
In effect, the popped-out drive becomes an external hard drive and the password never comes into play because the operating system on that disk lies dormant (whereas the active operating system is the one set up by the thief or hacker).
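To make the point concrete, here's a toy sketch (a file standing in for a "drive", not a real forensic procedure): bytes on an unencrypted disk are just bytes, and the original machine's login password never enters the picture when the disk is read elsewhere.

```python
import os
import tempfile

# Simulate an unencrypted drive as a plain file full of bytes.
fd, disk_path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as disk:
    disk.write(b"PATIENT: John Doe, DOB 1970-01-01")  # data "on the drive"

# A thief attaching this "drive" to another machine simply reads the raw
# bytes -- the stolen computer's login password is never consulted.
with open(disk_path, "rb") as f:
    stolen = f.read()

assert b"John Doe" in stolen  # plaintext recovered, no password required
os.unlink(disk_path)
```

With full disk encryption, the same raw read would return only ciphertext.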
When you're in a business where patient confidentiality is at its utmost, you must ensure that you've got more than adequate security. At the same time, you can't go crazy: DAP probably can't afford all the things an outfit like Goldman Sachs is using to protect their data.
But some options are more affordable than others while still offering strong protection, like centrally managed encryption software that uses AES-256 to guard a computer's contents.
Related Articles and Sites:
http://www.desertaidsproject.org/notification/
http://datalossdb.org/incidents/6325-receptionist-s-computer-stolen-during-office-burglary-contained-spreadsheet-with-aids-clients-names-assigned-staff-person-client-status-active-discharged-etc-internal-client-identification-numbers-and-dates-of-birth
http://oag.ca.gov/ecrime/databreach/reports/sb24-23035
The Trend Micro blog brings us news that a website (blocked for our own good) is offering software that purportedly provides encryption for Skype (Skype Encription v 2.1.exe). This seems redundant, because Skype already uses encryption (the same AES-256 that is used in AlertBoot hard drive encryption).
You can corroborate this by visiting the official Skype support page.
As it turns out, the software in question doesn't actually encrypt anything. Rather, it's a Trojan for injecting DarkComet Version 3.3, which allows hackers to take control of a computer. One thing of interest that Trend Micro noticed was that "SyRiAnHaCkErS" (Syrian Hackers) appear to be behind this latest software offering.
Why would anyone be looking for software that encrypts Skype communications? And what's the Syrian hacker connection? As Trend Micro helpfully points out, Syria's ongoing uprising (part of Arab Spring) has spilled over into cyberwarfare, as seen in this CNN article. For example, an aid worker was tricked into installing spyware:
The man chatting with Susan via Skype passed her a file. She recalled what he said to her to coax her to open it: "This makes sure that when you're talking to me, it's really me talking to you and not somebody else." She clicked on the file. "It actually didn't do anything," she said in a baffled tone. "I didn't notice any change at all." No graphics launched; no pop-up opened to announce to the user that the virus was being downloaded. The link appeared to be dead or defected, said Othman.
But something did happen. Susan's computer was infected with spyware that monitors her computer activity. What did that Trojan do? According to Symantec:
The Trojan then allows a remote attacker to perform the following actions on the compromised computer:

- Capture webcam activity
- Disable the notification setting for certain antivirus programs
- Download and execute arbitrary programs and commands
- Modify the hosts file
- Record key strokes
- Retrieve system information about the computer
- Start or end processes
- Steal passwords
- Update itself
As I mentioned before, Skype uses encryption to protect its calls. The encryption keys are generated by the computers engaged in the call, and there is no central command-and-control structure keeping track of the encryption keys. At least, this was true as far back as 2009, as can be seen in this video. In the comments section, you'll find remarks casting suspicion on that claim.
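That model -- session keys negotiated by the two endpoints, with no central server ever holding them -- can be illustrated with a textbook Diffie-Hellman exchange. To be clear, this is a simplified stand-in for illustration, not Skype's actual protocol:

```python
import secrets

# Textbook Diffie-Hellman: both endpoints derive the same session key,
# and the key itself is never transmitted or stored centrally.
# (Toy parameters for brevity: P is the Mersenne prime 2**127 - 1;
# real deployments use 2048-bit or larger groups.)
P = 2**127 - 1
G = 3

a = secrets.randbelow(P - 2) + 2   # caller's private value, never shared
b = secrets.randbelow(P - 2) + 2   # callee's private value, never shared

A = pow(G, a, P)                   # public value sent caller -> callee
B = pow(G, b, P)                   # public value sent callee -> caller

key_caller = pow(B, a, P)          # caller computes the session key
key_callee = pow(A, b, P)          # callee computes the same key

assert key_caller == key_callee    # shared secret, never sent on the wire
```

An eavesdropper sees only A and B; without one of the private values, recovering the key requires solving the discrete logarithm problem.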
Such suspicions can be countered with actions, though: India has threatened to ban Skype because the government can't monitor the calls, and Germany has complained about the same.
Unless these are some elaborate false flag misinformation exercises, it's pretty apparent that Skype's calls are secure. Indeed, it's the reason why AlertBoot managed disk encryption software uses the same AES-256 algorithm to secure information on laptops.
Related Articles and Sites:
http://blog.trendmicro.com/fake-skype-encryption-software-cloaks-darkcomet-trojan/
A doctor who works with children in the Fresno area has offered a reward for the return of her computer. It does not sound as if hard disk encryption like AlertBoot was used to protect the device.
According to kmph.com, Dr. Gloria Traje-Quitoriano's computer, full of patient information, was stolen from her husband's car. The PHI includes names, home addresses, phone numbers, dates of birth, and Social Security numbers.
The doctor is particularly concerned about "the files because they can get identity, my patients' identity." She is hoping that a $500 reward for the safe return of the computer may curtail the ramifications of the laptop theft.
Incidentally, this is why I'm assuming that the computer was not encrypted. PHI encryption pretty much ensures that thieves will not be accessing a computer. So, there wouldn't be a realistic concern about the safety and integrity of patient data, and you certainly wouldn't be offering a $500 reward -- unless your objective is to regain your hardware.
The return of the unprotected laptop computer, though, does not automatically mean that patient data is safe. Remember, if there is nothing preventing unauthorized access to the computer -- like encryption software that requires the correct password -- a thief could easily boot up the computer; copy any sensitive files to another computer (that he probably stole); and return the computer to the doctor to collect the $500.
Depending on the situation, such as whether the doctor bills Medicaid, this is most certainly a breach of HIPAA Security rules. An unencrypted laptop has no business being inside a car. If you frequently travel with a device that stores patient data, you have to encrypt.
Related Articles and Sites:
http://www.kmph.com/story/17647629/doctors-computer-stolen-patients-alerted
http://datalossdb.org/incidents/6323-laptop-stolen-from-car-contained-patients-information
Paranoia. It's a quality that grows in you -- if you didn't suffer from it already -- when you work in the information security space. But apparently it doesn't affect everyone. A survey was taken among participants at London's InfoSecurity Europe show last year. The show being what it is, it's not far-fetched to assume that the crowd is "security aware." And yet, 44% of those who admitted to carrying sensitive information on their mobile device did not encrypt that data.
There are many mobile devices out there, but in today's BYOD culture, if you're carrying sensitive data on your mobile device, chances are that you have a smartphone, such as an iPhone or an Android OS-based phone.
These devices already come with powerful encryption. All one has to do is turn it on.
Of course, statistics being what they are, it could be that this 44% represents a tiny number of people. Remember, it's 44% of those who carry sensitive data. If you're truly paranoid, you ensure you don't carry sensitive info on your mobile device.
Assuming such paranoia affects half the crowd, the above 44% would represent only 22% of the entire crowd. I guess it's not a stretch to assume that about a quarter of people who attended London's InfoSecurity Europe show were regular laypeople.
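The back-of-the-envelope arithmetic (where the 50% figure is my assumption, not survey data):

```python
# If only half the attendees carry sensitive data on a mobile device,
# the 44% of carriers who skip encryption are 22% of the whole crowd.
carriers = 0.50     # assumed share of attendees carrying sensitive data
unencrypted = 0.44  # share of carriers who don't encrypt (survey figure)
share_of_crowd = carriers * unencrypted
print(f"{share_of_crowd:.0%}")  # prints "22%"
```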
But, data breaches will bite anyone. Regardless of your level of paranoia or job description, if you carry sensitive data on your mobile device, you should encrypt your device.
Do it right now.
Related Articles and Sites:
http://www.marketwatch.com/story/more-than-40-percent-dont-encrypt-sensitive-data-on-mobile-devices-says-echoworx-2011-study-2012-04-24
Thanks to HITECH, we have been able to get a better grasp of medical data breaches: how many there are in a given year, how many people are affected, and how they occur (paper documents, laptop loss, laptop theft, etc.). All signs point to more use of data security tools like drive encryption software from AlertBoot, seeing how the loss and theft of portable digital devices account for more than half of the publicly listed breaches involving the PHI of 500 or more people.
(Breaches involving fewer than 500 people are not readily made public -- not at a granular level, anyhow. However, the Department of Health and Human Services does reveal the numbers as part of an annual report.)
One of the things that I've often wondered about, though, is how often a breach hits a given HIPAA covered entity. The answer to this question is not readily available using HHS's tools.
According to a HIMSS Analytics report (covered by ihealthbeat.org), a survey of 250 IT executives at hospitals revealed that 27% of US hospitals experienced a data breach in the past 12 months, up from 19% in the 2010 survey.

Of those who did have a data breach:

- 43% experienced one breach (31% in 2010) -- a 12-point increase
- 28% experienced two breaches (28% in 2010) -- unchanged
- 15% experienced three to nine breaches (35% in 2010) -- a 20-point decrease
- 6% experienced ten or more breaches (15% in 2010) -- a 9-point decrease

Since the surveys cover the previous 12 months, the 2010 figures reflect breaches in 2009.
Based on the above survey, the implication is that medical organizations are getting better at defending patient data. At AlertBoot, for example, we are aware that more medical organizations have been looking for disk encryption software to protect their laptops (no doubt, in part a result of the HITECH Breach Notification Rule, which provides safe harbor from sending breach notifications to patients if encryption software is used).
At the same time, the distinct impression that I got from the "HIPAA Breach Wall of Shame" was that data breaches have increased over the years, not decreased. So, what gives?
Assuming that we're not dealing with sampling errors, it's pretty apparent that hospitals that had egregious patient data security practices have finally started to get a handle on their problems. This probably accounts for the large decrease in organizations that had more than three breaches in a given year. At the same time, when you consider that not all data breaches can be weeded out, some of these same organizations probably had one incident, contributing to the increase in hospitals reporting one breach in the past twelve months.
Plus, the total number of breaches jumped from 19% to 27%, which is in line with what's being reported at the HHS site.
I get the feeling that, going forward, we'll probably see the same trend: the number of hospitals and other medical organizations reporting multiple breaches will continue to decrease as they get a better handle on securing and managing PHI in the digital age.
But data breaches are, unfortunately, unavoidable. They just are. For example, even if a hospital encrypts all of its laptops and desktops, and completely prevents the use of USB devices (by physically gluing shut the USB ports), there are other ways to lose data, such as emailing a spreadsheet to the wrong address or someone printing a document and taking it home.
So, we'll probably see an increase in organizations that report at least one or two breaches each year, even if they reach the peak of data security optimization.
Plus, the total number of breaches will continue to increase as more and more organizations join the EMR/EHR trend.
Related Articles and Sites:
http://www.ihealthbeat.org/data-points/2012/among-hospitals-that-reported-a-data-breach-last-year-how-many-breaches-did-they-face.aspx