AlertBoot Endpoint Security

AlertBoot offers a cloud-based data and mobile device security service for companies of any size that want a scalable and easy-to-deploy solution. Centrally managed through a web-based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, and USB drive and hard disk encryption managed services.

April 2012 - Posts

  • Data Security: Google WiSpy Is Much Ado About Nothing? Depends On Where You Are

    I ran across an article by Mike Elgan, "Why Google should be allowed to 'harvest' your Wi-Fi data."  The gist of the argument is that "Google didn't do anything wrong" because "Wi-Fi is radio broadcast over the public airwaves."  Well, it depends, actually.

    To summarize Elgan's article: Google, in certain instances, collected personal information such as passwords, email messages, and browser information as part of their Street View project.  All that information was broadcast into public space by the people who owned the routers.  Hence, collecting it is fair game.  After all, it's not as if Google reached into people's homes to do so, or hacked their routers: the information was just there in the public airwaves.

    Uh...You Realize Laws Differ Country to Country, Right?

    Elgan is not wrong in making this argument.  I'm not a lawyer, but I've looked into such issues, and he's right.  In the US, privacy is not guaranteed in public space.  It's why the government can monitor suspects from the roads without getting a warrant, or why you can take pictures of pretty much anything in and from public spaces, such as the sidewalk.

    (Exceptions, of course, exist: I once was stopped in Boston from taking pictures of what turned out to be a federal building, even though I was standing on public property.  Another time, I was stopped from taking pictures of a non-federal building from federal property -- not that there were any signs alerting me I was on government land).

    So far, so good.  Then, Elgan commits a mistake of such epic proportions that I have to wonder what he was thinking.  He overreaches (my emphases):

    The FCC did charge Google a pathetic £16,000 ($25,000) fine for taking too long to respond to requests for information during the investigation. But it didn't levy any fine for the actual data harvesting. Inconvenient truth: In a country ruled by law, you can't legally punish people or companies when they haven't in fact broken an actual law.

    Still, critics are coming out of the woodwork to denounce both Google and the FCC.

    "FCC's Ruling that Google's Wi-Fi Snooping is Legal Sets Horrible Precedent," said PC World's John P. Mello Jr. "Google Breaches Highlight Need for Regulation," said Jason Magder of the Montreal Gazette.

    And as they tend to do in such cases, the pandering politicians are trying to get in front of the parade....

    Other countries, including Germany, France and Australia, concluded (unlike the FCC) that Google was guilty of wrongdoing.

    Australian Minister for Communications Stephen Conroy called it the "largest privacy breach in the history across western democracies." The Australian government forced Google to publicly apologise.

    France made Google pay an £88,000 ($142,000) fine.

    The global consensus is that Google's so-called "snooping" was an invasion of privacy, accidental or otherwise.

    Unfortunately, this consensus is based on emotion and knee-jerk populism, rather than facts and reason.

    Do you see the problem?  I do.  People are allowed to have differing opinions (just because an opinion differs from yours doesn't make it a knee-jerk reaction), and laws regarding data privacy vary from country to country.  Some countries have stricter laws than others when it comes to the collection and dissemination of personal data.  So, if Elgan's argument is that Google did nothing wrong because the law is on Google's side...well, that's just in the US.  Why is he overreaching and applying US law globally?

    Take Europe, for instance.  Under the EU Privacy Directive, you have to get consent from people before you collect their data, and then only for the reasons stated.  So, if Google's StreetView project did collect passwords, emails, and other information by accident -- well, they broke the law.  (Each member of the EU is supposed to work under the EU Privacy Directive framework, but the details can vary.  So, what's considered a data breach in the UK might not necessarily be so in France.)

    Now, you could claim that passwords are not personal information.  Well, again, that depends on what the law states, right?  But it's not necessarily just a matter of how the law defines "personal data."  The law concerning personal data is very general and has a broad reach because, in Europe, privacy is paramount.  So, the law errs on the side of treating "data related to people" as personal data rather than non-personal data.

    Consider this bit that I've quoted before (my emphasis):

    Information may be recorded about the operation of a piece of machinery (say, a biscuit-making machine). If the information is recorded to monitor the efficiency of the machine, it is unlikely to be personal data (however, see 8 below).  However, if the information is recorded to monitor the productivity of the employee who operates the machine (and his annual bonus depends on achieving a certain level of productivity), the information about the operation of the machine will be personal data about the individual employee who operates it.  [section 7, Data Protection Technical Guidance - Determining what is personal data]

    The above is from a document published by the UK Information Commissioner's Office.  In the US, I doubt you'd find an official government body making the argument that biscuit-making data is personal data.  But, in the UK, it's part of its official guidelines. (Incidentally, the UK ruled that Google did commit a data breach).

    Google Didn't Do Evil

    If anyone is basing stuff on emotion and jerking knees, it appears to be Elgan, despite the veneer of logical reasoning.  I mean, Google did nothing "wrong?"  If so, why did Google admit they made a mistake?  A mistake generally involves wrongness.  Plus, is he really implying that countries like France, Germany, and England are not ruled by law?  Or that their laws are founded on emotion, knee-jerk populism, and a lack of logic?

    I guess Elgan meant that Google didn't do anything evil.  I'll agree with that.  I doubt that Google's intention was to collect personal data.  Otherwise, Google would certainly have collected more personal information than they actually did.

    But to say that Google didn't do anything wrong is like saying that I didn't do anything wrong if I run over someone's kid because I never meant to do that.  It just doesn't sound right, does it?  Ah, but, of course, we have laws that govern such instances. 

    But, if the U.S. of A. didn't, would it make it OK to run over someone by mistake?  Would you claim that there was nothing wrong with it?  I know I wouldn't.

    You probably think that collecting personal data cannot be compared to running someone over.  I'm not crazy: I agree with you.  All I'm trying to point out is that just because something isn't illegal doesn't mean it's not wrong.

    And certainly just because something is legal in the US doesn't mean it is (or should be) everywhere else in the world.

    A final note: Elgan appears to support his opinion that Google did nothing wrong by pointing out that the FCC fined them a "pathetic" $25,000 for a seemingly unrelated charge, i.e., taking too long to respond.  Given the use of "pathetic," the insinuation seems to be that if Google had done something wrong, surely the fine would have been larger?

    Two years ago, the UK's ICO only had the power to levy a maximum penalty of £5,000 for data breaches, vs. the current £500,000.  Are we to believe that a company being fined £5,000 (the maximum amount) for breaching personal data on one million people represents a less egregious offense than a company being fined £500,000 (the maximum amount after the law was updated) for breaching personal data on one million people?

    Of course not.  That'd be absurd.

    So, what is the maximum fine for being a slowpoke under FCC rules?  If it's $25,000, then isn't that a statement in and of itself?  Why does the number of zeros matter?  For the record, I don't know what the maximum amount is.  However, does it even matter?  Shouldn't the focus be on why Google was fined?


    Related Articles and Sites:
    http://www.computerworlduk.com/in-depth/mobile-wireless/3353177/why-google-should-be-allowed-harvest-your-wi-fi-data/

     
  • Drive Encryption Software: PwC Causes Under Armour Employee Data Breach

    A number of sources are reporting that Under Armour, the athletics apparel company, had a data breach of employee payroll information when a USB memory stick was sent via mail.  The device was not protected with disk encryption software like AlertBoot.

    The blame falls on PricewaterhouseCoopers (PwC), auditors to Under Armour.

    Auditing Firm Loses USB Thumb Drive

    According to daytondailynews.com and other sources, Under Armour's payroll information was lost on April 12, when a USB stick went missing in the mail.  The device was "lost in transit to a PwC facility," according to a PwC spokesperson.  The implication is that this was some kind of interoffice dispatch gone awry.

    The breached data includes names, SSNs, and salary information for an undisclosed number of employees, although it's been pointed out that "the company employs 5,400 worldwide."

    If you look at Under Armour's latest 10-K filing, you'll see that

    As of December 31, 2011, we had approximately fifty four hundred employees, including approximately twenty nine hundred in our factory house and specialty stores and eight hundred at our distribution facilities. Approximately eighteen hundred of our employees were full-time. Most of our employees are located in the United States....

    Assuming that "most" means "majority," it could mean that approximately 2,700 people were affected by this latest incident (although, who knows, really?  It could have been the audit of top management only, limited to less than 100 people).

    Irony

    I happened on this Under Armour story right after reading that PwC had reported that "8 out of 10 organizations suffered staff-related security breaches in 2011."

    According to that report, per computing.co.uk,

    The survey found that 82 per cent of large organisations had reported security breaches caused by staff, with 47 per cent reporting incidents where staff had leaked or lost confidential information...

    [The survey's author Chris] Potter argued that the report's finding indicated that security training is being neglected.

    "One of the biggest things that large organisations can do is to invest in security awareness programmes," he said.

    Ironic, no?  At the same time, it's also "inevitable."  If you're part of an organization that handles sensitive data, it can be guaranteed that at some point you'll have a data breach, small or big, just because there are so many ways things can go wrong.

    PricewaterhouseCoopers, for example, is a global company that does business in the "knowledge economy."  Their product and their raw material are data.  So, it's not surprising that they have the occasional data breach like the above.  Or this one, from two years ago.

    At the same time, it's slightly depressing that easily preventable data breaches like the one involving Under Armour still occur at companies that know better.  I mean, PwC already knows about the importance of using encryption software to secure what can only be viewed as sensitive data.  It must, seeing how their raw material for creating their product is sensitive data.  No doubt their employees know this, too.

    So, why the use of a USB disk that was not protected with encryption?  While perfect data security might not be possible, there are surefire ways of lowering the risk of a data breach.  Using an encrypted USB device is one of the easier ones, and it would have prevented this particular breach.


    Related Articles and Sites:
    http://www.daytondailynews.com/business/data-breach-hits-under-armour-1363303.html
    http://www.baltimoresun.com/news/breaking/bs-md-underarmour-20120422,0,4880808.story

     
  • Data Encryption Software: Eugene, Oregon Podiatrist's Computer Stolen

    A Eugene (Oregon) podiatrist's clinic was broken into in February of this year.  The thief made off with a number of items, including a computer with patient information.  The computer appears to not have been protected with hard drive encryption like AlertBoot, since affected patients are being contacted via the mail.

    Based on comments left at the kval.com site, one group of patients had their SSNs stolen while others didn't.

    Took a Month and a Half to Notify

    According to kval.com, someone broke into Dr. Rex Smith's practice and stole medication and a computer with patients' protected health information, including names, dates of birth, and Social Security numbers.

    Under the HITECH Act, any HIPAA-covered entity whose patients' information is breached needs to notify those affected no later than 60 calendar days after the breach is discovered.  Plus, if over 500 people are affected, the breached entity is required to provide notice to the Department of Health and Human Services and alert the media.

    The doctor's clinic took around 47 days.  It seems the time was put to good use.  While kval.com's story centers on a patient who received a letter stating that SSNs were compromised, at least two commenters claim that there is no mention of Social Security numbers in their letters.

    This obviously means that at least two sets of letters were created: one for patients whose SSNs were on the computer and another for those whose weren't.

    I think it's also safe to assume that fewer than 200 people were affected, seeing how the media picked up the story from a patient, and not from the doctor or the doctor's representatives.

    Computer Disk Encryption: More than an Option but not a Requirement

    Obviously, the use of an encryption program for medical computers would have been a win-win in this situation.

    It's a win for patients because they don't have to be concerned about their information falling into the wrong hands.  It's a win for the clinic because they don't have to worry about patient information falling into the wrong hands; because they don't have to deal with irate patients/clients; because HITECH provides safe harbor from breach notification requirements if encryption is used; and because they don't have to provide post-breach services like credit monitoring, which affect the bottom line.

    Yes, HIPAA and HITECH refer to encryption software as an optional, "addressable" issue.  But, where the rubber meets the road, there's nothing "optional" about encryption.  If you are a doctor with a private practice who has hundreds of patients (or more), encryption is as de rigueur as malpractice insurance.


    Related Articles and Sites:
    http://www.kval.com/news/local/Stolen-computer-jeopardizes-medical-records-148199695.html

     
  • Cost Of A Data Breach: University Of Hawaii Settles Data Breach Lawsuit

    The toll of a data breach can be very high; that's why the use of data protection tools like full disk encryption from AlertBoot is recommended.

    On the other hand, sometimes those costs can be, for lack of a better word, reasonable.  For example, the University of Hawaii (UH) has settled its data breach class action lawsuit by offering two years of credit protection and fraud restoration services.  Class members have until May 1 to sign up.

    The lawsuit was the first of its kind in the state of Hawaii, according to kitv.com, and represented approximately 98,000 faculty, students, alumni, and guests.  It involved five separate data breach incidents.

    Depending on who's being quoted, credit monitoring and restoration services can cost anywhere from $5 to $20 per month.  So, the University is probably facing something toward the lower end of the range.  Still, the numbers don't look good.

    • 100% signup: $490,000 to $1,960,000
    • 50% signup: $245,000 to $980,000
    • 25% signup: $122,500 to $490,000

    The last time I checked, people generally sign up at rates in the single digits or low teens (such as in this example from 2006), although it's not unheard of to see much higher rates.  About 6,000 people have signed up already (staradvertiser.com) with fewer than ten business days to go, so it looks like UH will see the "classic" low-teens scenario, and the settlement will probably cost them less than $200,000.  (One way of measuring that amount?  16 full-time non-resident students for the upcoming year, per the tuition schedule posted online.)  According to govinfosecurity.com, UH estimates the services will cost $550,000.
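
    For what it's worth, here's a rough back-of-the-envelope sketch (in Python) of the figures above.  It treats the quoted $5-to-$20 figure as a flat per-enrollee cost, which is how the bullet-point ranges above appear to have been computed, rather than as a monthly charge over the two years -- that's my assumption, not anything UH or the settlement has published.  The 13% and 6,000-enrollee rows are just the "low teens" and current-signup scenarios mentioned above.

        # Rough sketch of the settlement-cost ranges discussed above.
        # Assumption: $5-$20 is treated as a flat per-enrollee cost,
        # matching how the bullet-point ranges appear to be calculated.
        CLASS_SIZE = 98_000          # faculty, students, alumni, and guests
        LOW, HIGH = 5, 20            # dollars per person who signs up

        for rate in (1.00, 0.50, 0.25, 0.13, 6_000 / CLASS_SIZE):
            enrolled = round(CLASS_SIZE * rate)
            print(f"{rate:>6.1%} signup: ${enrolled * LOW:>9,} to ${enrolled * HIGH:>9,}")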

    The above costs are in addition to the usual expenditures, of course: notifying those affected by the breach, conducting research on what went wrong, setting up a defense against the lawsuit, etc.

    So, how the heck is the cost "reasonable?"  Well, the settlement is, apart from doing nothing, the least the UH could have done.  Keep in mind that the word "reasonable" is relative: it's certainly better than bearing all of the above costs and also getting fined $1.5 million.  As data breaches on nearly 100,000 people go, this was a pretty inexpensive one, I'd say (for the defendant, not the plaintiffs).

    Of course, at the end of the day, it's money spent on cleaning up after an incident.  It would have been better spent on data protection, such as encryption software.  And locking file cabinets.  One of the breaches included the theft of paper documents.


    Related Articles and Sites:
    http://www.kitv.com/news/hawaii/Judge-approves-Univ-of-Hawaii-data-breach-class-action-settlement/-/8905354/10959520/-/4m7sx6z/-/index.html
    http://www.staradvertiser.com/news/breaking/147850005.html
    http://www.govinfosecurity.com/university-breach-settlement-approved-a-4685

     
  • Disk Encryption: Emory Healthcare Breaches Data on 315,000 Patients

    Emory Healthcare announced that the theft of 10 computer disks has affected approximately 315,000 people who received treatment at Emory University Hospital, Emory University Hospital Midtown (formerly Crawford Long Hospital), and the Emory Clinic Ambulatory Surgery Center.  It wasn't disclosed whether the information was protected with data encryption software like AlertBoot.

    However, considering that Emory is most probably a HIPAA-covered entity, and that HIPAA regulations require the public announcement of PHI breaches involving more than 500 people when encryption software is not used to secure the data, it appears that encryption was not used.

    17 Years' Worth of Data, Including SSNs for 228,000 People

    If you have computer disks storing sensitive patient PHI that your company has collected over a period of 15+ years, perhaps some due diligence is in order regarding the use of medical info encryption software.  I mean, HIPAA does provide a carrot for using encryption to protect PHI.

    According to various sources:

    • 10 computer disks that contained backup data went missing from storage at Emory Hospital between February 7 and February 20, 2012
    • Approximately 315,000 people are affected by this latest PHI data breach
    • Information on the patients was collected between September 1990 and April 2007
    • Information included patients' names, dates of surgery, diagnoses, surgical procedure, device implant information, and surgeons' names
    • Approximately 228,000 patients also had their Social Security numbers breached
    • Updated: The CEO admitted that "the discs were not stored according to protocol." (ajc.com)

    Also, several sources are reporting that the disks were backups.  It sounds like Emory is pretty confident that the breach will be contained because "the data and discs are not readable on a computer and are only compatible with a system the group no longer uses" (patch.com).  The system was deactivated in 2007 (11alive.com).

    At least one site (ajc.com) is reporting that "discs had been stored in an office" although most are reporting that the data was taken from a "storage location."  The latter sounds a little more secure than the former, eh?

    Questions, Questions, Questions

    I've taken a look at the HIPAA Wall of Shame, and it looks like Emory Healthcare will rank #15 out of all data breaches reported to the OCR since 2009, indicating that the incident is not only a major data breach but a historic one as well.

    Which leaves one wondering "what were they thinking?"  In fact, the story puzzles me in many ways:

    • Were these disks or discs?  The former generally refers to any type of digital media (floppy disks, internal hard disks, external hard disks, etc.), whereas the latter is used for CDs and DVDs.  Emory's own press release uses "discs," so I'll assume it was optical media, though I've seen it reported otherwise.  (Personally, I can't advise someone to use hard disks as a long-term backup solution, but I've seen it happen, so I'm left wondering.)
    • Why weren't they using encryption on the backed up data?
    • How secure was this office / storage location?  This ties in to the above bullet on "encrypted data."  While HIPAA does not require the use of encryption, it does require the data be reasonably secured.  It wasn't.  Why?
    • Why is the information not readable?

    Regarding the last question, I get the feeling that the information may have been compressed -- as backup data usually is -- before being burned to a CD.  The easiest way to protect this data would have been to encrypt the information after it was compressed but before it was burned to a disc.  (Always compress and then encrypt, as opposed to the other way around.  Since encryption randomizes data, doing the encryption first means you won't get efficient data compression later.)
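
    As a rough illustration of that ordering (a generic sketch, not Emory's actual backup process; it assumes Python and the third-party "cryptography" package), compressing first shrinks the repetitive backup data dramatically, while compressing after encryption gains almost nothing because good ciphertext looks random:

        # Compress-then-encrypt vs. encrypt-then-compress (generic demo only).
        import zlib
        from cryptography.fernet import Fernet

        backup = b"patient record, patient record, patient record, " * 1000
        f = Fernet(Fernet.generate_key())

        compress_then_encrypt = f.encrypt(zlib.compress(backup))
        encrypt_then_compress = zlib.compress(f.encrypt(backup))

        print(len(backup))                  # the raw, highly repetitive backup data
        print(len(compress_then_encrypt))   # small: repetition was squeezed out first
        print(len(encrypt_then_compress))   # about as large as the raw data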

    A further criticism: they got rid of the system that was used to read the data, but they did not get rid of the data?  Why?

    Update: I found answers to a couple of the questions above:

    The information on the discs was not encrypted because it was associated with such an outdated system, Fox said. Encryption would have made viewing the data more difficult. But Fox said the information on the discs could not be easily viewed without the software needed to read it, which was not on the discs. [ajc.com, my emphases]

    At the risk of sounding snarky: encryption is meant to make viewing data more difficult.  Yeah, I know what the CEO meant.  Still, there isn't any merit to the given explanation.  After all, the missing software only made viewing the data difficult, not impossible.  And this is a system that's been deactivated: nothing about accessing its data is meant to be easy.

    Which brings us to the supposed "deactivated system" that is no longer used:

    The investigation has determined that the discs were removed sometime between February 7, 2012, and February 20, 2012. They contained data files from an obsolete software system that was deactivated in 2007. This deactivated system was accessed very infrequently and only as requested by either patients or their physicians. The last time data were accessed was in 2010. [emory.edu]

    Arguably, this is not a deactivated system.  It most certainly is not "not used anymore."  It's more like it's on stand-by.  Regardless, it helps explain why the discs were kept around even as the system was disabled and "no longer being used."

    This is possibly the worst risk-reward scenario ever: the reward of a little bit of convenience for one IT guy restoring the data, once in a while (possibly every five years or whatever), against the risk of breaching the information of over 300,000 people (who'll probably be irate) and possibly HIPAA fines and (definitely) a HIPAA breach investigation.

    The HIPAA Breach Angle

    The case also leaves me wondering how the BlueCross BlueShield of Tennessee settlement applies in this case.  The BCBS data breach involved over 1 million people and a $1.5 million penalty.  If we were to assign about $1 per record breached, Emory could be facing a penalty of $315,000.

    The fact that the information was not easily readable wouldn't be a factor, I think.  For starters, neither was BCBS's data, since it was mostly in audio format.

    Plus, so what if the data is not easily accessible?  Someone took CDs full of data from a storage location that, arguably, was used to store PHI.  It stands to reason that the motive behind the act was the patient data.


    Related Articles and Sites:
    http://news.emory.edu/stories/2012/04/ehc_missing_data/campus.html
    http://www.bizjournals.com/atlanta/news/2012/04/18/data-loss-at-emory-healthcare-exposes.html
    http://www.ajc.com/news/emory-says-315-000-1421327.html
    http://www.ajc.com/news/patient-data-missing-for-1421620.html
    http://northdruidhills.patch.com/articles/patient-data-missing-from-emory-healthcare-cbfff869
    http://www.11alive.com/news/article/238755/40/Emory-Healthcare-missing-info-on-patients-from-17-year-period

     
  • How To Hide Files? To Start With, Get Your Facts Straight

    I read a dated zdnet.com article titled "How To Hide Files From The Law."  The article is actually about Fifth Amendment rights, and not necessarily about hiding files from the law per se.  (That being said, how do you do it?  Use laptop encryption software like AlertBoot.)

    In fact, I had covered the same "forcing decryption of data is a violation of Fifth Amendment rights" story previously.  So, there was nothing extraordinary about the article.  I moved on to see what insightful comments I'd run across, and, much to my chagrin, I found an exchange in which a commenter equated password protection with encryption.

    I'd like to take this moment to remind people that password protection is not encryption.  Let me put it this way: if you're using password protection to hide files from the law, you won't be forced to decrypt your laptop's contents for the government, because the government will easily access your data anyway.

    Passwords are used to access encrypted information, in the same way physical keys are used to open locked doors; but you'd never mistake the key for the walls, doors, and barred windows that actually make a building secure, would you?

    Well, when you state that password protection is encryption, you're making that mistake.
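
    To make the distinction concrete, here's a minimal sketch (in Python, using the third-party "cryptography" package -- my choice of tool, not one mentioned in the article, and with made-up sample data).  "Password protection" alone just gates the application; the bytes on disk remain readable.  Encryption actually transforms the data using a key derived from the password:

        import base64
        import os

        from cryptography.fernet import Fernet
        from cryptography.hazmat.primitives import hashes
        from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

        secret = b"SSN: 000-00-0000"         # hypothetical sensitive data
        password = b"correct horse battery"  # hypothetical password

        # 1. "Password protection": the data is stored as-is; the password only
        #    decides whether the application will display it. Reading the raw
        #    file (or pulling the drive) bypasses the check entirely.
        password_protected = {"password": password, "data": secret}
        print(password_protected["data"])    # plainly readable on disk

        # 2. Encryption: a key is derived from the password and the data itself
        #    is transformed. Without the password, there's nothing readable left.
        salt = os.urandom(16)
        kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32,
                         salt=salt, iterations=480_000)
        key = base64.urlsafe_b64encode(kdf.derive(password))
        ciphertext = Fernet(key).encrypt(secret)
        print(ciphertext)                    # opaque token without the key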


    Related Articles and Sites:
    http://www.zdnet.com/blog/storage/how-to-hide-files-from-the-law/1624

     