AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.
  • Schools In EU Could Face Heavy Fines For Data Breaches

    Beginning in May 2018, schools in EU member countries (including the UK, despite Brexit) must comply with the new General Data Protection Regulation (GDPR). Not doing so could subject them to fines of up to 4% of their turnover, a figure that created quite the buzz when it was announced for businesses earlier this year (some news sites debated the implications for notorious privacy stragglers like Facebook and Google).

    Walk, Don't Run

    A look at the data breaches of the past ten years readily shows that the education sector is not very good at protecting itself from data security incidents, be they hacking intrusions or lost data storage devices. It must be pointed out, however, that this is usually because schools lack the resources to do better: schools trying to stretch their shrinking budgets cannot afford the latest and greatest in technology (leading to banks of old computers running long-abandoned software that is virtually impossible to secure), much less a proper IT security staff.
    In the short term, it looks like the GDPR could create more problems than it solves, especially because many schools are unaware of their responsibilities. In the UK, the Information Commissioner's Office (ICO) has provided some tips on what must be done – but, as is usually the case, this is not to be taken as a checklist where you can cross off items and declare yourself compliant.
    There is much to do, according to an expert interviewed by schoolsweek.co.uk:
    • Replace out-of-date IT equipment and ensure warranties exist for current equipment.
    • Designate a data protection officer.
    • Ensure a formal contract exists with data processors, which need to meet industry standards.
    • Document where data goes and how it is used.
    There are other experts, however, who advise against rushing into things, as some issues still need to be resolved. Even so, they, too, advise starting on "preparatory tasks."

    Other Side of the Pond

    Meanwhile, in the US, a school district in Maryland has opted to stop collecting Social Security numbers for students:
    Director of Technology Infrastructure Edward Gardner, who oversaw the development of the new data policy, said the school system would not collect student Social Security numbers "unless explicitly necessary," and he could not think of a reason it would be.

    This may very well be the best approach to data security: if you don't need it, don't collect it. Far too many organizations take the approach of "collect it first and deal with it later." The problem is, of course, that it never gets dealt with at all. At least, not in the sense of securing the data – be it encrypting, scrubbing, deleting, etc.

    Unsurprisingly, a data breach ensues somewhere.

    Related Articles and Sites:
    https://schoolsweek.co.uk/schools-face-hefty-fines-for-data-breaches-under-new-eu-laws/
    https://www.databreaches.net/school-district-in-maryland-stops-collection-of-social-security-numbers/
    http://www.govtech.com/security/School-District-in-Maryland-Stops-Collection-of-Social-Security-Numbers.html

     
  • UK ICO to SMEs: Data Protection Laws Apply to You

    The United Kingdom's Information Commissioner's Office (ICO) has slapped Boomerang Video Ltd. (BV), a company that rents out video games, with a £60,000 fine. The monetary penalty is the result of a 2014 data breach in which personal details of 26,000 people were stolen.
    The fine deserves another look because BV's data breach was the result of an attack; it is not an instance of the "breachee" having a hand in the data breach, e.g., never changing the default password or using software that was out of date. Nothing that foolish.
    At the same time, BV certainly could have done much, much better to secure their online presence.

    SQL Injection Attack

    As the ICO notes, the breach took place via a SQL injection attack, which in turn allowed the hackers to gain access to the company's servers by guessing a password "based on the company's name." Once inside, of course, all sorts of shenanigans took place.
    The hacker (or hackers) was aided by certain practices that BV engaged in, as listed by databreaches.net:
    • Boomerang Video failed to carry out regular penetration testing on its website that should have detected errors.
    • The firm failed to ensure the password for the account on the WordPress section of its website was sufficiently complex.
    • Boomerang Video had some information stored unencrypted and that which was encrypted could be accessed because it failed to keep the decryption key secure.
    • Encrypted cardholder details and CVV numbers were held on the web server for longer than necessary.
    The above is not a full list (for example, the company also stored cards' security codes, the storage of which is prohibited once a payment is processed), but it already paints quite a picture.
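    For readers unfamiliar with how a SQL injection attack actually works, the root problem is user input being pasted directly into a database query. Below is a minimal sketch in Python (the table, column, and function names are hypothetical, not anything from Boomerang Video's actual site) contrasting the vulnerable pattern with the parameterized query that defuses it.

```python
import sqlite3  # stand-in for whatever SQL database a site actually uses

def find_user_vulnerable(conn: sqlite3.Connection, username: str):
    # DANGEROUS: user input is spliced straight into the SQL string.
    # Submitting  ' OR '1'='1  as the username turns the WHERE clause
    # into a condition that is always true, dumping every account.
    query = "SELECT id, username FROM users WHERE username = '" + username + "'"
    return conn.execute(query).fetchall()

def find_user_safe(conn: sqlite3.Connection, username: str):
    # SAFE: the value is bound as a parameter, so the driver treats the
    # input strictly as data, never as SQL syntax.
    query = "SELECT id, username FROM users WHERE username = ?"
    return conn.execute(query, (username,)).fetchall()

# Quick demonstration against a throwaway in-memory database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, username TEXT)")
conn.executemany("INSERT INTO users (username) VALUES (?)", [("alice",), ("bob",)])

print(find_user_vulnerable(conn, "' OR '1'='1"))  # returns every row in the table
print(find_user_safe(conn, "' OR '1'='1"))        # returns nothing
```

    Regular penetration testing, one of the items the ICO faulted the company for skipping, is exactly the kind of exercise that catches the first pattern before an attacker does.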

    Surprising Requirements?

    What may be surprising to most Britons is the level of security awareness a business must have, even if it happens to be a small or medium-sized enterprise. SQL injection attacks, password complexity, penetration testing, securing encryption keys… these are not terms most people are familiar with. You may hear them here and there once in a while, maybe even have a passing knowledge of what they entail.
    But actually doing it? Some of the listed practices lie firmly in the realm of professionals who charge a lot of money for their services. Unsurprisingly, businesses that are not exactly raking it in often do not seek or engage the help required to protect their clients (and to meet the law's standards).
    On the other hand, BV's website debuted in 2005, and "remedial action" to secure the site was taken in 2015. That's a long time to go without checking whether things are secure, especially considering what the internet has morphed into: among other things, a fast-moving place where data crimes grow more frequent and more severe by the day.
    The lesson to take away from this incident comes not from the insight you can glean from the nature of BV's digital sins and the monetary fine it was levied, but from the ICO enforcement manager's own words:
    "Regardless of your size, if you are a business that handles personal information then data protection laws apply to you.

    "If a company is subject to a cyber attack and we find they haven't taken steps to protect people's personal information in line with the law, they could face a fine from the ICO. And under the new General Data Protection Legislation (GDPR) coming into force next year, those fines could be a lot higher."
    The government is sending a signal, loud and clear, and in oh-so-many ways. Are businesses listening?  
    Related Articles and Sites:
    https://www.databreaches.net/uk-warning-to-smes-as-firm-hit-by-cyber-attack-fined-60000/
     
  • EU Proposes End-to-End Encryption and Other Security Measures

    Last week, the European Parliament's Committee on Civil Liberties, Justice and Home Affairs released a draft proposal that would require the use of end-to-end encryption. It would also strike down legal attempts to force backdoors into encryption software or to weaken the security of services provided by communications providers.
    Amendment 36
    Service providers who offer electronic communications services should process electronic communications data in such a way as to prevent unauthorised access, disclosure or alteration, ensure that such unauthorised access, disclosure or alteration is capable of being ascertained, and also ensure that such electronic communications data are protected by using specific types of software and encryption technologies.
    Amendment 116
    The providers of electronic communications services shall ensure that there is sufficient protection in place against unauthorised access or alterations to the electronic communications data, and that the confidentiality and safety of the transmission are also guaranteed by the nature of the means of transmission used or by state-of-the-art end-to-end encryption of the electronic communications data. Furthermore, when encryption of electronic communications data is used, decryption, reverse engineering or monitoring of such communications shall be prohibited. Member States shall not impose any obligations on electronic communications service providers that would result in the weakening of the security and encryption of their networks and services.

    Many of the proposals, but especially the above two, run counter to certain governments' recent actions that would cripple encryption and other security measures for everyone in the name of fighting terrorism and other crimes.

    It is a welcome breath of sanity for a world that increasingly appears to be regressing back to an imagined time of stability.  
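    For readers wondering what "state-of-the-art end-to-end encryption" means in concrete terms, here is a hedged sketch using the primitives in Python's third-party cryptography package (the names and message are purely illustrative, and real messaging protocols layer much more on top): each endpoint holds its own private key, the two sides derive a shared symmetric key, and the provider relaying the ciphertext never holds anything it could decrypt.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each endpoint generates its own keypair; only the public halves ever leave the device.
alice_private = X25519PrivateKey.generate()
bob_private = X25519PrivateKey.generate()

def derive_shared_key(my_private, their_public) -> bytes:
    """Both sides derive the same 256-bit key from the Diffie-Hellman shared secret."""
    shared_secret = my_private.exchange(their_public)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"e2e-demo").derive(shared_secret)

alice_key = derive_shared_key(alice_private, bob_private.public_key())
bob_key = derive_shared_key(bob_private, alice_private.public_key())
assert alice_key == bob_key

# Alice encrypts; the service provider only ever sees (nonce, ciphertext).
nonce = os.urandom(12)
ciphertext = AESGCM(alice_key).encrypt(nonce, b"meet at noon", None)

# Only Bob, holding the matching key material, can read the message.
assert AESGCM(bob_key).decrypt(nonce, ciphertext, None) == b"meet at noon"
```

    The point of the amendments is that providers could not be compelled to weaken or bypass that arrangement: there is no decryption key sitting on their servers to hand over.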

    Does This Mean the Bad Guys Are Protected?

    Of course, there will be those who, in typical knee-jerk fashion, will cry that we're giving the bad guys the upper hand. Nothing could be further from the truth.

    Laws protecting privacy always make an exception for illegal activities, and the EU provides exceptions for those who seek to abuse the system. For example, while Amendment 116 would make it impossible to decrypt any text messages that are stored in a smartphone (the smartphone itself is not protected, it seems, since the law specifically mentions "electronic communications data" – that is, information that is exchanged between two or more people), an exception would kick in if the messages were part of an investigation.

    What the new amendments will do is further cement the protections long afforded to law-abiding citizens, and check those who would slowly chip away at them under one pretext or another.

    Read the Fine Print, Though

    Some media outlets covering the amendments report that this means the EU is recommending against backdoors (some even go as far as saying it is banning them). This claim needs a little clarification, since it seems overly broad.

    In each of the amendments referenced, the terms "electronic communications data" and "electronic communications provider" are included. Thus, it would appear that while backdoors are being given a red light, the prohibition is limited to encryption for data in motion. There is nothing here to suggest that the same is being extended to data-at-rest encryption, the type of cryptography generally used to secure, for example, all the contents of a laptop.
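    To make the data-at-rest side of that distinction concrete, here is a minimal sketch of file-level encryption using Python's third-party cryptography package (the file name and key handling are placeholders; real products such as full disk encryption work at the disk or volume layer and manage keys far more carefully).

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Data at rest: what sits on the disk is only ever the encrypted form.
key = Fernet.generate_key()        # in practice, held in a key-management system
cipher = Fernet(key)

record = b"client file: sensitive notes"   # illustrative content
with open("records.enc", "wb") as f:
    f.write(cipher.encrypt(record))

# Without the key, a stolen copy of records.enc is just noise to the thief.
with open("records.enc", "rb") as f:
    assert cipher.decrypt(f.read()) == record
```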

     

    Related Articles and Sites:
    http://www.tomshardware.com/news/european-parliament-end-to-end-encryption-communications,34809.html
    https://blog.lukaszolejnik.com/proposed-amendments-to-eprivacy-regulation-are-great/
    http://www.europarl.europa.eu/sides/getDoc.do?pubRef=-%2f%2fEP%2f%2fNONSGML%2bCOMPARL%2bPE-606.011%2b01%2bDOC%2bPDF%2bV0%2f%2fEN

     
  • Louisville Hall of Justice Computer Stolen And Recovered, Hard Drive Still Missing

    There are reports out of Kentucky that a computer used at the Louisville Hall of Justice was stolen. The notable thing about this story, however, is that the computer was eventually recovered. Even more notable: the recovered computer was missing its hard drive. This small fact can be interpreted in many ways, but with daily stories of stolen data being sold and traded on the internet, it's difficult not to conclude that the contents of the missing storage device have found, or will find, their way to the dark web.

    Terrible Security

    The stolen hard drive is presumed to have held sensitive personal information on "less than 175 individuals," the number of people contacted to alert them of the data breach. The personal information could include Social Security numbers, bank account numbers, and driver's license numbers that were included in the emails of the two Assistant County Attorneys who used the computer.

    Stop and think about what is being implied here, security-wise. Kentucky passed a breach notification law in 2014. It provides safe harbor if the sensitive information is encrypted or if it's believed that nothing will come of the data breach.

    The absence of the hard drive negates the latter condition. The fact that people are being alerted means that the former condition has not been satisfied, either.

    But, remember, the sensitive data was stored in emails. So, not only was the computer not encrypted but neither were the emails. The implication, then, is that the two attorneys were shooting (or perhaps only receiving?) personal information around the internet without having it encrypted first.

    That's not good. When information is sent and received over the internet, chances are that somebody, somewhere can intercept it. ISPs and mail servers handle it as a matter of course, because they're in the business of forwarding emails to the correct inbox.

    But criminals, too, can intercept emails, by hacking strategic servers either to retain copies or to scan an email's contents for specific patterns before it's sent on. Sending sensitive data in unencrypted emails, then, is a digital information security no-no.
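    As a hedged sketch of what "encrypted first" can look like in practice, the snippet below uses only Python's standard library to send mail over a TLS-protected connection (the server, addresses, and credentials are hypothetical). Note that STARTTLS only protects the message between your machine and the mail server, hop by hop, which is why details like Social Security numbers are better encrypted separately or kept out of email bodies altogether.

```python
import smtplib
import ssl
from email.message import EmailMessage

# Hypothetical message; in practice, keep SSNs and account numbers out of email entirely.
msg = EmailMessage()
msg["Subject"] = "Case file follow-up"
msg["From"] = "attorney@example.gov"
msg["To"] = "clerk@example.gov"
msg.set_content("The documents are on the secure file share, not attached here.")

context = ssl.create_default_context()
with smtplib.SMTP("smtp.example.gov", 587) as server:
    server.starttls(context=context)  # upgrade the connection to TLS before anything sensitive moves
    server.login("attorney@example.gov", "app-specific-password")  # placeholder credentials
    server.send_message(msg)
```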

    Also, it was revealed that the computer was stolen from a publicly accessible conference room.

    Terrible, security-wise. The only way a non-encrypted computer with sensitive information should be allowed to remain in a publicly accessible conference room of a Hall of Justice is if the Hall of Justice is this one:

    [Image: the Superfriends' Hall of Justice, via Wikipedia]

    At least then you'd have some super-duper friends around to stop any shenanigans.

     

    Related Articles and Sites:
    http://www.wdrb.com/story/35593341/computer-stolen-at-the-hall-of-justice-puts-some-at-risk-for-identity-theft
    http://www.alertboot.com/blog/blogs/endpoint_security/archive/2014/04/11/kentucky-data-breach-law-signed.aspx

     
  • Target Settles With 47 Attorneys General Over 2013 Hack

    One of the biggest hacks in history was the Target credit card hack in the winter of 2013, which affected approximately 60 million people. Four years later, Target is finally putting the situation behind it, settling legal action brought against it by 47 states. The amount: $18.5 million.
    This does not include the many millions the Minnesota-based retailer paid to the credit card company Visa, victims, banks, and others, which pushes the total of legal fines and settlements to well over $100 million. (It also doesn't include intangibles like the hit to the brand's goodwill, the money Target spent figuring out how it was attacked, the money spent fixing its security issues, and so on.)
    Data breaches are expensive to deal with, as the Target and other incidents reveal. So far, so normal: the news about Target's settlement is non-news.  

    Raising Eyebrows

    Except there is a twist. Under a section of the settlement termed "Specific Safeguards," Target agrees to specific data security protocols.
    Granted, you can find similar language in other agreements signed by many companies: the company agrees to use encryption anywhere sensitive data is stored, it promises to do a better job training employees, etc. But what Target is agreeing to is much more specific in comparison. For example:
    • "TARGET's Cardholder Data Environment shall be segmented from the rest of the TARGET computer network," or,
    • "TARGET shall deploy and maintain controls, such as, for example, an application whitelisting solution, designed to detect and/or prevent the execution of unauthorized applications…"
    There's more where that came from.
    Siloing data? Whitelisting? This is the type of language you expect from IT, not from a group of people who spent their time trying to pass the bar. It's not inconceivable that IT experts were hired as part of the settlement drafting process, or that the AGs (or their underlings) know their way around digital data security, and that the settlement language simply reflects that.
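    To give a sense of what an "application whitelisting solution" is doing under the hood, here is a minimal sketch in Python (the hashes and the enforcement hook are hypothetical; commercial products enforce this at the operating-system level): a program is allowed to run only if its cryptographic fingerprint appears on an approved list.

```python
import hashlib
from pathlib import Path

# Hypothetical allowlist: SHA-256 digests of the executables approved to run.
APPROVED_HASHES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",  # placeholder digest
}

def is_execution_allowed(executable: Path) -> bool:
    """Allow a binary to run only if its hash is on the approved list."""
    digest = hashlib.sha256(executable.read_bytes()).hexdigest()
    return digest in APPROVED_HASHES

# Anything not on the list, unauthorized applications included, is denied by default.
```

    That is the level of granularity the settlement gets into.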
    However, such specific details were nonexistent in past settlements. It almost seems as if, realizing that the past ten years of suing companies over data breaches have changed nothing, the government is now taking charge and spelling out basic data security specifications that companies should follow, minimizing the wiggle room and loopholes that vague wording creates.
    There will be detractors: codifying specific technologies into law today causes problems if the law doesn't keep up with progress. Conceivably, you could run into a situation where an offending party is protected by the law despite not implementing adequate security. An example: a law passes requiring that a certain encryption algorithm be used, but soon afterward an unfixable vulnerability is found in it. Most companies switch to a different type of encryption, but not all do. When these stragglers are subsequently hacked via that vulnerability, they are nevertheless legally protected because the law wasn't updated in time.
    The good news in the Target case is that a settlement, while having legal effect, is not law. So, no unintended consequences there.
    In addition, it sends a signal to other companies about what is acceptable and what isn't. If Attorneys General across the continental USA slammed a Fortune 500 company for, say, not siloing data, then it's not inconceivable that they'll do the same when they encounter a similar situation again. Pointing out specific practices and technologies in settlements should also provide ammunition to IT executives who try to implement them in the enterprise but find themselves hamstrung by higher-ups.
     
    Related Articles and Sites:
    https://www.databreaches.net/target-to-pay-47-states-18-5m-to-settle-data-breach-case/
    https://www.bna.com/target-pay-47-n73014451437/
    http://money.cnn.com/2015/03/19/technology/security/target-data-hack-settlement/
    http://src.bna.com/o8E
     
  • Global Malware Emergency Shows Why Backdoors Are Dangerous

    The big data security news this week is, of course, the WannaCry ransomware outbreak that reared its head last Friday, continued to grow over the weekend, and threatened to really become something had it not been for a stroke of serendipity: a kill switch, possibly included by mistake, baked into the malware.
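    To see why registering a single domain name could stop a global outbreak, here is a hedged sketch of the kill-switch logic as it has been widely described (the URL below is a placeholder, not the actual hard-coded domain): before doing anything else, the malware checks whether a particular web address answers, and if it does, it simply exits. Once a researcher registered that domain, every new infection tripped the check and shut itself down.

```python
import sys
import urllib.request

# Placeholder; the real malware hard-coded a long, gibberish, unregistered domain.
KILL_SWITCH_URL = "http://example-unregistered-domain.invalid/"

def kill_switch_tripped(url: str = KILL_SWITCH_URL, timeout: float = 5.0) -> bool:
    """Return True if the hard-coded domain answers, i.e. the kill switch is live."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True   # domain resolved and responded: stop immediately
    except OSError:
        return False  # domain unreachable: (in the malware) carry on

if kill_switch_tripped():
    sys.exit(0)  # the equivalent of WannaCry quietly standing down
# ...otherwise the worm would go on to spread and encrypt files...
```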
    Many organizations and traditional news outlets have covered the situation from every angle possible. Of all the different reports and thought-pieces, though, Microsoft's own response might be the most controversial.
    One could be excused for thinking that Microsoft is just engaging in PR, passing the buck, refusing to shoulder responsibility, etc. After all, the ransomware exploits a flaw that's been present in every single Microsoft operating system since Windows XP. That makes it a 15-year-old flaw… and it means that Microsoft had 15 years to identify it and fix it. It certainly paints a bad picture for Redmond. However, the blame cannot fall on Microsoft alone: for Act II, the same or different hackers could decide to target a flaw found in Apple's operating system or in any of the various flavors of Linux. What's to stop them? After all, this latest attack, global in nature, was enabled by the leak of NSA hacking tools. And if rumors are true, the agency must have methods for exploiting weaknesses besides those found in Windows.
    It's not a secret that intelligence agencies make use of flaws (in fact, some have accused them of "hoarding" them). And now, unsanctioned hackers are getting into the game as well, in a big way, thanks to those same hoarded flaws being leaked online.
    An argument could be made that the flaws couldn't have been exploited if, instead of keeping a tight lid on them, the government had alerted companies to the flaws so they could be fixed, or if the government had actually managed to keep a tight lid on things. But then again, agencies like the NSA are not in the business of identifying flaws and having them patched up. That's not their raison d'être. In fact, disgusting as it may be, it feels a little silly to criticize them for doing exactly what they were chartered to do. It'd be like criticizing a dominatrix for inflicting pain and humiliation.
    Another reason why you can't put all the blame on the government: there is some heft to the observation that Microsoft worked to fan the flames by coming up with a patch back in March, but allowing only paying customers to access it. One could argue that, NSA hoarding flaws or not, the situation would have been dampened if Microsoft hadn't tried to monetize the patch.
    Regardless of who you think is "responsible" for what happened, the disaster shows exactly why a security backdoor is a bad idea.

    Flaws => Backdoors

    Last year, the FBI took Apple to task for refusing to cooperate with a certain demand: that the company somehow provide a backdoor to encrypted iPhones. Of course, the FBI never said outright that they wanted a backdoor. But, in the end, it was exactly what they were arguing for.
    The counterargument from Apple and the rest of the technology sector was that backdoors cannot be completely controlled and thus will never be safe, a tune that data security professionals have been singing since the early 1990s. The tech sector's argument went so far as to imply that nothing less than the security of the free world was at stake. Many called such arguments spurious and melodramatic, an overhyped situation just like the Y2K bug scare (although some argue that the Y2K threat never materialized precisely because a scared-witless world poured billions into fixing the problem in time).
    As long as the situation remained theoretical, such criticism had a leg to stand on. But with the temporary shutdown of an entire country's healthcare network (the UK's National Health Service) in the rearview mirror, it's hard to imagine that the tech sector's arguments can still fall on deaf ears. When it comes to data security, unintended flaws and purposely placed backdoors are essentially the same because they lead to the same situation: at some point, someone who shouldn't know about them is bound to find them and exploit them.
    The many data security scares of the past, for all their coverage in the media, scarcely managed to shift firmly held opinions on the "need" for a backdoor. Some went as far as stating that they were sure the brilliant minds behind encryption would find a way to create a secure backdoor that is inaccessible to the bad guys (despite the fact that those same brilliant minds were emphatic that it couldn't be done).
    One wonders how they could have been (and, possibly, currently are) so deluded. Perhaps it was because there wasn't a visceral-enough crisis to jolt their thoughts on what could be. Or, perhaps they thought that what could be wasn't really going to happen, at least not in their lifetime. This latest development will hopefully work to dampen their misguided but well-intentioned enthusiasm for hamstringing security, at least for the time being.

     

    Related Articles and Sites:
    https://www.bloomberg.com/news/articles/2017-05-14/hospitals-gain-control-in-ransom-hack-more-attacks-may-come

     