AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

AlertBoot Endpoint Security

  • Survey Says Data Breaches Result In Long-Term Negative Impact

    According to darkreading.com, a recent survey commissioned by CA Technologies has shown that there can be serious repercussions for companies that fall victim to data breaches. If the survey's conclusions are to be believed, about half of the organizations that were involved in a data breach see "long-term negative effects on both consumer trust (50%) and business results (47%)." This is surprising, since the general feeling is that businesses involved in a data breach are not penalized at an appropriate level.
    For example, Equifax revealed a history-making data breach almost one year ago. Its stock price took a nose-dive, people were fired, financial penalties were proclaimed, people complained, lawsuits were filed, etc. Today, the stock price has recovered quite a bit from its one-year lows. Lawsuits are being battled in court, with the very real possibility of a summary dismissal; if not, the company will probably settle for an amount that will be a drop in the bucket for a company its size. The proclaimed penalties were withdrawn in exchange for Equifax upping their security. People don't complain as much as they grumble sotto voce. Year-over-year revenue is up at Equifax.
    All in all, it looks like Equifax has weathered this storm quite nicely. Such has been the basic pattern for major companies involved in data breaches for at least the past decade.
    Once in a blue moon you'll hear of a company so adversely impacted by a data breach that it made other companies sit up and take notice. But such instances are few and far between.

    Survey Says…

    According to ca.com, among other things:
    • 48% - Consumers who stopped using the services of at least one organization due to a data breach.
    • 59% - Businesses that reported moderate to strong long-term negative impact to business results after a breach.
    • 86% - Consumers that prefer security over convenience.
    These figures are curious, especially the last one. It's known that people don't necessarily tell the truth on surveys, but the real issue in this instance is that a survey is only a snapshot in time. One need not doubt that nearly half the people surveyed stopped being customers of a breached entity; however, it would be more informative to know how long they've been boycotting a company – one day, one week, one month, one year? – and whether they're still doing so when followed up with some time later. (It should be noted that the survey did not define the length of "long-term," but one assumes it's longer than one year, in keeping with accounting terminology.)
    Likewise for the figure on businesses negatively affected by a data breach. Equifax, for example, would have claimed that they were seriously affected if surveyed three months after their public outing; however, their answer would have been different one year later. And five years from now? Who knows?
    And then you have that counterintuitive 86% figure: a clear majority of people prefer security over convenience? That certainly is news, especially considering that people's actions have not supported such a conclusion over the past decade.  

    Strong Laws and Enforcement

    The concluding remarks of the survey, in essence, are that companies need to improve their data security. (Also, that companies in the business of transacting personal information need to be more transparent about it. This was, after all, the year of the Cambridge Analytica scandal.) Will companies improve their data security? Can they? The answer is yes.
    But not because of consumer demand.
    Consumers of goods and services have been raising hell over data breaches for a long time now. Data breach-related lawsuits that have been filed worldwide probably number in the thousands. Public spankings and shamings exceed that number. All of it to no effect. The only thing that's been shown to encourage attention to security is the passage and enforcement of laws.
    Because each sovereign state approaches data breach ramifications in its own way, the world has become a living laboratory that reveals what works and what doesn't when it comes to increasing data security and curbing data abuses.
    Simply put, companies respond to financial penalties, as can be witnessed from Silicon Valley's behavior toward China and Europe, or how the United States healthcare sector significantly increased their data security only after regulators started hitting them with million-dollar fines.
     
    Related Articles and Sites:
    https://www.darkreading.com/risk/48--of-customers-avoid-services-post-data-breach/d/d-id/1332452
    https://www.ca.com/us/company/newsroom/press-releases/2018/ca-technologies-study-reveals-significant-differences-in-perceptions-on-state-of-digital-trust.html
    https://www.ca.com/us/collateral/white-papers/the-global-state-of-online-digital-trust.html
     
  • FBI Director Says Legislation Possibly A Way Into Encrypted Devices

    Last week, FBI Director Christopher Wray said that legislation may be one option for tackling the problem of "criminals going dark," a term that refers to law enforcement's inability to access suspects' data on encrypted devices. The implication is that, in the interest of justice and national security, the FBI will press for a law that will guarantee "exceptional access" to encrypted information. This most likely will require an encryption backdoor to be built on all smartphones, possibly on all digital devices that store data.
    It should be noted that the FBI emphatically denies that they want an encryption backdoor. One hopes they have taken this position because they're aware of the security problems backdoors represent; however, it's hard to ignore the possibility that the FBI is in spin-doctor mode. Their Remote Operations Unit, charged with hacking into phones and computers of suspects, uses terms like "remote access searches" or "network investigative techniques" for what everyone else would call "hacking" and "planting malware." Mind you, their actions are legally sanctioned, so why use euphemisms if not to mask what they're doing?
    If turning to legislation smells of déjà vu to old-timers, it's because this circus has been in town before. It set up its tent about 20 years ago and skipped town a couple of years later. And while many things have changed in that time, the fundamental reasons why you don't want encryption backdoors have not.

    A Classic Example of Why You Don't Want a Backdoor

    The FBI has implied time and again that they are in talks with a number of security experts who supposedly claim the ability to build "encryption with a backdoor" that cannot be abused by the wrong people. These security experts are not, the FBI notes, charlatans. Perhaps it is because of these experts that the FBI has not desisted from pursuing backdoors. This, despite the security community's overwhelming consensus that it cannot be done.
    It should be noted that Wray was asked by a Senator at the beginning of this year to provide a list of cryptographers that the FBI had consulted in pushing forth their agenda. To date, such a list has not been produced.
    (As an aside, according to wired.com, Ray Ozzie, arguably one of today's greatest minds in computing, has recently and independently proposed a way to install a backdoor without compromising the security of encryption. Here is the take of one of the world's leading security experts on it: the conclusion, in a nutshell, is that it's flawed and mimics unsuccessful solutions proposed in the past.)
    What is it about backdoors that their mere mention results in knee-jerk reactions from the security community? The answer lies in the fact that the community has been looking into this for a long, long time. In the end, it's the unknown unknowns that are the problem: encryption solutions run into surprises (bad ones) all the time. No matter how well-designed a solution is, it's impossible to prevent such surprises from happening.
    In June 2017, it was reported that over 700 million iPhones were in use. Not sold; in use. It can also be assumed that at least an equal number of Android devices are in use as well. That would be a lot of compromised devices if a backdoor were in effect and a bug were introduced.
    These issues cannot be legislated away. Furthermore, bugs merely represent one situation where a backdoor can lead to disaster. Others include the deliberate release of how to access the backdoor (think Snowden or Manning or the leak of CIA hacking tools); the phishing, scamming, conning, or blackmailing of the custodians of the backdoor; and the possibility of someone simply stumbling across the backdoor. Granted, the last one is highly unlikely, even more so than the others… but so are the chances of winning the lottery, and the world has seen hundreds, maybe thousands, of lottery winners.
    The point is that the chances of the backdoor being compromised are higher than one would expect.  

    Moral Hazard = FBI's Pursuit of the Impossible?

    One has to wonder why the FBI is so insistent on pursuing the impossible dream of an encryption backdoor that doesn't compromise on security. It would be easy to dismiss it as a case of legal eggheads not knowing math and science, or not having the imagination to ponder how badly things could go wrong.
    But perhaps it's an issue of moral hazard. Basically, there is very little downside for the FBI if a backdoor is implemented. Everyone knows that, if the FBI gets what it wants, they won't have direct access to the backdoor; it wouldn't be politically feasible. For example, prior to suing Apple in 2016, they suggested that Apple develop a backdoor and guard access to it. When the FBI presents an iPhone and a warrant, Apple unlocks the device. The FBI is nowhere near the backdoor; they're by the water-cooler in the lobby.
    The arrangement sounds reasonable until you realize that the FBI doesn't take responsibility for anything while reaping the benefits. The FBI does not have to develop, test, and implement the backdoor. Once implemented, it doesn't have to secure and monitor it. If there is a flaw in the backdoor's design, the FBI dodges direct criticism: they didn't design it, don't control it, etc. Last but not least, the onus is on the tech companies to resist foreign governments' insistence on being given access to encrypted data. Which you know will happen because they know the capability is there.
    It's a classic case of heads, I win; tails, I don't lose much.
     
    Related Articles and Sites:
    https://www.cyberscoop.com/fbi-director-without-compromise-encryption-legislation-may-remedy/
    https://www.theregister.co.uk/2017/10/05/apple_patches_password_hint_bug_that_revealed_password/
    https://www.ecfr.eu/page/-/no_middle_ground_moving_on_from_the_crypto_wars.pdf
     
  • Most of the Used Memory Cards Bought Online Are Not Properly Wiped

    According to tests carried out by researchers at the University of Hertfordshire (UK), nearly two-thirds of memory cards bought used from eBay, offline auctions, and second-hand shops were improperly wiped. That is, the researchers were able to access images or footage that were once saved to these electronic storage units… even if they were deleted.

     

    Free and Easy to Use Software

    Popular media would have you believe that extracting such information requires advanced degrees in computers as well as specialized knowledge and equipment. These would certainly help; however, the truth is that an elementary school student would be able to do the same. The researchers used "freely available software" (that is, programs downloadable from the internet) to "see if they could recover any data," and operating such software is a matter of pointing and clicking.
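    The article doesn't name the researchers' exact toolchain beyond "freely available software," but the underlying technique, usually called file carving, is simple enough to sketch: scan the raw bytes of the card for known file signatures and cut out whatever sits between them. A toy Python illustration (the markers are real JPEG magic numbers; the mock card image is invented for the demo):

```python
# JPEG payloads sit between an SOI marker (FF D8 FF) and an EOI marker (FF D9).
JPEG_SOI = b"\xff\xd8\xff"
JPEG_EOI = b"\xff\xd9"

def carve_jpegs(image: bytes) -> list:
    """Scan a raw card image for JPEG start/end markers and carve out the bytes."""
    found = []
    pos = 0
    while True:
        start = image.find(JPEG_SOI, pos)
        if start == -1:
            break
        end = image.find(JPEG_EOI, start + len(JPEG_SOI))
        if end == -1:
            break
        found.append(image[start:end + len(JPEG_EOI)])
        pos = end + len(JPEG_EOI)
    return found

# A mock card image: the file table entry is gone ("deleted"), but the photo's
# bytes are still sitting in unallocated space between runs of blank sectors.
card = b"\x00" * 512 + JPEG_SOI + b"photo-bytes" + JPEG_EOI + b"\x00" * 512
recovered = carve_jpegs(card)
print(len(recovered))  # 1 -- the "deleted" photo pops right out
```

    Real recovery tools handle fragmentation, corrupt markers, and dozens of file formats, but the point stands: deleting a file removes the filesystem's pointer to it, not the bytes themselves.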
    In this particular case, the recovered data included "intimate photos, selfies, passport copies, contact lists, navigation files, pornography, resumes, browsing history, identification numbers, and other personal documents." According to bleepingcomputer.com, of the one hundred memory cards collected:
    • 36 were not wiped at all; neither the original owner nor the seller took any steps to remove the data.
    • 29 appeared to have been formatted, but data could still be recovered "with minimal effort."
    • 2 had their data deleted, but it was easily recoverable.
    • 25 appeared to have been properly wiped using a data-erasing tool that overwrites the storage area, so nothing could be recovered.
    • 4 could not be accessed (read: were broken).
    • 4 had no data present, but the reason could not be determined.
     

    Deleting, Erasing, Wiping… Not the Same

    Thankfully, it appears that most people are not being blasé about their data. They do make an effort to delete the files before putting up their memory cards for sale. The problem is, deleting files doesn't actually delete files. (This terminology morass is the doing of computer software designers. Why label an action as "Delete file" when it doesn't actually do that?)

    The proper way to wipe data on any digital storage medium is to overwrite it. For example, if you have a hard drive filled with selfies, you can truly "delete" all of them by moving the selfies to the trash/recycle bin on your desktop and then saving to the disk as many cat pictures as you can find on the internet. This is analogous to painting over a canvas that already has a picture on it, although the analogy breaks down somewhat if one delves into technical minutiae.
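    As a minimal sketch of the overwrite-then-delete idea, here's a hypothetical Python helper (not any particular product's implementation). One caveat worth stating: on flash media like SD cards and SSDs, wear-leveling can redirect writes to fresh cells, so overwriting a single file isn't a guarantee; wiping the whole device, or using its built-in secure-erase command, is the safer route.

```python
import os
import secrets
import tempfile

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents in place with random bytes, then unlink it."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))
            f.flush()
            os.fsync(f.fileno())  # push the overwrite down to the device
    os.remove(path)

# Demo on a throwaway file standing in for a selfie on a memory card.
fd, tmp = tempfile.mkstemp()
os.close(fd)
with open(tmp, "wb") as f:
    f.write(b"embarrassing selfie bytes")

overwrite_and_delete(tmp)
print(os.path.exists(tmp))  # False -- and the original bytes were overwritten first
```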

    Incidentally, this is why encryption can be used to "wipe" your drive: encryption scrambles data so that, as stored, it is indistinguishable from random noise. Only when the encryption key is provided is the data descrambled into a form humans can read. So, if you end up selling a memory card with encrypted data but without the encryption key, it's tantamount to offering for sale a memory card that's been properly wiped.
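    To illustrate why losing the key amounts to a wipe, here's a toy sketch that uses a one-time-pad XOR in place of the AES-based ciphers real disk encryption products actually use; the principle, that ciphertext without a key is just noise, is the same:

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # One-time-pad XOR as a stand-in for real disk encryption (e.g. AES-XTS).
    return bytes(b ^ k for b, k in zip(data, key))

photo = b"embarrassing selfie bytes"
key = secrets.token_bytes(len(photo))  # the key never touches the card itself

on_card = xor_cipher(photo, key)       # what's physically stored on the card
recovered = xor_cipher(on_card, key)   # with the key: readable again

# "Crypto-erase": throw away the key, and on_card is permanently just noise.
# Selling the card without the key is then like selling a properly wiped card.
```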

     

    More of the Same

    This is not the first time an investigation has been conducted into data found on second-hand digital storage devices. As the bleepingcomputer.com article notes, similar research was conducted in the past:
    A study conducted in 2010 revealed that 50% of the second-hand mobile phones sold on eBay contained data from previous owners. A 2012 report from the UK's Information Commissioner's Office (ICO) revealed that one in ten second-hand hard drives still contained data from previous owners. A similar study from 2015 found that three-quarters of used hard drives contained data from previous owners.
    And these are but a small sample of the overall number of similar inquiries over the years. The world has seen more than its fair share of privacy snafus, be it a data breach or otherwise. Despite the increased awareness on data security and its importance, the fact that we're still treading water when it comes to securing data in our own devices could signify many things:
    • People don't really care, even if they say they do.
      • No surprises there.
    • We are too focused on spotlighting the problem while failing to highlight the solution.
      • News anchor: "Yadda yadda yadda…This is how they hacked your data. Be safe out there. And now, the weather." Be safe how? What do I do to be safe?
    • People interested in preserving their privacy do not sell their data storage devices; hence, studies like the above are statistically biased to begin with.
      • Essentially, researchers are testing the inclinations of people who don't really care about privacy or don't care enough to really look into it (a quick search on the internet will show you how to properly wipe your data).
    • Devices sold were stolen or lost to begin with, so the sellers do not have any incentive to properly wipe data.

    Whatever the reasons may be for the continued presence of personal data on memory storage devices, regardless of how much more aware we are of privacy issues, one thing's for certain: It's not going away.

     

    Related Articles and Sites:
    https://www.bleepingcomputer.com/news/security/two-thirds-of-second-hand-memory-cards-contain-data-from-previous-owners/

     
  • SCOTUS Says Cops Need Warrant For Location Data From Network Providers

    It's hardly a secret that the proliferation of digital devices has opened up opportunities and headaches for law enforcement. In the former camp, modern communication devices are de facto tracking devices that store and generate copious amounts of data; access to it could easily make or break a case. In the latter, the use of encryption and other security measures makes it challenging, if not impossible, to access that same data. And now the courts are making it even more onerous to obtain it.  

    Get a Warrant

    According to a recent ruling, law enforcement will "generally" require a warrant to obtain a person's location data from network providers. Before this ruling, the Third Party Doctrine held that a person gives up their expectation of privacy when they share information with a third party (like banks, ISPs, phone companies, etc.). Hence, law enforcement could request data from these third parties without a warrant; they only had to prove that the information they were seeking could be pertinent to an investigation. For example, police could ask a bank to see a suspect's credit card transactions, since those would pinpoint a person's location at a particular time. In fact, they can still do this going forward.
    However, the Supreme Court has decided otherwise when it comes to your location as pinpointed by your cellphone, more specifically, the cellphone location data created and collected by telcos. It is hardly a surprising judgment. For example, bugging a person's vehicle with a tracker requires a court order because continuous tracking is considered a violation of privacy expectations.
    Of course, there is a difference between bugging a car and using a cell phone: be the phone dumb or smart, it's not the government doing the bugging – you're bugging yourself and paying a company every month to do so. The government could argue (and probably has) that they're merely picking up a trail that you've agreed to broadcast. It would be no different from you tossing evidence left and right as you flee from a crime scene, creating a trail to yourself as you make your getaway. There's nothing illegal in law enforcement following that trail. Indeed, they'd be remiss in not doing so.
    The thing is, though, that life for many now revolves around access to the services that telcos offer. Well over half the population is using either a dumb or smart phone, and these devices need to know your location. Otherwise, you wouldn't be able to receive calls or texts. This is also the case for accessing mobile internet.
    Furthermore, these devices are very rarely turned off, for obvious reasons. So the data that's collected by telcos and shared with law enforcement would include information that traditionally requires a warrant anyway. The warrant requirement for bugging a vehicle was already mentioned. Even more sacrosanct is the privacy of a person in their own home, where law enforcement's incursion nearly always requires a warrant. Even pointing a thermal imaging device at a person's home without court approval is illegal, and that technically does not involve "entering" the home, though it does involve obtaining evidence from it.
    So it seems that an "information dump" of telco data covering an extensive period would already run afoul of such legal restrictions.
    However, the justices ruled, five to four, that a warrant is necessary when accessing telco location data because the data enables the very type of surveillance from which the Fourth Amendment protects US citizens. Remember, the Fourth exists because the British authorities would search for evidence of wrongdoing whenever, wherever, and of whomever they pleased.

    The Fourth Amendment

    The US Constitution provides protections from that sort of thing. Certainly, you could have committed a crime. And, certainly, evidence of said crime could be in your home. But, the law can only enter your home and search for that evidence, and only that evidence, if they have probable cause, which is the grounds for issuing a warrant.
    Consider, then, the aspects of the information that law enforcement claims it should be able to access without a warrant:
    • Location data is very accurate. Not as accurate as GPS but close enough – and the technology will only get better to the point that it will be just as good as GPS or better.
    • This data now covers a sizable number of the entire US population, seeing how Americans of all stripes and colors carry a cellphone.
    • The collected data is excessive. A person's location can be pinged by cell towers multiple times every minute. One can literally tell where a person is every minute of the day.
    • The data is retroactive. The location data is stored by telcos for up to five years. Change the law, and it could be ten years. Perhaps even longer if DNA storage finally happens. ('Cause, let's face it, the only reason why telcos don't want to keep this data long term is tied to storage costs).

    So, we're talking about data that's akin to what would be generated if you implanted a tracking chip on most Americans and let them go about their lives. And because the government didn't force anyone to do this, and third parties are involved, a warrant shouldn't be necessary when trying to get a hold of this data. This, in a nutshell, was their (very dystopian) argument.

    The courts (barely) disagreed.

    However, the ruling follows a number of decisions in recent years in which the courts have upheld privacy interests over law enforcement's. It seems that slowly, but surely, as the effects and impact of technology begin to leave an imprint upon everyone – as opposed to just the young, the hip, or the tech-savvy – people are beginning to have a better understanding of what's at stake.

     

    Related Articles and Sites:
    https://gizmodo.com/cops-need-a-warrant-to-collect-your-phone-location-data-1827050891

     
  • Yahoo Penalized £250,000 By UK Information Commissioner's Office

    It was reported this week that the United Kingdom's Information Commissioner – the person whose department is in charge of upholding the nation's data privacy laws – has penalized Yahoo! UK Services Limited with the amount of £250,000.
    The penalty is in response to the global data breach Yahoo experienced, and hid, for over two years. Approximately 500,000 accounts in the UK were affected.
    Knowing what we do of the Yahoo breach, and keeping in mind that the ICO can issue a monetary penalty of up to £500,000, it sounds like a woefully inadequate amount. For example, the US's SEC, the Securities and Exchange Commission, fined Yahoo $35 million, roughly one hundred times the ICO's penalty.

    Data Breach Not the Issue?

    According to cnet.com, Yahoo UK was not fined for the data breach. Apparently, what the ICO views as problematic is the long delay in notifying people of the data breach (two years!).
    Which is crazy if it's true.
    There was no "delay." Yahoo didn't merely fail to alert users of the data breach "in a timely manner." The company, for all intents and purposes, appears to have actively hidden the data breach, and not alerting affected users was a key component of that cover-up. That is the real scandal: data breaches involving hundreds of millions of people are not a rarity anymore, and neither is going public with the fact at the speed of molasses. To fine Yahoo UK merely for taking longer than usual to notify people of a data breach is bonkers.
    Thankfully, it seems that the ICO took more than the so-called delay into account:
    • Yahoo! UK Services Ltd failed to take appropriate technical and organisational (sic) measures to protect the data of 515,121 customers against exfiltration by unauthorized persons;
    • Yahoo! UK Services Ltd failed to take appropriate measures to ensure that its data processor – Yahoo! Inc – complied with the appropriate data protection standards;
    • Yahoo! UK Services Ltd failed to ensure appropriate monitoring was in place to protect the credentials of Yahoo! employees with access to Yahoo! customer data;
    • The inadequacies found had been in place for a long period of time without being discovered or addressed.
    Still, the explanation doesn't quite make sense. In the past, the ICO has issued penalties as high as £400,000 for data breaches, as well as for other violations of the Data Protection Act. Considering only the instances involving data breaches, none of the companies aside from Yahoo swept the incident under the rug. They were accused of being technically negligent (same as Yahoo); of having the financial, technical, and other means to ensure better data security (same as Yahoo); of not being aware that they were hacked when they could easily have figured it out (same as Yahoo); etc. In most cases, if not all, fewer people were affected than in the Yahoo breach.
    So why is Yahoo UK's penalty so much lower? Especially considering that the other companies do not have the dubious reputation of actively hiding the fact that they were hacked? If anything, you would think Yahoo UK's penalty would have hit a new high in the history of ICO monetary penalties to date.
     
    Related Articles and Sites:
    https://www.cnet.com/news/yahoo-fined-334000-in-the-uk-for-failing-to-disclose-2014-hack/
    https://ico.org.uk/about-the-ico/news-and-events/standing-up-for-the-data-rights-of-our-citizens/
     
  • FBI Inflated Encrypted Smartphone Count

    Over a number of years, the FBI kept making the case for an encryption backdoor to smartphones. Of course, because "encryption backdoor" is a charged term, they said that they didn't need a backdoor per se, just a (secret) reliable way to get into encrypted devices when they obtained a warrant.
    This twisting of words is risible because "a reliable way to get into encrypted devices" is pretty much the definition of a backdoor. Even the passwords set by smartphone owners are not reliable in the way the FBI wants them to be, since people are prone to forgetting passwords: What if you went on a digital detox for a month and actually did forget it? What if you changed it while drunk? What if you had a concussion? So, if you're looking for a method that will work 100% of the time, well… it's got to be a backdoor.
    As part of their case for not-backdoors, the FBI quoted the number of inaccessible devices that were at the center of unsolved crimes. In January 2018, the Director of the FBI, Christopher Wray, emphasized in a speech at a cybersecurity conference that nearly 7,800 devices could not be accessed in 2017.
    Last week, the Washington Post wrote that the figure was inflated, which the FBI confirmed. The actual number of inaccessible devices has not been released as of yet, but it's believed to be between 1,000 and 2,000, a range more in line with the 2016 figure: 880 encrypted devices.
    Why the sudden decrease? The FBI says they made an error when compiling their data, a result of keeping it in three separate databases instead of one central one.

    Credibility Issues

    The FBI has credibility issues. In areas other than encryption, it could be because they're victims of concerted political smear campaigns. Who knows, really. But when it comes to encryption, the Bureau keeps painting itself into a corner.
    This month, it was the revelation of overinflated figures.
    In 2016, the FBI took Apple to court, arguing that they had exhausted all avenues for accessing a terrorist's encrypted iPhone. Towards the end of the legal battle, most experts were leaning towards the opinion that the FBI would lose. Coincidentally, or not, the Bureau dropped their lawsuit at the eleventh hour, saying that they had found a third party that could crack open the phone's contents for them.
    Later that same year, the Office of the Inspector General reported that an internal miscommunication led the FBI to conclude that they had tried everything to crack the iPhone's encryption… but they hadn't. (So, technically somehow, the FBI wasn't lying when they said they had).
    And earlier this year, a second company announced discovering ways around iPhone encryption and began selling these techniques to law enforcement. At relatively affordable prices, one might add. So. Over the last couple of years, the FBI has essentially:
    • Misled the public and Congress, probably not on purpose;
    • Tried to force a company to redesign a key component of their profit driver under the auspices of national security, as if we were living in a Soviet-era communist nation, despite the fact that said company hadn't done anything illegal (because, otherwise, why'd they drop the case? They should have continued even if they eventually found a way into the iPhone);
    • Passive-aggressively insinuated that the entire tech community is a group that encourages and enables criminals, evidenced by its unwillingness (and not mathematical impossibility) to create an encryption backdoor that's not a backdoor, because, you know, that's not what the FBI wants. This, despite the NSA and the CIA issuing declarations that backdoors and other forms of intentionally crippling security are a bad idea.
    The above, of course, does not cover FBI scandals that are not tied to encryption. It's becoming very hard not to view the FBI's actions through a cynical lens.

    Future Tools

    One has to admit that the problem of "going dark" is real. While it's anyone's guess how big a problem it currently is, it undoubtedly will grow bigger as time goes by. A solution may present itself in quantum computers.
    IBM warned earlier this year that advances in quantum computing could mean that today's ubiquitous encryption can be easily broken in five years' time. Their cost could ensure that only governments and large organizations can afford them for the foreseeable future – just like only they can afford supercomputers – satisfying the goal of not hamstringing cryptography as well as only allowing "the good guys" to break encryption when needed (and authorized).  
     
    Related Articles and Sites:
    https://www.washingtonpost.com/news/monkey-cage/wp/2018/05/30/the-fbi-blunder-on-phone-encryption-explained/
    https://www.washingtonpost.com/world/national-security/fbi-repeatedly-overstated-encryption-threat-figures-to-congress-public/2018/05/22/5b68ae90-5dce-11e8-a4a4-c070ef53f315_story.html
    https://www.fbi.gov/news/speeches/raising-our-game-cyber-security-in-an-age-of-digital-transformation
    https://www.lawfareblog.com/fbi-director-christopher-wrays-remarks-encryption-international-conference-cyber-security
    https://www.wired.com/story/significant-fbi-error-reignites-data-encryption-debate/
     