AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.

July 2017 - Posts

  • Customs and Border Protection Admits They Cannot Search Remote Data

    Earlier this week, the US Customs and Border Protection (CBP) responded to Senator Ron Wyden's inquiries regarding electronic device searches at US borders (more specifically, airports). As numerous media outlets have relayed, CBP "admitted" that they do not have the authority to search data that is "solely" in the cloud – data that is not on the device itself but could easily be accessed via a smartphone.
    The implication, it appears, is that CBP does not want to risk accessing information that resides on servers located on US or foreign soil – that is, outside of their own jurisdiction – access to which could require a proper warrant.
    But aside from that, CBP reiterated that they have the right to conduct searches of data storage devices. The inclusion of the word "solely" in the response, experts surmise, means that emails, text messages, and other information that exists both in the cloud and on a device is fair game.
    In addition, CBP apparently admitted:
    that travelers can refuse to unlock their devices or hand over their passwords, but if they do so, CBP officials have the right to detain the device. [neowin.net]
     

    A Couple of Things of Note

    As interesting as the above may be, a look at the actual letter (PDF) reveals plenty of surprising things that weren't covered elsewhere.
    To begin with, it appears that CBP can search your belongings for absolutely no reason ("do not require a warrant or suspicion") – it wasn't "just a feeling" that they were doing it; it's actual policy. They do, however, limit when they'll search a device's contents based on geographic location. In a footnote, the following can be found:
    Border searches of electronic devices do not require a warrant or suspicion, except that following the decision in Cotterman v. United States, 709 F.3d 952 (9th Cir. 2013), certain searches undertaken in the Ninth Circuit require reasonable suspicion of activities in violation of a law enforced or administered by CBP.
    The implication here is that, somehow, entering the US via the west coast guarantees a little more rule of law than entering the US from elsewhere (the Ninth Circuit comprises Alaska, Arizona, California, Hawaii, Idaho, Montana, Nevada, Oregon, and Washington, as well as Guam and the Northern Mariana Islands).
    In addition, the letter pointed out that searches of devices are "exceedingly rare… less than one-hundredth of one percent of travelers arriving" to the US.
    This means that device searches occur for fewer than one in 10,000 travelers (which translates to 0.0001, or 0.01%); it also implies that the actual rate is somewhere close to this number. That does seem rare indeed. Except, let's put that in context, shall we?
    According to the US's own government data (PDF), 77.51 million international visitors traveled to the US in 2015. For Americans going abroad, it was 32.78 million; one assumes most of them will return. Applying that 1 in 10,000 figure above translates to approximately 11,000 devices searched. That might be small relative to the number of people entering the US, but it's a pretty big number in its own right. I mean, can you imagine 11,000 phones laid side by side? Where do they even store all this stuff?
    For an everyday comparison, take car crashes. According to this site, over 37,000 people die in road crashes in the US each year. There are about 323 million people in the US, which works out to roughly 1.15 in 10,000 people dying in car crashes every year. That rate is pretty close to the rate of devices searched by CBP.
    Now, ask yourself, does it feel to you as if car crash deaths are exceedingly rare in the US?  
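    For anyone who wants to double-check the back-of-envelope math above, here is a quick sketch using the figures cited in this post. The exact number of CBP searches isn't published, so treat the result as a rough approximation rather than an official statistic:

```python
# Back-of-envelope check of the figures cited above (all numbers approximate).
international_visitors = 77.51e6   # visitors to the US in 2015
us_travelers_returning = 32.78e6   # Americans traveling abroad, assumed to return
search_rate = 1 / 10_000           # "less than one-hundredth of one percent"

devices_searched = (international_visitors + us_travelers_returning) * search_rate
print(f"Approximate devices searched per year: {devices_searched:,.0f}")  # ~11,000

# Comparison: US road crash deaths per 10,000 people.
crash_deaths = 37_000
us_population = 323e6
crash_rate_per_10k = crash_deaths / us_population * 10_000
print(f"Road crash deaths per 10,000 people: {crash_rate_per_10k:.2f}")   # ~1.15
```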

    One Final Thing

    In one of his questions, the Senator asked whether (my emphasis) "CBP is required to first inform the traveler that he or she has the right to refuse to disclose a social media or email account password or device PIN or password"?
    The CBP's answer, while long, does not address the issue. It would appear that the answer is "no, there is no such requirement."
    Not sure why you'd perform verbal jujitsu instead of coming right out and saying it. It wouldn't be unexpected of people who can perform "border searches of electronic devices [that] do not require a warrant or suspicion."
     
    Related Articles and Sites:
    http://www.nbcnews.com/news/us-news/border-patrol-says-it-s-barred-searching-cloud-data-phones-n782416
    https://arstechnica.com/tech-policy/2017/07/us-border-agents-we-wont-search-data-located-solely-on-remote-servers/
    https://www.pogowasright.org/border-patrol-says-its-barred-from-searching-cloud-data-on-phones/
     
  • Australia Looking To Compel Electronic Message Decryption

    Last week, Reuters and other sources reported that the Australian government has proposed laws that would compel companies to provide access to encrypted information. Obviously, asking for such data is conditional upon taking all the proper legal steps.  

    A Growing Demand

    Governments the world over have been clamoring for access to encrypted data for years now, for decades if you want to be technical about it: ever since the "encryption wars" of the 1990s. But, demands have particularly intensified in recent years, usually under the banner of "we can't tell what terrorists are doing," an argument maintained by upstanding democracies as well as disreputable nation states.

    Some propose the inclusion of backdoors to encryption, an idea universally rejected by security professionals: in an era when even unwitting mistakes can be leveraged to break encryption, a purposefully built-in cryptographic defect will undoubtedly create bigger problems – especially if it's known, without a doubt, that there is one to be exploited.

    (On a societal level, backdoors present a different problem for governments supposedly accountable to the people they represent: government officials would be proposing to force companies to secretly build a way to spy on people. Even if the government were to follow all protocol when doing the spying, it would be hard to shake off comparisons to ***, fascists, Cold War-era Eastern Europe, North Korea, etc., because legal protocols were followed in those cases, too.)  

    Not a Back Door, But Security Suffers Nonetheless

    Australia's proposed legislation is heavily based on the United Kingdom's "Snooper's Charter." However, it differs in its decryption requirement: the UK's version does not have such a clause.

    Thankfully, back doors are not required to comply with such a law. For example, a copy of each message ever sent via an encrypted app could be kept stored somewhere. The encryption engine used in the apps could remain intact, ensuring that intercepted messages cannot be cracked. This means, however, that the app provider would have to act as a middleman, and that each message would be encrypted twice: once when being sent to the app provider and again when the latter sends the message on to the recipient.

    If the provider is subpoenaed, it's a matter of producing the decrypted message to the authorities. (One assumes that these messages will be stored in an encrypted state until necessary).
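    To make the relay arrangement described above concrete, here is a minimal sketch assuming a generic symmetric-encryption library (the Fernet recipe from Python's `cryptography` package is used purely for illustration; no messaging provider is known to work exactly this way):

```python
# Illustrative sketch only: a provider relays messages between two users,
# re-encrypting each hop and retaining a copy it can decrypt if compelled to.
from cryptography.fernet import Fernet

sender_provider_key = Fernet.generate_key()     # protects the sender -> provider hop
provider_recipient_key = Fernet.generate_key()  # protects the provider -> recipient hop

# Hop 1: the sender encrypts the message to the provider.
ciphertext_in = Fernet(sender_provider_key).encrypt(b"meet at noon")

# The provider decrypts, keeps a copy it can later produce, and re-encrypts
# for the recipient. Each hop stays strongly encrypted in transit.
plaintext = Fernet(sender_provider_key).decrypt(ciphertext_in)
retained_copy = Fernet(provider_recipient_key).encrypt(plaintext)  # stored until subpoenaed
ciphertext_out = Fernet(provider_recipient_key).encrypt(plaintext)

# Hop 2: the recipient decrypts.
assert Fernet(provider_recipient_key).decrypt(ciphertext_out) == b"meet at noon"
```

    The sketch makes the trade-off obvious: each hop remains encrypted against interception, but the provider sees – and retains – the plaintext of every message, which is exactly the liability described next.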

    This, however, introduces a second problem: knowing that perfect security is impossible, the provider is charged with the unachievable duty of protecting every single one of those messages from hackers, be they internal, external, or government-sanctioned (foreign or otherwise).

    Furthermore, it does nothing to address the original problem that started this entire "encrypt everything" mindset: the discovery that governments, Australia included, have been collecting, processing, mining, and generally spying on people. What's even more suspicious is that, when governments were challenged, or about to be challenged, about these practices, the programs were shut down.

    Sounds like something out of the Bourne Trilogy.  

    Short-lived Gains

    Looking at the long-term consequences, the law seems destined to be for naught. The argument in favor of the law is that governments are having a hard time finding, tracking, and neutralizing terrorist activities, and that it's not the government's intention to pry into private lives on a massive, automated scale.

    Fair enough.

    But, aren't terrorists using these encrypted apps because they're encrypted? It's not as if they chose a communication app for its choice of emoticons and stickers and, as a happy coincidence, it also features end-to-end encryption.

    Consequently, if governments the world over decide to enact laws similar to Australia's proposal, wouldn't it force terrorists to use something else? Perhaps create an encryption app of their own that can be side-loaded onto a smartphone? It wouldn't necessarily be easy, but with enough resources, it's not impossible. For example, Mexican drug cartels have been known to build their own cellular networks; creating a private communications app would be less daunting and cheaper… and its operation would be harder to detect.

    So, the end result would still be the same – governments having a hard time finding, tracking, and neutralizing terrorist activities – but with increased costs to providers and increased security risks for law-abiding people all around.

    This, however, does not seem to faze the Australian government. Assuming, of course, that they've even considered it in depth. Australian Prime Minister Turnbull had this to say regarding math and cryptography:

    "Well the laws of Australia prevail in Australia, I can assure you of that. The laws of mathematics are very commendable, but the only law that applies in Australia is the law of Australia," he said.
    This was later clarified:
    "I'm not a cryptographer, but what we are seeking to do is to secure [technology companies'] assistance. They have to face up to their responsibility. They can't just wash their hands of it and say it's got nothing to do with them."
    Of course, tech companies are not saying that it's got nothing to do with them. If anything, tech companies are essentially stating the obvious. By way of analogy, they're saying that they cannot create a gun that only shoots the bad guys and refuses to fire when trained on the good guys. Encryption is the same: the moment you start making compromises on how it works, it's bound to create other problems. With a gun, the fallout would be limited and minimal; with encryption, the effects could be global.  

    Two Moves Ahead?

    However, perhaps the end game of the legislation is forcing criminals off well-protected, well-financed apps and devices.

    Some of the brightest minds in the world are working on or contributing towards the security efforts of companies whose lifeblood is the internet. Criminals' technical knowhow and financial resources are microscopic in comparison to what these companies can marshal for data security. Piggybacking on their efforts, gratis to boot, is a clear win for the bad guys.

    Understandably, governments should be incensed. It would be preferable that those in the shadows spend their (comparatively paltry) resources on creating "top-notch" security – thus shifting funds away from, say, weapons – rather than get it for free.

    And, while creating a communications app might be relatively easy, creating a secure means of communication with perfect privacy is not. If the best minds in the world are not involved, there's a very good chance that easily exploitable flaws will be found… and that's familiar ground for intelligence organizations.

    So, even if criminals and terrorists do veer off towards uncompromised encrypted communications because of the law, the end result could very well not be the same.

    The only problem? You can't legislate math and science; in the long run, they win. The problems pointed out by mathematicians, scientists, cryptologists, data security professionals, and others will rear their ugly heads sooner or later.

    The question is whether it will have been worth it.

     

    Related Articles and Sites:
    http://appleinsider.com/articles/17/07/14/proposed-australian-law-forces-tech-companies-to-decrypt-customer-messages
    http://www.reuters.com/article/us-australia-security-messages-idUSKBN19Z08S

     
  • Schools In EU Could Face Heavy Fines For Data Breaches

    Beginning in May 2018, schools in EU member countries (including the UK, despite Brexit) must comply with the new General Data Protection Regulation (GDPR). Not doing so means they could be subject to fines of up to 4% of their turnover, a figure that created quite the buzz when it was announced for businesses earlier this year (some news sites debated the implications for notorious privacy stragglers like Facebook and Google).  

    Walk, Don't Run

    Poring over data breaches of the past ten years, it can readily be surmised that the education sector is not very good at protecting itself from data security incidents, be they hacking intrusions or lost data storage devices. It must be pointed out, however, that this is usually because schools lack the resources to do so: schools trying to stretch their shrinking budgets cannot afford the latest and greatest in technology (leading to banks of old computers running long-abandoned software that is virtually impossible to secure), much less a proper IT security staff.
    In the short term, it looks like the GDPR could cause more problems than it solves, especially because many schools are unaware of their responsibilities. In the UK, the Information Commissioner's Office (ICO) has provided some tips on what must be done – but, as is usually the case, this is not to be taken as a checklist where you can cross off items and declare yourself compliant.
    There is much to do, according to an expert interviewed by schoolsweek.co.uk:
    • Replace out-of-date IT equipment and ensure warranties exist for current equipment.
    • Designate a data protection officer.
    • Ensure a formal contract exists with data processors, which need to meet industry standards.
    • Document where data goes and how it is used.
    There are other experts, however, who advise not to rush into things, as there are still issues that need to be resolved. On the other hand, they, too, advise schools to start on "preparatory tasks."  

    Other Side of the Pond

    Meanwhile, in the US, a school district in Maryland has opted to stop collecting Social Security numbers for students:
    Director of Technology Infrastructure Edward Gardner, who oversaw the development of the new data policy, said the school system would not collect student Social Security numbers "unless explicitly necessary," and he could not think of a reason it would be.

    This may very well be the best approach to data security: if you don't need it, don't collect it. Far too many organizations take the approach of "collect it first and deal with it later." The problem is, of course, that it never gets dealt with at all. At least, not in the sense of securing the data – be it encrypting, scrubbing, deleting, etc.

    Unsurprisingly, a data breach ensues somewhere.

    Related Articles and Sites:
    https://schoolsweek.co.uk/schools-face-hefty-fines-for-data-breaches-under-new-eu-laws/
    https://www.databreaches.net/school-district-in-maryland-stops-collection-of-social-security-numbers/
    http://www.govtech.com/security/School-District-in-Maryland-Stops-Collection-of-Social-Security-Numbers.html