AlertBoot Endpoint Security

AlertBoot offers a cloud-based full disk encryption and mobile device security service for companies of any size who want a scalable and easy-to-deploy solution. Centrally managed through a web based console, AlertBoot offers mobile device management, mobile antivirus, remote wipe & lock, device auditing, USB drive and hard disk encryption managed services.
  • Uber Being Investigated For 2016 Data Breach

    Uber, the ride-sharing Silicon Valley unicorn, is… still in the news. They say that all publicity is good publicity – even the bad kind – but Uber seems determined to test the limits of that saying.

    This week, it was revealed that the company had been hiding a massive data breach that occurred over a year ago. The breach involved the personal information – names, email addresses, and phone numbers – of 57 million customers worldwide. In addition, drivers' names and license numbers were illegally accessed (7 million drivers in total; 600,000 in the US alone). According to bloomberg.com,

    Here’s how the hack went down: Two attackers accessed a private GitHub coding site used by Uber software engineers and then used login credentials they obtained there to access data stored on an Amazon Web Services account that handled computing tasks for the company. From there, the hackers discovered an archive of rider and driver information. Later, they emailed Uber asking for money, according to the company.
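    The lesson buried in that quote is an old one: credentials committed to a code repository are credentials waiting to be harvested. As a purely illustrative sketch – the directory name below is a hypothetical placeholder, and the "AKIA" prefix is simply the documented format of AWS access key IDs, not a claim about what Uber's keys looked like – here is the kind of scan, in Python, that a defender might run over a repository checkout before anyone else does:

        # Illustrative sketch: scan a local repository checkout for strings that
        # look like AWS access key IDs. The path below is a made-up example.
        import os
        import re

        AWS_KEY_PATTERN = re.compile(r"\bAKIA[0-9A-Z]{16}\b")

        def find_suspect_keys(repo_dir):
            """Yield (path, line_number, match) for anything resembling an AWS key ID."""
            for root, _dirs, files in os.walk(repo_dir):
                for name in files:
                    path = os.path.join(root, name)
                    try:
                        with open(path, "r", errors="ignore") as handle:
                            for lineno, line in enumerate(handle, start=1):
                                for match in AWS_KEY_PATTERN.findall(line):
                                    yield path, lineno, match
                    except OSError:
                        continue  # unreadable file; skip it

        if __name__ == "__main__":
            for path, lineno, key in find_suspect_keys("./my-private-repo"):
                print(f"{path}:{lineno}: possible AWS access key ID {key}")

    Finding a key this way is, of course, only half the battle; the other half is revoking it and moving secrets out of the repository altogether.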
    Unsurprisingly, many states – including Illinois, Massachusetts, Missouri, New York, Connecticut, and Washington – have announced an investigation into the matter. Data security regulators in other countries have done the same.  

    A Checkered Past

    It was just this past August that Uber agreed to a settlement with the FTC, closing a probe into how Uber misled customers regarding its privacy practices: the company allowed employees to access riders' personal information, including the details of trips, via a tool called "God View." The problem was described by some as a "lapse" in the ride-hailing company's security practices.

    In addition, the company had to deal with a data breach (smaller than the one being discussed here). The FTC looked into the issue and concluded, per recode.net:

    For years, Uber stressed it had taken great steps to protect its driver and rider data — all stored using Amazon’s cloud service. Until 2015, however, some of that information was saved as "clear, readable text, including in database back-ups and database prune files, rather than encrypting the information," the FTC said.

    The end result? Uber agreed to 20 years of oversight, the implementation of a comprehensive privacy policy, etc. The usual stuff that Big Tech companies agree to. An "onerous" slap on the wrist.

    (However, as recode.net points out, the settlement hasn't been finalized. The FTC must vote on it, and some lawmakers had urged the FTC to increase the penalties, perhaps even to open a new investigation based on what the probe had revealed. This was before the latest revelation.)

    Hackers Bad. Lawyers Even Worse?

    When Bloomberg broke the news about Uber's latest transgression, two people were fired, including Uber's Chief Security Officer, Joe Sullivan. When approached by the hackers, Sullivan and Craig Clark, a lawyer with the company, made the decision to pay the attackers $100,000 to delete the data and to stay quiet about the incident.

    While none of that is illegal – paying off the hackers, asking them to be quiet, the hackers actually keeping quiet, and the hackers deleting the data they had acquired – what Uber did afterwards is.

    The US has 48 separate data breach notification laws. Most of them are similar: for example, most have a specific definition of what "private data" is and is not, and generally require a notification to be sent within 60 calendar days of discovering the breach. Most also provide safe harbor from notifying clients after a data breach if the data was encrypted.

    Unfortunately, not all states offer the same protections, meaning that if your business is big enough, you're going to have to come clean anyhow: while people may be willing to believe that a data breach at a Brooklyn mom-and-pop store affected New York residents only, it would be very unusual for an Uber hack to affect only New York residents. So it makes no sense to announce a data breach in New York alone (assuming New York does not provide an encryption safe harbor), because people excel at putting two and two together.

    In addition, the European Union has very extensive privacy safeguards in place, and data breach notifications, at least to regulators, are de rigueur. So, again, if your business is big enough that it crosses your home country's national borders, you're going to have to fess up. Because people also excel at putting deux and deux together.

    When Sullivan and Clark decided to conceal what had happened, they essentially broke that same law over and over again, once per jurisdiction. The fact that lawyers decided to take this approach (Sullivan, the unseated CSO, was a federal prosecutor earlier in his career) is surprising à la Schrödinger's cat – that is, knowing what we do about Uber, surprising and unsurprising at the same time.

    Things appear to be changing now that someone new is at the helm; otherwise, we may never have learned of the breach. And yet it feels as if the corporate miasma will take a while to disperse (from thenewstribune.com):

    In a letter to Washington Attorney General Bob Ferguson's office last week, an Uber attorney wrote that the company "now thinks it was wrong not to provide notice to affected users at the time" [of the 2016 Uber data breach].

    Really? Now they think it was wrong?

     

    Related Articles and Sites:
    https://www.bloomberg.com/news/articles/2017-11-21/uber-concealed-cyberattack-that-exposed-57-million-people-s-data
    http://searchsecurity.techtarget.com/podcast/Risk-Repeat-Uber-data-breach-has-implications-for-infosec
    http://www.thenewstribune.com/news/local/article187221548.html
    https://www.recode.net/2017/11/22/16690556/uber-data-hack-57-million-state-investigation

     
  • Smartphone Encryption: FBI and Apple At It Again?

    Following the worst mass shooting in Texas history, the Federal Bureau of Investigation announced in a press conference that it is unable to get into the shooter's smartphone. The reason? Encryption.
    While the brand of the smartphone was not officially revealed at the time (so as not to alert the "baddies" as to which one is giving the FBI difficulties), Gizmodo and others have reported that it's an iPhone. Of course, this is not the first time that the FBI and Apple have crossed paths.

    There's a History There

    Last year, the FBI and Apple went head-to-head in court: the FBI was looking to compel Apple to write a backdoor to its encryption (the FBI denied that characterization; the end result, however, would have been the same). A magistrate ordered Apple to create a way to hack into the San Bernardino shooter's iPhone 5c. Apple refused. The dispute headed for a higher court.
    Things were building to a crescendo in court when the FBI suddenly announced it didn't need Apple's help after all: it had acquired software that could hack into an iPhone 5c (but not newer models). Some critics at the time accused the FBI of backing out not because it had found another way to get at the encrypted data, but because it looked like the case would set a precedent against the FBI's interests.
    Today, we see a situation that is very similar: a mass shooting, a smartphone that's encrypted, the FBI unable to access it. But if you're looking for a repeat of last year's court drama, it probably won't happen.
    With the San Bernardino case, the FBI argued that they needed to access the device to see if the shooter was linked to other terrorists; if memory serves, the FBI concluded before going to court that this was not the case. Regardless, a legal decision was sought (and dropped, as mentioned earlier).
    In the more recent Texas shooting, we know it's not an act of terrorism – at least not the kind the US regularly rallies against. Indeed, if one follows the news, it looks like the FBI is building its case of what happened quite readily, without needing to access the encrypted smartphone.
    The usual argument that "there could be an additional threat out there, ready to pounce soon" cannot be made. It would fall on deaf ears, so there really isn't much impetus for the FBI to make a scene like it did last year. But complain about encryption? The Bureau does that every chance it gets, so it's not surprising that it's bringing the subject up again.

    Cat and Mouse Games

    What is surprising, if only barely, is that the FBI still appears to be playing games designed to sway public opinion. According to various media outlets, Apple reached out to the FBI to offer assistance – which feels ironic, the two having gone to court over that very issue – but the FBI never acknowledged it. While some insinuated that Apple did this before the FBI complained about the encrypted phone in its press conference, it was later clarified that the Cupertino-based tech giant reached out afterwards, once it realized that an iPhone was involved.
    Regardless, the offer was for naught.
    Apparently, the FBI did not reach out to Apple at all. By all accounts, the Bureau never completely stopped seeking Apple's help after the duo's legal showdown last year, which makes it quite surprising that the government did not seek that help in one of the year's most high-profile cases.
    Did the FBI do this because they thought that Apple wouldn't help? Or couldn't help? Or because they forgot about it? Or was it a measured tactic that they're using to carve more notches in their "encryption is aiding criminals" pole?
    The answer, short of another legal loggerheads extravaganza, will depend on how much Konspiracy Kool-Aid you're willing to drink.
    As usual, some degree of sympathy goes out to the FBI and other law-enforcement agencies. Nobody denies that encryption can and will hamstring investigations. However, the position the FBI has taken up (and it is the FBI in particular; you don't hear a peep from the NSA or the CIA regarding encryption) is highly questionable.
    If the past couple of years have shown anything, it's that ordinary citizens need more data security, not less.
     
    Related Articles and Sites:
    https://gizmodo.com/the-fbi-is-seeking-access-to-the-sutherland-springs-sho-1820273857
    https://www.cnet.com/news/apple-vs-fbi-one-year-later-still-stuck-in-limbo/
    https://www.theverge.com/2017/11/8/16626452/apple-fbi-texas-shooter-iphone-unlock-encryption-debate
     
  • Hilton To Pay $700,000 Over 2015 Data Breach, Slow Notifications

    The New York attorney general has announced a $700,000 settlement with Hilton Worldwide Holdings over issues related to two data breaches that occurred in 2014 and 2015. $400,000 will go to New York; the remainder goes to Vermont, which collaborated in the investigation.

    Breaches Reported Late, in November 2015

    Multinational corporations being hacked is old news. It has happened to Yahoo, Target, Merck, Equifax, etc. – the list is endless and varied. No industry is exempt, and no company is safe from the internet renegades willing to compromise a network for financial reward, to make a political statement… or just because they're bored and they can.
    When a company is fined hundreds of thousands of dollars by the government for a data security breach in this day and age, it means the victimized company must have grievously erred somehow. In Hilton's case, it was apparently employing lax security practices and was slow with its data breach notifications.
    The famed hospitality company became aware of the first data breach in February 2015 (the actual hack occurred sometime between November and December 2014). Another breach was discovered in July 2015, with the intrusion occurring between April and July of that year. Notifications were not sent out until late November 2015. If your yardstick starts from the second breach, that's roughly four months after discovery; if you're measuring from the first, it's nine months.
    Which one to use? Common sense would dictate that it's the first. Especially considering that, while many states' data breach notification laws require notification no later than 60 calendar days after discovery, not all states do. New York, in fact, only states that:
    The disclosure must be made in the most expedient time possible and without unreasonable delay…
    One could perhaps quibble over what counts as "expedient," but nine months?
    In addition, it turned out that Hilton was not compliant with PCI-DSS requirements, a set of security rules meant to minimize the incidence of credit card data theft.

    Have You Seen HLT's 10-K?

    Seven hundred thousand dollars is a big chunk of money. However, it's meaningless to a company like Hilton: the holding company had revenues of over $11.6 billion in 2016, with net income of $348 million. That makes $700K a cost of doing business, and a small one at that.
    Look at it this way: in Hilton's case, over 360,000 credit cards were put at risk. That works out to a fine of nearly $2 per compromised card. The profit margin on minibar peanuts is probably higher. I imagine management is more concerned about the cost of towels and robes that go missing each year.
    So the AG's proclamation that data breaches take top priority can feel a little anticlimactic given the figures involved. But it's not his fault: he doesn't make the law; he merely does what he can with the legal tools he's given. People have been calling for greater punitive damages against companies that appear less than concerned when their security is compromised (companies which, in turn, have been whining since the early 2000s that they're victims, too. Let's put it this way: it's hard to sympathize with a drunk driver who ran over the neighbor's dog but asks for pity because his car was totaled and his ribs are broken).
    Case in point regarding the legal branch having its hands tied: despite the disaster that is Equifax, the US Congress has voted this week to make it harder for people to sue it.  
     
    Related Articles and Sites:
    https://www.engadget.com/2017/10/31/hilton-data-breaches-700-000-penalty/
    https://ag.ny.gov/press-release/ag-schneiderman-announces-700000-joint-settlement-hilton-after-data-breach-exposed
    http://codes.findlaw.com/ny/general-business-law/gbs-sect-899-aa.html
    https://finance.yahoo.com/quote/HLT/financials?p=HLT
    https://techcrunch.com/2017/10/24/congress-votes-to-disallow-consumers-from-suing-equifax-and-other-companies-with-arbitration-agreements/
     
  • FBI Unable to Access 7000 Encrypted Devices in 2017

    At the International Association of Chiefs of Police conference, held in Philadelphia last week, Federal Bureau of Investigation Director Christopher Wray noted that the FBI has nearly 7,000 encrypted devices it cannot access. Per phillyvoice.com:
    In the first 11 months of the fiscal year [2017], federal agents were unable to access the content of more than 6,900 mobile devices, Wray said in a speech….
    Considering what Wray's predecessor had to say about the issue in 2016, the problem is growing, fast:
    [Former FBI Director James Comey] said, during the last three months of 2016 the FBI lab received 2,800 electronic devices sent in by local police and federal agents looking for evidence they contain. But analysts were unable to open 1,200 of them, "using any technique."
    Assuming that the influx of inaccessible encrypted devices to the FBI's labs remained relatively constant throughout 2016, that works out to roughly 4,800 inaccessible devices for the year (1,200 per quarter). Against nearly 7,000 in just eleven months of fiscal 2017, that is roughly a 50% increase year over year.

    A Growing Problem

    One can expect the number of inaccessible smartphones to keep growing for a number of reasons.
    First, older devices eventually get replaced with new ones. That in and of itself doesn't mean anything security-wise, except that encryption was not turned on by default on many older devices; even when it was, a password may not have been required.
    Smartphones and tablets now come with encryption turned on by default and require some form of passcode; one can assume that nearly 100% of the phones the FBI needs to search in the future will be inaccessible.
    Second, encryption tends to get stronger over time because researchers are constantly looking for flaws in it; when flaws are found, they're patched. Cracking techniques that worked in the past may not work on newer devices.
    When the FBI filed and then dropped its case against Apple in 2016, the Bureau revealed that it had obtained a method to gain access to an iPhone 5c (it didn't reveal what the method was) and thus didn't need to force Apple's hand through the courts. It also noted that the method didn't work on iPhones newer than the 5c, so that's as far as that technique will go. Seeing how OS updates for the iPhone 5c ended this past summer, the FBI's mysterious technique will see limited action in the future.
    This tends to be the general pattern for security flaws (assuming, of course, that you have bright people working on the problem; sometimes flaws go undetected for years, possibly decades). Still, the overall trajectory of encryption points in one direction.
    Third, more people are aware of the power and need for encryption. When the FBI butted heads with Apple (and, indirectly, with the entire tech community) in 2016, many in Congress initially supported the FBI. Calls for encryption backdoors, explicit or otherwise, were in the air. As time went by and these representatives educated themselves on the pros and cons of purposefully hamstringing cryptography, they started backtracking.
    But, it's not just Congress. Ironically, the Apple vs. FBI case caused ripples and worked to educate a lot of people about encryption and its benefits, detriments, and importance. With more people aware of what encryption does and how it works, you can expect encryption to extend to even those devices that don't come with it by default.  

    How to Solve It?

    So, yeah, encryption is problematic for the FBI. And, it will continue to be problematic. Hence, it's not surprising to find that,
    The Justice Department under President Donald Trump has suggested it will be aggressive in seeking access to encrypted information from technology companies. But in a recent speech, Deputy Attorney General Rod Rosenstein stopped short of saying exactly what action it might take. [apnews.com]
    Honestly, short of a backdoor, there isn't a solution here, and a backdoor is not a solution. Still, seeing how strange 2017 has been (and will probably be for the next three years, at least), it wouldn't be surprising if the FBI finally got what they wished for. No matter how ill-advised it might be.
     
    Related Articles and Sites:
    http://www.phillyvoice.com/fbi-couldnt-access-nearly-7k-devices-because-of-en/
    https://gizmodo.com/the-fbi-cant-stop-fearmongering-about-encryption-1819772851
    https://www.nbcnews.com/news/us-news/comey-fbi-couldn-t-access-hundreds-devices-because-encryption-n730646
    https://apnews.com/04791dfbe30a4d3596e8d187b16d837e
     
  • 47.5 GB of PHI Left Exposed on the Cloud. (That's 316,000 PDFs)

    According to gizmodo.com, security researchers at Kromtech Security Center found a wide-open Amazon Web Services (AWS) bucket containing over 300,000 PDFs, each one a medical file falling under the governance of the Health Insurance Portability and Accountability Act (HIPAA) – the law that, arguably, finally jumpstarted the drive toward encrypting sensitive digital files, thanks to the generous fines levied on hospitals and other covered entities that botched their data security.
    There have been (too) many similar cases over the years, although we're beginning to see a transition of sorts: in the past, incorrectly configured servers were at the center of such "accidental" data breaches (that is, the blame lay not with hackers but with what a company's IT staff decided to do… or not do); today's incidents increasingly involve incorrectly configured cloud services, be it AWS, Microsoft's Azure, Dropbox, or others.
    Technically, they're the same problem – misconfigured settings on boxes connected to the internet – but the former was more complex than what one deals with today: nowadays, you click a checkbox in a web form, hit the save button, and companies like Amazon take care of the rest.
    (Although, to play Devil's Advocate, it should be pointed out that AWS also supports programmatic read-write permissions, which are similar in spirit to – though nowhere near as involved as – the server configurations of yore.)
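    To make the "checkbox" point concrete, here is a minimal sketch (assuming boto3 and AWS credentials are already set up; the bucket name is a hypothetical placeholder, not the one Kromtech found) of how one might audit a bucket's access control list for grants to the public "AllUsers" and "AuthenticatedUsers" groups – the kind of setting that turns an S3 bucket into an open directory:

        # Minimal sketch: list ACL grants that expose an S3 bucket to everyone
        # ("AllUsers") or to anyone with an AWS account ("AuthenticatedUsers").
        import boto3

        PUBLIC_GROUPS = {
            "http://acs.amazonaws.com/groups/global/AllUsers",
            "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
        }

        def public_grants(bucket_name):
            """Return the ACL grants on a bucket that apply to public groups."""
            s3 = boto3.client("s3")
            acl = s3.get_bucket_acl(Bucket=bucket_name)
            return [
                grant for grant in acl["Grants"]
                if grant["Grantee"].get("Type") == "Group"
                and grant["Grantee"].get("URI") in PUBLIC_GROUPS
            ]

        if __name__ == "__main__":
            for grant in public_grants("example-records-bucket"):
                print(f"Public grant: {grant['Permission']} to {grant['Grantee']['URI']}")

    A scheduled check along these lines is cheap insurance against the wrong checkbox being clicked.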

    Quick Remediation

    When Kromtech alerted the healthcare company to the error, the situation was corrected the very same day. However, the company appears to have remained incommunicado in response to the security firm's follow-ups. Not exactly the height of gratitude but, hey, at least it doesn't look like they're suing Kromtech "for hacking" them, so that's a plus. The downside: the PDFs contained,
    In addition to names, addresses, and other contact information, many of the records contained dates of birth, diagnoses, as well as the names of physicians overseeing care of the patients…
    No SSNs or credit card details. However, with information like the above, obtaining such data is potentially just a phone call away. In a world where millions get scammed into paying for computer tech support they don't need, how hard would it be to socially engineer sensitive data by posing as hospital staff who know real details about someone's recent medical history?
    The answer is "not very hard."  

    Prevention

    One easy way to lower the odds of suffering a similar data breach is to encrypt files before uploading them to the cloud. That was true back when companies set up their own internet-facing databases, and it is still true with cloud services. Granted, AWS's security options are more than adequate, at least when it comes to conforming to data security requirements and regulations across the US.
    But that protection exists only within the confines of the cloud service (and assumes no one screws things up by unchecking the wrong box). If the cloud is being used as a document repository, those files will come back down from the cloud at some point – which seems pretty likely for PDFs. Will they be downloaded to a laptop or a desktop? Backed up to tape? Copied to a USB drive? Emailed as an attachment?
    In each of those cases, encrypting the file is basically the only way to secure the data. And if the files are going to be uploaded to and downloaded from the cloud anyway, why not encrypt them before doing anything at all? The risk of something going awry may be small, but the ramifications are huge if and when something does go wrong.
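    As a rough illustration of the idea – a sketch, not a prescription: the bucket name and file paths are hypothetical, and a real deployment would keep the key in a proper key-management system rather than generating it inline – here is what "encrypt before upload" can look like in a few lines of Python, using the cryptography package and boto3:

        # Minimal sketch of client-side encryption before upload: only ciphertext
        # ever leaves the machine. Key handling here is deliberately simplified.
        import boto3
        from cryptography.fernet import Fernet

        def encrypt_and_upload(local_path, bucket, object_key, fernet_key):
            """Encrypt a file locally, then upload only the ciphertext to S3."""
            fernet = Fernet(fernet_key)
            with open(local_path, "rb") as handle:
                ciphertext = fernet.encrypt(handle.read())
            boto3.client("s3").put_object(Bucket=bucket, Key=object_key, Body=ciphertext)

        if __name__ == "__main__":
            key = Fernet.generate_key()  # losing this key means losing the data
            encrypt_and_upload("patient-record.pdf", "example-bucket",
                               "records/patient-record.pdf", key)

    Even if the bucket's permissions are later botched, what leaks is ciphertext – which is the whole point.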
     
    Related Articles and Sites:
    https://gizmodo.com/data-breach-exposed-medical-records-including-blood-te-1819322884
     
  • Equifax Data Breach Continues To Bear Poisoned Fruit

    About two weeks ago, when Equifax first revealed its massive data breach, many noted that the company appeared neither prepared nor equipped to deal with the demands of whatever contingency plans it had drawn up for the day it would be hacked. That was on the first day after Equifax went public with the news.
    In the two weeks since, those observations have proven to be more than prescient. Because so much has happened, I present you with a list. As of September 19, 2017, the following are true:
    • The price of Equifax's stock has plunged 35% in response to the data breach and all the other news following it.
    • A couple of Equifax honchos "retired" after the breach was made public, including the Chief Security Officer (CSO).
    • It turns out that Equifax's CSO has a bachelor's and master's degree in music.
      • It should be noted, however, that she has worked in security-related positions at other big companies.
      • Plus, plenty of programmers (security or otherwise) are music majors, philosophy majors, art majors… you get the idea. (On the other hand, the ex-CSO does not appear to have been a programmer, as far as one can tell.)
    • More than 30 lawsuits have been filed.
    • The Federal Trade Commission announced an investigation into the data breach.
    • The US DOJ started criminal investigations to see if the three executives who recently sold nearly $2 million in stock violated federal law.
    • Security researchers found that Equifax's Argentinian branch had an employee portal that used "admin" as both the username and the password.
    • Equifax initially blamed a vulnerability in Apache Struts software for the hack; the Apache team immediately responded, pointing out that a security patch had been available since March.
    • Speaking of March, it turns out that there was an earlier data breach at Equifax in that same month.
      • While currently being treated as a separate incident, it could well have been the initial point of ingress into Equifax, well before the data breach, discovered in July, that was initially announced.
    • Equifax revealed that up to 400,000 in England had been affected by the breach.
      • As well as 10,000 in Canada.
      • And let's not forget the 143 million in the USA.
    • The site Equifax set up to reveal whether a person was affected by the data breach gave inaccurate answers.
      • That site was set up outside of the main Equifax.com domain. As certain security researchers noted, this made for easy phishing. One proved it by setting up a fake look-alike site, which ended up being linked to on Twitter by whoever was managing Equifax's Twitter account.
    • Equifax tried to charge consumers for freezing their credit reports – and then announced that they wouldn't.

    Some of the reactions to the data breach are at once expected and surprising – like the lawsuits. They were expected, but thirty of them filed in less than a week? Wow.

    Other outcomes, such as charging people for freezing their credit reports, are mind-blowing. It's like no one thought to consult the PR department because… at this point, what's the use?

    The stock market seems to think that the other shoe has already dropped. At the beginning of this week, Equifax's stock price stopped falling and ever so slowly began to rise, although some say it's nothing but a dead cat bounce – either because the market hasn't effectively priced everything in or because there's more bad news on the horizon.

    Based on the last couple of weeks, it wouldn't be foolhardy to wait and see what other surprises spring up.  

    Related Articles and Sites:
    https://www.databreaches.net/equifax-data-breach-aftermath-lawsuits-and-criticism-mount-stock-prices-plummet/

     