Last week, the world saw another bombshell announcement from WikiLeaks. Per their tweets and the resulting confidential data dump, it was readily apparent that the CIA had amassed techniques for breaking into nearly every kind of digital device imaginable: smartphones and computers, yes, but also other things connected to the internet, like smart TVs (perhaps they've looked into hacking internet-connected refrigerators as well).
But, unlike the initial announcements that were passed around like crazy, it looks like the CIA does not have easy access to encrypted data. If anything, one could say that a large reason Langley has so many hacking tools is that it needs to get around encryption somehow. Since encryption is nearly impossible to break – it is possible, apparently, but only under a number of conditions, including physical access to the device – it is easier to hit cryptography where it is weakest: before the data is encrypted (or after it has been decrypted).
There is an inherent "weakness" in cryptography: encrypted data cannot be read by human beings (or machines, for that matter) in encrypted form. The information has to be decrypted at some point if it's to be useful to someone; that is, so it can be read, copied and pasted, processed, etc.
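The point is mechanical rather than mysterious. Here is a minimal sketch of that pipeline – using a toy XOR stream cipher purely for illustration, not real cryptography – showing that the plaintext necessarily exists in memory both before encryption and after decryption, which are exactly the two moments an endpoint compromise targets:

```python
# Toy illustration only: a hypothetical XOR "cipher" standing in for real
# encryption, to show WHERE plaintext is exposed in the pipeline.
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte against the repeating key; applying the same function
    # twice restores the original, so it both encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
plaintext = b"meet at noon"          # exposed BEFORE encryption

ciphertext = xor_cipher(plaintext, key)
assert ciphertext != plaintext       # opaque in transit and at rest

recovered = xor_cipher(ciphertext, key)
assert recovered == plaintext        # exposed again AFTER decryption
```

An attacker who owns the endpoint simply reads `plaintext` or `recovered` straight from memory and never touches the ciphertext at all.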
And that's where the CIA is targeting its efforts: since the encrypted data has to be decrypted at some point, the agency reads it then – but no sooner. Mind you, other intelligence agencies around the world are probably doing this as well; it can't only be the guys in Langley, especially when you consider that the CIA is one of the best-financed and best-equipped SIGINT bodies in the world, and even it has trouble breaking encryption.
This is good news and bad news. It's good news because you know that encryption works. Your encrypted laptop was stolen at the airport, and the device contained sensitive information? You can rest easy, knowing the odds of a data breach are vanishingly small.
At the same time, it's bad news because, for the CIA to do their job, they can't reveal the software weaknesses they're exploiting; doing so would lead to companies patching up these problems.
Considering that exploiting these weaknesses is technically the easier way to spy on someone's communications, logic dictates that others must be taking advantage of this weakness as well; people tend to go for the low-hanging fruit.
This, in an indirect manner, makes the CIA complicit in weakening security for all Americans, because, undoubtedly, the same weaknesses they're hiding from the public are being used by others to spy on Americans – more specifically, Americans in power. (Sure, officials at the highest echelons of government are supposed to have their devices vetted – but the past couple of years have shown this is not how things actually work in real life, a fact that came to a head during the most recent US election.)
The silver lining in all of this may be that the government is driving the final nail into the coffin of a contentious issue: encryption backdoors.
The CIA's actions prove what academics have argued for nearly three decades, if not more: a security weakness is an invitation to be exploited. The hardware and software industry did not tell the CIA about the security defects in its products; rather, the agency just knew there must be some (because there's always a weakness somewhere) and found them on its own.
The above is your proverbial search for a needle in a haystack (or needles in haystacks), except that you don't know whether the needles exist at all. You assume that they do and go from there. If, after some time, you don't find any, you move on to the next haystack.
Now imagine what would happen if the US government had succeeded in requiring, by law, a backdoor in encryption. You would know that backdoor is in there somewhere. The US, if you will, required that a needle be placed in the haystack. Finding it is only a matter of time, and time spent looking for it is never time wasted.
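To see why a guaranteed needle changes the attacker's calculus, consider this toy sketch of a hypothetical key-escrow scheme (with an XOR operation standing in for real encryption – none of these names come from any actual protocol): every message carries its session key wrapped under a single master key, so whoever recovers that one key reads all traffic, past and future:

```python
# Toy sketch of a hypothetical key-escrow "backdoor" scheme; XOR stands in
# for real encryption purely for illustration.
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

MASTER_KEY = os.urandom(16)  # the mandated backdoor, shared by every device

def escrow_encrypt(plaintext: bytes) -> tuple[bytes, bytes]:
    # Fresh session key per message, but a copy of it is wrapped under the
    # master key and shipped alongside the ciphertext (the escrow field).
    session_key = os.urandom(16)
    wrapped = xor_bytes(session_key, MASTER_KEY)
    return wrapped, xor_bytes(plaintext, session_key)

def backdoor_decrypt(wrapped: bytes, ciphertext: bytes) -> bytes:
    # No access to the sender is needed: the master key alone unwraps the
    # session key – for this message and every other one ever sent.
    session_key = xor_bytes(wrapped, MASTER_KEY)
    return xor_bytes(ciphertext, session_key)

w1, c1 = escrow_encrypt(b"message one")
w2, c2 = escrow_encrypt(b"message two")
assert backdoor_decrypt(w1, c1) == b"message one"
assert backdoor_decrypt(w2, c2) == b"message two"
```

One leaked `MASTER_KEY` – stolen, subpoenaed, or found the way the CIA finds everything else – and every conversation under the scheme falls at once, which is precisely the academics' objection.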
Thankfully, the nonsense regarding backdoors was quickly laid to rest. Now, if only the CIA would agree that keeping known vulnerabilities secret is a bad idea – just as the government eventually conceded on encryption backdoors.