It's an interesting read, mostly about the government surveillance of him and other journalists. He speaks about an NSA program called FIRSTFRUITS that specifically spies on US journalists. (This isn't news; we learned about this in 2006. But there are lots of new details.)
One paragraph in the excerpt struck me:
Years later Richard Ledgett, who oversaw the NSA's media-leaks task force and went on to become the agency's deputy director, told me matter-of-factly to assume that my defenses had been breached. "My take is, whatever you guys had was pretty immediately in the hands of any foreign intelligence service that wanted it," he said, "whether it was Russians, Chinese, French, the Israelis, the Brits. Between you, Poitras, and Greenwald, pretty sure you guys can't stand up to a full-fledged nation-state attempt to exploit your IT. To include not just remote stuff, but hands-on, sneak-into-your-house-at-night kind of stuff. That's my guess."
I remember thinking the same thing. It was the summer of 2013, and I was visiting Glenn Greenwald in Rio de Janeiro. This was just after Greenwald's partner was detained in the UK trying to ferry some documents from Laura Poitras in Berlin back to Greenwald. It was an opsec disaster; they would have been much more secure if they'd emailed the encrypted files. In fact, I told them to do that, every single day. I wanted them to send encrypted random junk back and forth constantly, to hide when they were actually sharing real data.
As soon as I saw their house I realized exactly what Ledgett said. I remember standing outside the house, looking into the dense forest for TEMPEST receivers. I didn't see any, which only told me they were well hidden. I guessed that black-bag teams from various countries had already been all over the house when they were out for dinner, and wondered what would have happened if teams from different countries bumped into each other. I assumed that all the countries Ledgett listed above -- plus the US and a few more -- had a full take of what Snowden gave the journalists. These journalists against those governments just wasn't a fair fight.
I'm looking forward to reading Gellman's book. I'm kind of surprised no one sent me an advance copy.
The NSA just published a survey of video conferencing apps. So did Mozilla.
Zoom is on the good list, with some caveats. The company has done a lot of work addressing previous security concerns. It still has a bit to go on end-to-end encryption. Matthew Green looked at this. Zoom does offer end-to-end encryption if 1) everyone is using a Zoom app, and not logging in to the meeting using a webpage, and 2) the meeting is not being recorded in the cloud. That's pretty good, but the real worry is where the encryption keys are generated and stored. According to Citizen Lab, the company generates them.
The Zoom transport protocol adds Zoom's own encryption scheme to RTP in an unusual way. By default, all participants' audio and video in a Zoom meeting appears to be encrypted and decrypted with a single AES-128 key shared amongst the participants. The AES key appears to be generated and distributed to the meeting's participants by Zoom servers. Zoom's encryption and decryption use AES in ECB mode, which is well-understood to be a bad idea, because this mode of encryption preserves patterns in the input.
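The pattern-preservation problem Citizen Lab describes is easy to demonstrate. The sketch below uses a keyed hash as a toy stand-in for a 16-byte block cipher (it is not Zoom's actual cipher, and a hash is not a real block cipher since it isn't decryptable); the point is only that in ECB mode, identical plaintext blocks always produce identical ciphertext blocks, so structure in the input survives encryption.

```python
import hashlib

def toy_block_encrypt(key: bytes, block: bytes) -> bytes:
    # Stand-in for a 16-byte block cipher: a keyed hash truncated to
    # the block size. Deterministic per (key, block), just like AES-ECB.
    return hashlib.sha256(key + block).digest()[:16]

def ecb_encrypt(key: bytes, plaintext: bytes) -> bytes:
    # ECB: each block is encrypted independently, with no chaining or
    # per-block nonce, so repeated plaintext blocks repeat in the output.
    assert len(plaintext) % 16 == 0
    return b"".join(
        toy_block_encrypt(key, plaintext[i:i + 16])
        for i in range(0, len(plaintext), 16)
    )

key = b"sixteen byte key"
# Two identical plaintext blocks followed by a different one.
pt = b"A" * 16 + b"A" * 16 + b"B" * 16
ct = ecb_encrypt(key, pt)
print(ct[0:16] == ct[16:32])   # True: the repetition leaks through
print(ct[0:16] == ct[32:48])   # False
```

This is exactly why ECB-encrypted images still show the outline of the original picture: an eavesdropper who can't read a single byte of plaintext can still see where the plaintext repeats.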
AES 256-bit GCM encryption: Zoom is upgrading to the AES 256-bit GCM encryption standard, which offers increased protection of your meeting data in transit and resistance against tampering. This provides confidentiality and integrity assurances on your Zoom Meeting, Zoom Video Webinar, and Zoom Phone data. Zoom 5.0, which is slated for release within the week, supports GCM encryption, and this standard will take effect once all accounts are enabled with GCM. System-wide account enablement will take place on May 30.
There is nothing in Zoom's latest announcement about key management. So: while the company has done a really good job improving the security and privacy of their platform, there seems to be just one step remaining to fully encrypt the sessions.
The other thing I want Zoom to do is make the security options necessary to prevent Zoombombing available to users of the free version of the platform. Forcing users to pay for security isn't a viable option right now.
Finally -- I use Zoom all the time. I finished my Harvard class using Zoom; it's the university standard. I am having Inrupt company meetings on Zoom. I am having professional and personal conferences on Zoom. It's what everyone has, and the features are really good.
The New York Times is reporting on the NSA's phone metadata program, which the NSA shut down last year:
A National Security Agency system that analyzed logs of Americans' domestic phone calls and text messages cost $100 million from 2015 to 2019, but yielded only a single significant investigation, according to a newly declassified study.
Moreover, only twice during that four-year period did the program generate unique information that the F.B.I. did not already possess, said the study, which was produced by the Privacy and Civil Liberties Oversight Board and briefed to Congress on Tuesday.
The privacy board, working with the intelligence community, got several additional salient facts declassified as part of the rollout of its report. Among them, it officially disclosed that the system has gained access to Americans' cellphone records, not just logs of landline phone calls.
It also disclosed that in the four years the Freedom Act system was operational, the National Security Agency produced 15 intelligence reports derived from it. The other 13, however, contained information the F.B.I. had already collected through other means, like ordinary subpoenas to telephone companies.
The report cited two investigations in which the National Security Agency produced reports derived from the program: its analysis of the Pulse nightclub mass shooting in Orlando, Fla., in June 2016 and of the November 2016 attack at Ohio State University by a man who drove his car into people and slashed at them with a machete. But it did not say whether the investigations into either of those attacks were connected to the two intelligence reports that provided unique information not already in the possession of the F.B.I.
This program is legal due to the USA FREEDOM Act, which expires on March 15. Congress is currently debating whether to extend the authority, even though the NSA says it's not using it now.
Mr. Sanborn has set up systems to allow people to check their proposed solutions without having to contact him directly. The most recent incarnation is an email-based process with a fee of $50 to submit a potential solution. He receives regular inquiries, so far none of them successful.
The ongoing process is exhausting, he said, adding "It's not something I thought I would be doing 30 years on."
I have a related personal story. Back in 1993, during the first Crypto Wars, I and a handful of other academic cryptographers visited the NSA for some meeting or another. These sorts of security awareness posters were everywhere, but there was one I especially liked -- and I asked for a copy. I have no idea who, but someone at the NSA mailed it to me. It's currently framed and on my wall.
I'll bet that the NSA didn't get permission from Jay Ward Productions.
Google reportedly has a database called Sensorvault in which it stores location data for millions of devices going back almost a decade.
The article is about geofence warrants, where the police go to companies like Google and ask for information about every device in a particular geographic area at a particular time. In 2013, we learned from Edward Snowden that the NSA does this worldwide. Its program is called CO-TRAVELLER. The NSA claims it stopped doing that in 2014 -- probably just stopped doing it in the US -- but why should it bother when the government can just get the data from Google?
Yesterday's Microsoft Windows patches included a fix for a critical vulnerability in the system's crypto library.
A spoofing vulnerability exists in the way Windows CryptoAPI (Crypt32.dll) validates Elliptic Curve Cryptography (ECC) certificates.
An attacker could exploit the vulnerability by using a spoofed code-signing certificate to sign a malicious executable, making it appear the file was from a trusted, legitimate source. The user would have no way of knowing the file was malicious, because the digital signature would appear to be from a trusted provider.
A successful exploit could also allow the attacker to conduct man-in-the-middle attacks and decrypt confidential information on user connections to the affected software.
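The publicly reported root cause of this bug (CVE-2020-0601) is that Crypt32.dll failed to fully validate attacker-supplied ECC parameters, including the curve's base point, before matching a certificate against a trusted key. The sketch below is an illustration of that class of flaw, not the actual Windows code: a multiplicative group mod p stands in for elliptic-curve scalar multiplication, and the group, keys, and function names are all invented for the example.

```python
# Toy group: integers mod a prime, with pow() standing in for
# elliptic-curve scalar multiplication (Q = d * G).
P = 2**127 - 1

def public_key(generator: int, private_key: int) -> int:
    return pow(generator, private_key, P)

# The legitimate CA's key pair, using the standard generator.
STANDARD_G = 5
ca_private = 0x1234567890ABCDEF
ca_public = public_key(STANDARD_G, ca_private)

def broken_verify(claimed_g, claimed_d, trusted_public):
    # Flawed check: trusts whatever generator the certificate supplies,
    # only verifying that the supplied key pair matches the trusted key.
    return public_key(claimed_g, claimed_d) == trusted_public

# Attacker: pick generator G' = Q and private key d' = 1, so that
# d' * G' == Q -- without ever knowing the CA's real private key.
forged_g, forged_d = ca_public, 1
print(broken_verify(forged_g, forged_d, ca_public))   # True: spoofed

def fixed_verify(claimed_g, claimed_d, trusted_public):
    # Fix: reject non-standard parameters before comparing keys.
    if claimed_g != STANDARD_G:
        return False
    return broken_verify(claimed_g, claimed_d, trusted_public)

print(fixed_verify(forged_g, forged_d, ca_public))    # False: rejected
```

An attacker who can pass that broken check holds a "private key" matching a trusted root, and can then sign anything -- code, TLS certificates -- that the system will accept as legitimate.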
That's really bad, and you should all patch your system right now, before you finish reading this blog post.
This is a zero-day vulnerability, meaning that it was not detected in the wild before the patch was released. It was discovered by security researchers. Interestingly, it was discovered by NSA security researchers, and the NSA security advisory gives a lot more information about it than the Microsoft advisory does.
Exploitation of the vulnerability allows attackers to defeat trusted network connections and deliver executable code while appearing as legitimately trusted entities. Examples where validation of trust may be impacted include:
Signed files and emails
Signed executable code launched as user-mode processes
The vulnerability places Windows endpoints at risk to a broad range of exploitation vectors. NSA assesses the vulnerability to be severe and that sophisticated cyber actors will understand the underlying flaw very quickly and, if exploited, would render the previously mentioned platforms as fundamentally vulnerable. The consequences of not patching the vulnerability are severe and widespread. Remote exploitation tools will likely be made quickly and widely available. Rapid adoption of the patch is the only known mitigation at this time and should be the primary focus for all network owners.
Early yesterday morning, NSA's Cybersecurity Directorate head Anne Neuberger hosted a media call where she talked about the vulnerability and -- to my shock -- took questions from the attendees. According to her, the NSA discovered this vulnerability as part of its security research. (If it found it in some other nation's cyberweapons stash -- my personal favorite theory -- she declined to say.) She did not answer when asked how long ago the NSA discovered the vulnerability. She said that this is not the first time the NSA sent Microsoft a vulnerability to fix, but it was the first time it has publicly taken credit for the discovery. The reason is that the NSA is trying to rebuild trust with the security community, and this disclosure is a result of its new initiative to share findings more quickly and more often.
Barring any other information, I would take the NSA at its word here. So, good for it.
And -- seriously -- patch your systems now: Windows 10 and Windows Server 2016/2019. Assume that this vulnerability has already been weaponized, probably by criminals and certainly by major governments. Even assume that the NSA is using this vulnerability -- why wouldn't it?
Transport Layer Security Inspection (TLSI), also known as TLS break and inspect, is a security process that allows enterprises to decrypt traffic, inspect the decrypted content for threats, and then re-encrypt the traffic before it enters or leaves the network. Introducing this capability into an enterprise enhances visibility within boundary security products, but introduces new risks. These risks, while not inconsequential, do have mitigations.
The primary risk involved with TLSI's embedded CA is the potential abuse of the CA to issue unauthorized certificates trusted by the TLS clients. Abuse of a trusted CA can allow an adversary to sign malicious code to bypass host IDS/IPSs or to deploy malicious services that impersonate legitimate enterprise services to the hosts.
A further risk of introducing TLSI is that an adversary can focus their exploitation efforts on a single device where potential traffic of interest is decrypted, rather than try to exploit each location where the data is stored. Setting a policy to enforce that traffic is decrypted and inspected only as authorized, and ensuring that decrypted traffic is contained in an out-of-band, isolated segment of the network prevents unauthorized access to the decrypted traffic.
To minimize the risks described above, breaking and inspecting TLS traffic should only be conducted once within the enterprise network. Redundant TLSI, wherein a client-server traffic flow is decrypted, inspected, and re-encrypted by one forward proxy and is then forwarded to a second forward proxy for more of the same, should not be performed. Inspecting multiple times can greatly complicate diagnosing network issues with TLS traffic. Also, multi-inspection further obscures certificates when trying to ascertain whether a server should be trusted. In this case, the "outermost" proxy makes the decisions on what server certificates or CAs should be trusted and is the only location where certificate pinning can be performed. Finally, a single TLSI implementation is sufficient for detecting encrypted traffic threats; additional TLSI will have access to the same traffic. If the first TLSI implementation detected a threat, killed the session, and dropped the traffic, then additional TLSI implementations would be rendered useless since they would not even receive the dropped traffic for further inspection. Redundant TLSI increases the risk surface, provides additional opportunities for adversaries to gain unauthorized access to decrypted traffic, and offers no additional benefits.
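The NSA's point about certificate pinning can be made concrete. Pinning means the client stores a hash of the certificate (or its public key) it expects a server to present; because a TLSI proxy re-signs connections with its own CA, everything behind it sees the proxy's certificate instead of the server's, and the pin check fails. The sketch below is illustrative only: the byte strings stand in for DER-encoded certificates, and the names are invented for the example.

```python
import hashlib

def pin(cert_bytes: bytes) -> str:
    # A pin is a hash of the certificate bytes the client expects.
    # (Real pinning schemes typically hash the SubjectPublicKeyInfo.)
    return hashlib.sha256(cert_bytes).hexdigest()

# The real server's certificate, as seen by the outermost proxy.
server_cert = b"CN=example.com, key=real-server-public-key"
expected_pin = pin(server_cert)

# A TLSI proxy terminates the connection and re-signs it with its own
# CA, so hosts behind it see a different certificate for the same name.
proxy_cert = b"CN=example.com, key=tlsi-proxy-public-key"

print(pin(server_cert) == expected_pin)  # True: outermost hop can pin
print(pin(proxy_cert) == expected_pin)   # False: inner hops see only
                                         # the proxy's cert, so any pin
                                         # check there must fail
```

This is why only the outermost proxy can meaningfully validate or pin server certificates: every hop behind it is, by design, seeing a forged certificate.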
Nothing surprising or novel. No operational information about who might be implementing these attacks. No classified information revealed.
In an extraordinary essay, the former FBI general counsel Jim Baker makes the case for strong encryption over government-mandated backdoors:
In the face of congressional inaction, and in light of the magnitude of the threat, it is time for governmental authorities -- including law enforcement -- to embrace encryption because it is one of the few mechanisms that the United States and its allies can use to more effectively protect themselves from existential cybersecurity threats, particularly from China. This is true even though encryption will impose costs on society, especially victims of other types of crime.
I am unaware of a technical solution that will effectively and simultaneously reconcile all of the societal interests at stake in the encryption debate, such as public safety, cybersecurity and privacy as well as simultaneously fostering innovation and the economic competitiveness of American companies in a global marketplace.
All public safety officials should think of protecting the cybersecurity of the United States as an essential part of their core mission to protect the American people and uphold the Constitution. And they should be doing so even if there will be real and painful costs associated with such a cybersecurity-forward orientation. The stakes are too high and our current cybersecurity situation too grave to adopt a different approach.
Basically, he argues that the security value of strong encryption greatly outweighs the security value of encryption that can be bypassed. He endorses a "defense dominant" strategy for Internet security.
Glenn Gerstell, the General Counsel of the NSA, wrote a long and interesting op-ed for the New York Times where he outlined a long list of cyber risks facing the US.
There are four key implications of this revolution that policymakers in the national security sector will need to address:
The first is that the unprecedented scale and pace of technological change will outstrip our ability to effectively adapt to it. Second, we will be in a world of ceaseless and pervasive cyberinsecurity and cyberconflict against nation-states, businesses and individuals. Third, the flood of data about human and machine activity will put such extraordinary economic and political power in the hands of the private sector that it will transform the fundamental relationship, at least in the Western world, between government and the private sector. Finally, and perhaps most ominously, the digital revolution has the potential for a pernicious effect on the very legitimacy and thus stability of our governmental and societal structures.
He then goes on to explain these four implications. It's all interesting, and it's the sort of stuff you don't generally hear from the NSA. He talks about technological changes causing social changes, and the need for people who understand that. (Hooray for public-interest technologists.) He talks about national security infrastructure in private hands, at least in the US. He talks about a massive geopolitical restructuring -- a fundamental change in the relationship between private tech corporations and government. He talks about recalibrating the Fourth Amendment (of course).
The essay is more about the problems than the solutions, but there is a bit at the end:
The first imperative is that our national security agencies must quickly accept this forthcoming reality and embrace the need for significant changes to address these challenges. This will have to be done in short order, since the digital revolution's pace will soon outstrip our ability to deal with it, and it will have to be done at a time when our national security agencies are confronted with complex new geopolitical threats.
Much of what needs to be done is easy to see -- developing the requisite new technologies and attracting and retaining the expertise needed for that forthcoming reality. What is difficult is executing the solution to those challenges, most notably including whether our nation has the resources and political will to effect that solution. The roughly $60 billion our nation spends annually on the intelligence community might have to be significantly increased during a time of intense competition over the federal budget. Even if the amount is indeed so increased, spending additional vast sums to meet the challenges in an effective way will be a daunting undertaking. Fortunately, the same digital revolution that presents these novel challenges also sometimes provides the new tools (A.I., for example) to deal with them.
The second imperative is we must adapt to the unavoidable conclusion that the fundamental relationship between government and the private sector will be greatly altered. The national security agencies must have a vital role in reshaping that balance if they are to succeed in their mission to protect our democracy and keep our citizens safe. While there will be good reasons to increase the resources devoted to the intelligence community, other factors will suggest that an increasing portion of the mission should be handled by the private sector. In short, addressing the challenges will not necessarily mean that the national security sector will become massively large, with the associated risks of inefficiency, insufficient coordination and excessively intrusive surveillance and data retention.
A smarter approach would be to recognize that as the capabilities of the private sector increase, the scope of activities of the national security agencies could become significantly more focused, undertaking only those activities in which government either has a recognized advantage or must be the only actor. A greater burden would then be borne by the private sector.
It's an extraordinary essay, less for its contents and more for the speaker. This is not the sort of thing the NSA publishes. The NSA doesn't opine on broad technological trends and their social implications. It doesn't publicly try to predict the future. It doesn't philosophize for 6000 unclassified words. And, given how hard it would be to get something like this approved for public release, I am left to wonder what the purpose of the essay is. Is the NSA trying to lay the groundwork for some policy initiative? Some legislation? A budget request? What?
He argues that the piece "is not in the spirit of forecasting doom, but rather to sound an alarm." Translated: Congress, wake up. Pay attention. We've seen the future and it is a sweaty, pulsing cyber night terror. So please give us money (the word "money" doesn't appear in the text, but the word "resources" appears eight times and "investment" shows up 11 times).
Susan Landau has a more considered response, which is well worth reading. She calls the essay a proposal for a moonshot (which is another way of saying "they want money"). And she has some important pushbacks on the specifics.
I don't expect the general counsel and I will agree on what the answers to these questions should be. But I strongly concur on the importance of the questions and that the United States does not have time to waste in responding to them. And I thank him for raising these issues in so public a way.