With all the discussion about the NSA’s digital surveillance activities, the “Big Brother” analogy is becoming as trite as it is inaccurate. Information illegally disclosed by NSA systems analyst Edward Snowden made the public keenly aware that the NSA gathers troves of metadata and digital content about foreign nationals and, occasionally, U.S. citizens. This kind of surveillance has led many to liken the NSA’s activities to the dictatorial menace in George Orwell’s 1984. Yet the comparison is faulty.
Orwell’s Big Brother was a surveillance state that sought to control every aspect of citizen life. The NSA and numerous other federal agencies involved in domestic and international surveillance are focused on security and detecting terrorist and violent threats to the United States and its interests. Setting aside the fear-mongering label of Big Brother, the government’s intelligence gathering activities merit a thoughtful, grounded public discussion. In this article series, we look at who is spying on whom, why they’re doing it, and what that means for a democratic society that values liberty and privacy.
A Widely Shared Secret
Everything Internet users do online leaves a trail, somewhere. When e-mails, Internet searches, video chats and other communications are sent, they pass through the servers of Internet goliaths like Google, Facebook, Apple, Microsoft and others. All of these companies use sophisticated encryption that prevents most individuals and organizations from seeing what is traversing the web. The NSA, however, need not worry about encryption, because the world’s Internet companies hand over decrypted data. This is PRISM, the NSA’s overarching program for collecting enormous amounts of data from Internet companies. It is, to use the NSA’s term, Special Source Operations – the “special sources” being the Internet companies themselves.
The authority for PRISM is granted by the Foreign Intelligence Surveillance Act (FISA), Sec. 702, which gives the agency the power to collect digital data as a means to inform foreign intelligence operations. Before Snowden released classified documents, PRISM was a closely guarded secret, in part because the NSA recognized the reputational damage that could be done to its “special sources” if the general public became aware of their activities. An NSA PowerPoint document on PRISM (released by Snowden) notes: “98 percent of PRISM production is based on Yahoo, Google and Microsoft; we need to make sure we don’t harm these sources.”
The Internet companies are not alone in giving data to the NSA. A range of other NSA programs work with telecommunications companies to gather similar kinds of phone data. For example, one program – called Blarney – collects information from AT&T. Other programs carry equally colorful and mysterious code names, like Stormbrew, Oakstar and Lithium. (More on the NSA’s phone call data collection will come in a later installment of this series.)
Acquiring and then filtering vast amounts of data takes a lot of sophisticated technology, which is supplied by the U.S. private sector. The Wall Street Journal reported that the system technology is made by Narus (a subsidiary of Boeing Co.), Cisco Systems Inc. and Juniper Networks Inc. David Rieff humorously wrote in Foreign Policy that “the ‘secret’ [Snowden] revealed appeared to be one of the most broadly shared secrets in the world.” People in the White House, the U.S. Congress, U.S. allies, and numerous U.S. businesses were all aware of at least a portion of the NSA’s programs. Those who were not in on the secret were the billions of Internet users the program targets.
Given all the organizations complicit in this intelligence gathering, it is remarkable that much of the public outrage has been directed specifically at the NSA and not at the Internet companies and technology manufacturers, which are critical to the NSA’s Digital Network Intelligence work. Yet, the Internet companies are not necessarily eager to hand over their data. They simply have no choice, as they have been ordered to do so by the now infamous Foreign Intelligence Surveillance Court.
The NSA’s Signals Intelligence Directorate is charged with digging through all this information, both for its own intelligence work and sometimes at the behest of other intelligence organizations, like the CIA. PRISM does not, however, give all intelligence organizations carte blanche to search through collected data. Instead, the data is inspected by analysts in Fort Meade, Md. (Well, mostly by analysts in Fort Meade. Sometimes it is looked at by off-site systems analysts, like Edward Snowden.) Still, NSA analysts are not reading e-mails to old friends or tuning in to watch a Skype chat about what honest, tax-paying Americans are making for dinner. They are largely looking at metadata, not content.
Metadata is the information surrounding an individual’s online activity. On the Internet, it includes details like the time of a search or message, the user’s IP address and location, the recipient of a message and other particulars. What metadata does not include is the specific content of these online activities, such as the words in an e-mail or a recording of a video conversation. While the NSA does have access to content, the bulk of its searching and sifting deals with metadata.
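To make the distinction concrete, here is a minimal, illustrative sketch of metadata versus content for a single e-mail. The field names are hypothetical examples, not drawn from any actual NSA schema or message format:

```python
# Illustrative only: field names are hypothetical, not an actual NSA schema.

# Metadata: the "envelope" of an online activity -- who, when, where, to whom.
email_metadata = {
    "timestamp": "2013-06-05T14:32:00Z",  # when the message was sent
    "sender_ip": "203.0.113.42",          # originating IP (documentation range)
    "sender": "alice@example.com",
    "recipient": "bob@example.com",
    "approx_location": "Baltimore, MD",   # coarse location inferred from the IP
    "size_bytes": 18432,
}

# Content: the substance of the communication, which metadata excludes.
email_content = {
    "subject": "Dinner on Friday?",
    "body": "Thinking of making lasagna this week...",
}

def is_metadata_only(record):
    """Return True if a record carries none of the content fields."""
    content_fields = {"subject", "body", "attachments"}
    return content_fields.isdisjoint(record)

print(is_metadata_only(email_metadata))  # True
print(is_metadata_only(email_content))   # False
```

Even without reading a word of the message body, fields like these reveal who talked to whom, when, and from where, which is why metadata alone is so useful for intelligence analysis.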
Needles and Haystacks
It is difficult to wrap one’s mind around just how much data is pouring into the NSA’s programs. One of Snowden’s classified documents says that at some sites, the NSA is receiving more than 20 terabytes of data every day. That is the equivalent of about 10 billion single-spaced printed pages; if put in a single stack, it would be more than a half-mile high. It is twice the amount of printed material held at the Library of Congress. And that is only a portion of the agency’s overall data collection efforts. While acquiring and storing so much data necessitates a secret court and shadowy agreements with Internet companies, actually putting this much intelligence to use is a challenge unto itself.
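The page-count comparison can be sanity-checked with back-of-the-envelope arithmetic. Assuming roughly 2,000 bytes of plain text per single-spaced printed page (an assumption for illustration, not a figure from the leaked documents):

```python
# Rough check of the "about 10 billion pages" comparison.
# Assumption: one single-spaced page holds about 2,000 bytes of plain text.

BYTES_PER_PAGE = 2_000            # assumed plain-text capacity of one page
DAILY_INTAKE_BYTES = 20 * 10**12  # 20 terabytes per day, per the leaked document

pages = DAILY_INTAKE_BYTES / BYTES_PER_PAGE
print(f"{pages:.1e} pages per day")  # on the order of 10^10, i.e. ~10 billion
```

Under that assumption, a single day’s intake at one site works out to about 10 billion pages, consistent with the comparison above.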
After the Berlin Wall fell, western intelligence organizations and East Germans flooded into the Stasi’s headquarters and found rooms packed with all the reports, photographs and data collected through the East German security service’s impressive intelligence gathering apparatus. There was so much information, in fact, that the Stasi were simply overwhelmed and had no hope of synthesizing and using all the data they collected.
While the parallels between the brutal Stasi and the NSA are mostly limited to the scope of their respective intelligence gathering operations, the challenge of how to use so much information is clear. In today’s digital age, however, the NSA uses sophisticated algorithms and search programs to dig through the billions of data points collected. One such program is the mysteriously named XKeyscore.