(Editor’s note: This commentary was originally published at Law and Liberty.)
When you read that NSA’s capturing and sorting of most telephone and internet traffic is “America’s main remaining advantage over terror networks” (Wall Street Journal editorial, Dec. 17), or that “The effectiveness of data mining is proportionate to the size of the sample” (ibid., June 10), you should know that the writers are as ignorant of the technologies that make signals intelligence effective as they are careless of our liberties. Ignorantly, they have swallowed the propaganda of co-dependent bureaucrats at NSA and in industry. Unwittingly, they are shilling for corruption, paid for by intrusion and inefficiency.
In fact, NSA’s equation of communications intelligence with universal capturing-and-sorting, a practice that dates to WWII, has been yielding sharply diminishing returns for two generations and has been the subject of debate among the few persons cleared for the details. Having been one of these and having taken part in these internal debates, I am happy now to see these matters exposed to wider attention.
The Presidential Panel on NSA surveillance, by perhaps the most obscure of its recommendations (#20 out of 46), proposed developing “Software that would allow… intelligence agencies more easily to conduct targeted information acquisition rather than bulk-data collection.” That would mean equipping NSA’s universal vacuum-cleaners-of-the-electronic-spectrum with criteria to reject the vast bulk of data that no one envisages ever analyzing. The software would do a kind of pre-analysis.
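The pre-analysis the panel describes is, at bottom, an ordinary filtering step: test each record against explicit targeting criteria at the point of collection, and discard what no analyst would ever examine. A minimal sketch of the idea follows; the field names, selectors, and records here are hypothetical illustrations, not anything from NSA’s actual systems.

```python
# Sketch of pre-analysis filtering: records are tested against explicit
# targeting criteria at collection time, and non-matching traffic is
# discarded rather than stored. All names and values are hypothetical.

def make_filter(targeted_selectors):
    """Return a predicate that keeps only records matching a watched selector."""
    selectors = set(targeted_selectors)

    def keep(record):
        return record.get("selector") in selectors

    return keep

def collect(stream, keep):
    """Targeted collection stores only what passes the pre-analysis filter;
    bulk collection, by contrast, would store the whole stream."""
    return [r for r in stream if keep(r)]

# Illustrative traffic: only one of three records matches a watched selector.
traffic = [
    {"selector": "watched-001", "payload": "..."},
    {"selector": "unrelated-a", "payload": "..."},
    {"selector": "unrelated-b", "payload": "..."},
]
keep = make_filter(["watched-001"])
stored = collect(traffic, keep)
print(len(stored))  # 1 of 3 records retained
```

The point of the sketch is only that the rejection criteria run before storage, so the vast bulk of data the panel refers to never enters the archive at all.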
This recommendation just scratches the surface of a big debate. It starts from the fact that, even in the 1980s, 90 percent of NSA’s communications intelligence budget was devoted to unfocused, universal collection yet produced only 50 percent of COMINT reports, while the other half came from targeted collection. Simply put: while on-the-air traffic has exploded in volume, while its valuable nuggets have been shielded by unbreakable encryption, and while cheap cellphones have universalized access to “one-time pads” – all of which has reduced the value of universal collection – the technologies that broadly fall under the category of “bugging” have improved by leaps and bounds. Since the technical trends that produced this disparity have been accelerating, NSA has had a harder and harder time justifying its motto: “collect first, think about it later.”
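Taken at face value, the 1980s figures cited above imply a stark disparity in cost-effectiveness: bulk collection spent 90 percent of the budget for half the reports, while targeted collection spent the remaining 10 percent for the other half. The back-of-the-envelope arithmetic:

```python
# Arithmetic from the 1980s figures cited above: bulk collection consumed
# 90% of the COMINT budget for 50% of reports; targeted collection consumed
# the remaining 10% of the budget for the other 50% of reports.
bulk_budget, bulk_reports = 0.90, 0.50
targeted_budget, targeted_reports = 0.10, 0.50

bulk_yield = bulk_reports / bulk_budget          # reports per budget share
targeted_yield = targeted_reports / targeted_budget

print(round(targeted_yield / bulk_yield))  # prints 9
```

On those numbers, targeted collection was roughly nine times as productive per dollar as universal collection, even before the trends the author describes widened the gap.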
A noteworthy Wall Street Journal story reports that, in 1996, NSA reacted to pressures to shift its multibillion dollar COMINT budget by developing software to sift data within collection systems to avoid recording useless stuff, especially the mountains of American internet traffic that were swamping NSA’s storage capacity. While the advocates of this, called “Thin Thread,” knew that protecting the privacy of Americans would be a side benefit of their work, the efficiency of intelligence was the program’s primary objective.
However, by the time the software had been developed in 1999 – for all of $3 million – NSA’s leadership decided to throw it away. The decision seems to have proceeded from the straightforward logic of bureaucracy: this small step toward targeted collection could have been the first of many steps away from any number of multi-billion-dollar programs for bulk collection and storage. These are programs in which countless officials have built careers within the agency, programs that offer those officials the post-retirement jobs by which they cash in on their service, programs whose contractors are major contributors to members of the House and Senate Intelligence committees.
My own experience on the Senate Intelligence Committee staff is consistent with the program team’s contention that the program’s abandonment was due to the contractors’ lobbying. Again and again, senators and staff would agree to redirect programs away from established priorities, or to trim them on the basis of reasoning about what would produce better intelligence, only to be turned around by standard lobbying from teams of officials and contractors. The juggernaut continued to roll, directed by money and careers, careers and money.
The post-9/11 environment let this back-scratching combination cover itself with the claim that “more” collection and capacity to look into more nooks and crannies is a viable substitute for good judgment about where you should be looking – to put it bluntly, for explicit profiling.
That claim is the reverse of the truth. In fact, mere expansion of collection leads naturally to focusing on the data that is available most easily and plentifully. That happens to be data on ordinary Americans. That is why federal agencies scramble for access to NSA’s trove in order the better to enforce their burgeoning regulations. Security bureaucrats being as lazy as any other kind, they will find “suspects” where the finding is easy – by profiling of the implicit kind.
As our ruling class applies the term “terrorist” ever more promiscuously and conveniently to its own domestic competitors, it would be surprising were data gathered by mere inertia resulting from garden-variety corruption not used for the most nefarious of purposes.
Angelo M. Codevilla is professor emeritus of international relations at Boston University. He served as a U.S. Senate Staff member dealing with oversight of the intelligence services. His book Peace Among Ourselves and With All Nations is forthcoming from Hoover Institution Press.