Sunday, November 26, 2017

PART 9 OF 9: THE SHADOW FACTORY: THE ULTRA-SECRET NSA FROM 9/11 TO THE EAVESDROPPING ON AMERICA

THE SHADOW FACTORY
The Ultra-Secret NSA from 9/11 to the Eavesdropping on America 
BY JAMES BAMFORD 


BOOK FIVE 
FUTURE 
Exabytes
With the Middle East destined to be the central focus of NSA operations for the foreseeable future, shortly after the 9/11 attacks Mike Hayden took a close look at his limited and dilapidated facilities at NSA Georgia and called in the architects. As more trailers were set up to accommodate the growing overflow of personnel, work began on the blueprints for a sprawling new operational center to eavesdrop on target countries stretching from Pakistan to Libya. 

Code-named Sweet Tea, the new listening post will include a 501,699-square-foot operations building containing a workout room, a credit union, a mini–shopping area, nursing facilities, an eight-hundred-seat cafeteria, and a new 7,600-square-foot Visitor Control Center. Located at the corner of 16th Street and Lane Avenue, the new NSA/CSS Georgia was also designed with the NSA’s all-hearing antennas in mind. The location, said agency documents, “provides the perfect look angles with no possibility for encroachment to their required line-of-sight in the future.” While the cost of the project has been pegged at $340.8 million, that figure excludes the purchase and installation of all the new equipment. Once the costly computers and expensive analytical equipment have been acquired and wired in place, officials believe the final total could be closer to $1 billion. The latest projections in 2008 were that the center would be fully up and running by 2012 and employ more than four thousand workers, making it the agency’s largest facility outside of Fort Meade. 

Inside, the workstations of voice interceptors and data miners that once looked like battlefield command centers, with multiple computers and monitors crammed together, will now have a single monitor and hard drive. Until now, for security reasons, separate computer systems have been needed for different highly classified programs, thereby creating a physical firewall between them. One system might be used exclusively for an operation targeting a high-level encrypted Egyptian diplomatic network while another, less sensitive computer might be used on intercepts from an Iranian naval base. A third might be an unclassified system connected to the Internet. Switching between hard drives or computers was both time-consuming and cumbersome. 

But the NSA is now developing a secure “virtualization” platform able to combine multiple special-access programs on a single workstation. Analysts will only be able to enter the various sections of the computer hard drive with unique IDs and passwords. They can also quickly form password-protected “communities of interest” in the system, such as one in which only personnel cleared for Operation Highlander intercepts have access. “What’s nice about this platform is I can form these communities of interest on the fly, make sure that they are secure, and begin to share information very quickly with other members of that community,” said Chris Daly of IBM’s Software Group, which is developing the system with General Dynamics Corporation’s C4 Systems. “Secure virtualization ensures that one virtual space on my machine doesn’t get contaminated by another.” 
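The “communities of interest” idea described here is, at bottom, dynamic group-based access control: a cleared group is created on the fly, and only its members can reach the associated data. A minimal sketch of that concept, with invented names and no relation to the actual IBM/General Dynamics implementation:

```python
# Toy illustration of on-the-fly "communities of interest" access control.
# The community and user names below are invented for the example.
communities = {}

def create_community(name, cleared_members):
    """Create a named community whose members alone may access its data."""
    communities[name] = set(cleared_members)

def can_access(user, name):
    """Return True only if the user belongs to the named community."""
    return user in communities.get(name, set())

create_community("HIGHLANDER", {"analyst_a", "analyst_b"})
print(can_access("analyst_a", "HIGHLANDER"))  # True
print(can_access("analyst_c", "HIGHLANDER"))  # False
```

The real platform layers this kind of check on top of hardware-enforced virtualization, so that a breach of one virtual space cannot reach another.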

Another innovation for Arabic voice interceptors at NSA Georgia, who must sift through at least twenty different dialects, is a new reference manual called the Arabic Variant Identification Aid (A.V.I.A), which describes six dialects, Baghdadi Arabic being the latest addition. The system, created by NSA’s Center for Advanced Study of Language, also comes with voice samples to help identify the origin of the speaker. A center document says, “A speaker who claims to be Egyptian but who speaks with a Yemeni ‘accent’ is probably lying. Linguists can use the A.V.I.A to determine that such a person’s speech is really Yemeni rather than Egyptian.” 

The intercept operators will also soon be carrying around Top Secret BlackBerry-type smart phones. Also built by General Dynamics’ C4 Systems, the secure mobile device, known as the Sectéra Edge, will be capable of handling a variety of classifications, including voice communications at the Top Secret level and e-mail and Web access at the Secret level and below. “For the first time, authorized military and government personnel can now wirelessly access both classified and unclassified voice and data communications on the same device,” said John Cole, vice president of C4 Systems for information assurance. “The Sectéra Edge is an all-in-one communications solution, allowing users to easily switch between classified and unclassified information by pressing a single key.” Like a BlackBerry, the smart phone can synchronize information with a computer and, soon, will be able to accommodate Wi-Fi networks. 
At the quiet groundbreaking ceremony on March 26, 2007, NSA director Keith Alexander showed up with Georgia Republican Senator Saxby Chambliss. Before scooping some dirt with a golden shovel, Chambliss told the group of electronic spies, “You’re doing the Lord’s work.” Someone then asked Alexander about the warrantless eavesdropping program. “We don’t want to spy on Americans, now do we,” he said. “We want to spy on terrorists.” 

While NSA Georgia has its ears cocked toward the Middle East and North Africa, NSA Texas eavesdrops on Central and South America as well as the Caribbean from a series of buildings and satellite dishes on the Medina Annex of Lackland Air Force Base. Since the attacks on 9/11, however, and with Latin America low on the priority list, the facility has been playing an increasing role in helping NSA Georgia target the Middle East and also hot spots in Europe such as Bosnia. Like Georgia, the facility is a consolidation of army, navy, marine, and air force Sigint specialists, with the air force taking the lead role. And also like Georgia, San Antonio is about to get an influx of NSA money and personnel. 

The third major listening post in the U.S., focusing on Asia and the Pacific, is NSA Hawaii. For decades, the facility has been buried underground in a bunker in Kunia, in the center of the island of Oahu, about fifteen miles west of Honolulu. Originally constructed as an underground tunnel shortly after the Japanese attack on Pearl Harbor in December 1941, the $23 million cavern was built to house a bomb-proof aircraft assembly plant. But rather than drill into the ground, the engineers decided instead to erect a three-story hangar-like structure with a large open bay area and reinforced concrete walls and then cover it with earth. 

Nicknamed “The Hole,” the 250,000-square-foot facility was instead used as a map-making plant, producing 2,700,000 maps during one very busy month. Then following the war, the navy used the subterranean compound to store torpedoes and ammo, and later, after extensive renovations, it became a command center for U.S. Pacific forces. Then in January 1980, the NSA took it over, packed it with receivers and computers, put the army (and later the navy) in charge, and began eavesdropping on much of Asia. 
But with the sudden windfall of post-9/11 cash, NSA architects drew up plans for a massive new facility, similar to NSA Georgia. On August 30, 2007, Director Keith Alexander, wearing a lei around his neck and holding a long ‘O’o stick, an early Hawaiian digging tool, broke ground on the new facility. “Because of the mind-boggling changes in communication technology over the last two decades, coupled with the disturbing social and political dynamics, we need more, newer, and better ways to process intelligence,” said Alexander. “This building and its design, infrastructure, capabilities, and location will support and protect an unparalleled intellectual combine.” 

Located off Whitmore Avenue in Wahiawa, the new 234,000-square-foot, two-story building is to be surrounded by an “Exclusive Standoff Zone,” an empty area the width of a football field between the facility and the tall fence that encircles it. The seventy-acre site was formerly the home of a giant “elephant cage,” an enormous circular antenna used for eavesdropping and direction finding over much of the Pacific. Once the $318 million facility is completed in September 2010, the twenty-seven hundred Kunia workers will leave their old bunker and enter the new one, which will be built partly underground. 

Another highly secret NSA facility undergoing extensive expansion is its Denver Security Operations Center, located at 18201 East Devils Thumb Avenue on Buckley Air Force Base in Aurora, just outside Colorado’s capital city. For decades, a series of four large satellite dishes in golf-ball-like radomes have served as the downlink for a number of the agency’s most powerful eavesdropping spacecraft. These include a microwave-only eavesdropping system known as Vortex or Mercury, and a multifrequency giant known as Magnum or Orion. The two geosynchronous satellites were such behemoths they needed to be launched on the powerful Titan-IV rocket. The take from these satellites is analyzed in the attached Aerospace Data Facility, which in 2000 employed about twenty-nine hundred analysts from all branches of the service as well as NSA civilians. But like the other listening posts, the Denver center is undergoing a large expansion. 

At the time of the attacks, the NSA had only about 7 percent of its facilities outside of the Baltimore-Washington area. The realization that a series of similar attacks could virtually wipe out the agency caused Mike Hayden to begin thinking seriously about moving critical parts of the agency to other areas of the country. Another reason to relocate large chunks of the agency was power. As the agency began digging through massive amounts of data, its energy-hungry thinking machines were put on overdrive. What is likely the world’s largest collection of super-powerful computers is housed in the Tordella Supercomputer Building, a windowless, two-story, 183,000-square-foot facility on Ream Road at NSA Headquarters. Keeping the whirring machines from melting is an eight-thousand-ton chilled water plant. Even before the attacks, all that number crunching consumed enormous amounts of energy—about the same amount of electricity as half the city of Annapolis, Maryland’s capital. 

Following the attacks, as the NSA began plowing through mountains of data in its search for terrorists, the agency’s already enormous power demands began running up against Baltimore Gas & Electric Company’s finite amount of energy. The problem was so serious that agency technicians were unable to install two new multimillion-dollar supercomputers in the Tordella Building out of fear that the NSA’s power grid would collapse, blowing the fuse on the entire agency. 

By 2006, the estimates were that such a calamity could be anywhere from two months to less than two years away. “If there’s a major power failure out there, any backup systems would be inadequate to power the whole facility,” said Michael Jacobs, who was in charge of the code-making side of the NSA until 2002. Another longtime agency executive, William Nolte, pointed to the danger of erratic power surges. “You’ve got an awfully big computer plant and a lot of precision equipment, and I don’t think they would handle power surges and the like really well,” he said. “Even recalibrating equipment would be really time-consuming—with lost opportunities and lost uptime.” 
As a short-term fix, the agency began considering buying additional generators and pulling the plug on a number of older computers designed for code-breaking attacks on Cold War targets. They even began raising the temperature two degrees during the summer to help alleviate the strain on the electrical system. Some current and former government officials pointed the finger at General Hayden for not taking greater action when the energy problems first began to surface in the late 1990s. “It fits into a long, long pattern of crisis-of-the-day management as opposed to investing in the future,” said one. Also alarmed was Senator John D. Rockefeller IV, the chairman of the Senate Intelligence Committee. The NSA officials “were so busy doing what various people wanted that they forgot to understand that they were running out of power, and that’s sort of a national catastrophe,” he said, warning, “We cannot have that place go dark.” 

One potential solution involved an enormous building boom, creating a poweropolis with a new 50-megavolt-amp substation, a 50-megawatt generator plant, and another 36-megawatt generator plant, on top of the agency’s existing city-size capacity. The other answer was to begin moving much of the data mining out of Fort Meade to more energy-friendly parts of the country. 

After months of searching, it was decided to relocate the data center to a former Sony Electronics computer chip plant not far from NSA Texas. In what had become a common practice, once the NSA approved of the new building, it was purchased by Corporate Office Properties Trust (C.O.P.T), a Columbia, Maryland, real-estate investment company that specialized in leasing buildings to the NSA and its contractors. The firm also owned much of National Business Park, the office complex across from the NSA where many of the companies doing business with the agency leased buildings. The NSA would then lease the former Sony plant from C.O.P.T at a nice profit for the company. “We have become increasingly reliant on intelligence and defense tenants,” said Randall Griffin, president and CEO of C.O.P.T, “particularly due to the increased activity in those sectors following the events of September 11, 2001.” 

C.O.P.T paid $30.5 million for the 470,000-square-foot facility, which Sony vacated in 2003. Located on fifty acres of land, it consists of two connected former research and development buildings at 1 Sony Drive, located at NW Loop 410 and Military Drive in northwestern Bexar County. The company also placed under contract another twenty-seven acres of adjoining land with the understanding the NSA would likely expand the facility and construct additional buildings. The NSA’s plan was to spend about $100 million to renovate what it was calling the Texas Cryptology Center, and then employ about fifteen hundred people to work there, many hired locally. An initial group of experienced agency workers would come down to train the new hires in another leased building, an old Albertson's grocery store near Interstate 10 and Wurzbach Road in San Antonio. 

The timing of the move was interesting. Although the agency began looking at the property in 2005 and even signed a lease for the Sony building, it seemed to be holding back. When asked if the project was still on track, NSA spokesman Don Webber issued a noncommittal response regarding the move. “I will not speculate about any changes to NSA’s plans for a new facility for NSA/CSS Texas,” said Webber. “As with any government program, shifting priorities, funding availability, and mission essentials could always alter the scope or schedule of a planned project.” City officials, worried about losing the facility, traveled to NSA headquarters in early January 2007. “We told them we were going to get Microsoft, and that really opened up their eyes,” said Bexar County judge Nelson Wolff. Then on January 18, Microsoft formally announced its decision to move to San Antonio. Three months later, on April 19, the NSA issued a quiet press release saying it had finally agreed on the San Antonio location. 

Both the NSA and Microsoft had been eyeing San Antonio for years. The city had the cheapest electricity in Texas and the state had its own power grid, which made it less vulnerable to rippling outages on the national power grid. Nevertheless, it seemed that the NSA wanted to be assured that Microsoft would also be there before making a final commitment. 

For an agency heavily involved in data harvesting, there were many advantages to having their miners virtually next door to the mother lode of data centers. Microsoft’s plan was to build a $550 million, two-building complex on a forty-four-acre site at 5150 Rogers Road. At 470,000 square feet, the facility was the exact same size as the NSA’s data center, with each almost the size of the city’s Alamodome. One big difference, however, was in the number of personnel to be employed. As with most data centers, virtually everything in the Microsoft complex was automated and thus the company intended to hire only about seventy-five people to keep the equipment humming. The NSA, however, was planning to employ about fifteen hundred—far more than was needed to babysit a warehouse of routers and servers but enough to analyze the data passing across them. 
On July 30, 2007, under mostly sunny skies, a white stretch Hummer pulled up to a vacant field in the Westover Hills section of San Antonio. As the door opened, a woman dressed in a white shirt, khaki pants, and cowboy boots stepped out and surveyed the vast open area. Debra Chrapaty, Microsoft’s corporate vice president for global foundation services, then opened the groundbreaking ceremony for the new complex that, she said, would contain the digital brain for the world’s largest software company. 

“We’re building a cloud,” Chrapaty said. “The cloud is not the cloud in the sky, it’s what we’re about to break ground on in San Antonio.” Inside the virtual cloud, she said, were tens of thousands of computer servers through which will pass e-mail, instant messages, photos, videos, software programs, and details on the Internet searches of millions of users worldwide. Chrapaty noted that Microsoft has more than 280 million Hotmail customers, and its computer systems handle eight billion message transactions per day. She also said the current plant was only the beginning and that Microsoft hoped to build a second, identical facility, bringing the total investment close to $1 billion. The new data center will be a place “where the Internet lives,” said another company executive. 

Microsoft hoped the first phase of the complex would “go live” in July 2008. When completed, the building will be a mirror image of the company’s new data center in Quincy, Washington, which went live on March 27, 2007. Like Quincy, the San Antonio complex will be low-key and secretive, without even a sign to identify it. On the outside, the windowless, beige-colored building will be wrapped in a tall security fence. 

Inside, employees will have to pass through a telephone-booth-sized security portal containing a biometric scanner that will take a hand impression to match one in the computer. They will also wear badges with radio-frequency-identification smart chips. Past the lobby, a small group of workers will oversee the operations of the data center in a glass-enclosed control room with a wall of monitors. Elsewhere, the building will consist of long hallways between huge brain centers containing tens of thousands of computer servers. To keep them a cool sixty to sixty-eight degrees, each center will have a room with refrigerator-sized air-conditioning units. In the event of a power failure, another room will contain giant blocks of batteries that would automatically come to life for eighteen seconds before the SUV-sized backup generators kick in. 
As Microsoft broke ground on Rogers Road, 7.3 miles away workers were tearing walls and replacing floors at the NSA’s future data center. In addition to tapping into American communications without a warrant, General Hayden also wanted to know exactly what Americans were doing day by day, hour by hour, and second by second. He wanted to know where they shopped, what they bought, what movies they saw, what books they read, the toll booths they went through, the plane tickets they purchased, the hotels they stayed in, and the restaurants where they ate. In other words, Total Information Awareness, the same Orwellian concept that John Poindexter had tried to develop while working for the Pentagon’s D.A.R.P.A. 

Following the scandal that erupted after public exposure of his T.I.A project, Poindexter resigned and Congress killed any further money for the project. But surveillance projects have an uncanny way of coming back, and rather than die, many of the ideas and concepts simply migrated to the NSA, an agency with a far better track record than D.A.R.P.A for keeping secrets. Even though Congress cut off funding for the stillborn program in 2003, it nevertheless authorized some of the research to continue and allowed T.I.A technology to be used in the NSA’s foreign surveillance operations. Thus, just as the NSA can rifle through millions of phone calls under the Bush administration’s warrantless surveillance program, it can also sift through billions of records, such as those stored at Microsoft’s data facility. Such “transactional” data includes websites visited, queries to search engines, phone records, credit card usage, airline passenger data, banking transfers, and e-mail header details. 

Even without the warrantless powers granted by President Bush, obtaining personal information has become much easier with the passage of the Patriot Act and the frequent use of “national security letters,” which do not require probable cause or court approval. In 2000, the number of N.S.Ls issued was 8,500, a large number. But between 2003 and 2005 the requests had skyrocketed to 143,074, according to a 2007 Justice Department inspector general’s report. The audit found that 60 percent of a sample of these subpoenas were not in compliance with the rules, and another 22 percent contained unreported possible violations of the law, including improper requests and unauthorized collections of information.

The revised C.A.L.E.A not only makes it a crime for any company, such as Microsoft, to refuse to cooperate, it also makes it a crime for company officials to disclose such cooperation. 

While the revelations of such widespread abuse may have come as a surprise to most Americans, they did not surprise the president of a small Internet access and consulting business who was one of the many recipients of a national security letter. “The letter ordered me to provide sensitive information about one of my clients,” he said. “There was no indication that a judge had reviewed or approved the letter, and it turned out that none had. The letter came with a gag provision that prohibited me from telling anyone, including my client, that the FBI was seeking this information. Based on the context of the demand—a context that the FBI still won’t let me discuss publicly—I suspected that the FBI was abusing its power and that the letter sought information to which the FBI was not entitled.” 

The executive went to court and fought the order and the FBI eventually dropped the matter. “But the FBI still hasn’t abandoned the gag order that prevents me from disclosing my experience and concerns with the law or the national security letter that was served on my company,” he said. “Living under the gag order has been stressful and surreal. Under the threat of criminal prosecution, I must hide all aspects of my involvement in the case—including the mere fact that I received an N.S.L—from my colleagues, my family, and my friends. When I meet with my attorneys I cannot tell my girlfriend where I am going or where I have been. I hide any papers related to the case in a place where she will not look. When clients and friends ask me whether I am the one challenging the constitutionality of the NSL statute, I have no choice but to look them in the eye and lie. I resent being conscripted as a secret informer for the government and being made to mislead those who are close to me, especially because I have doubts about the legitimacy of the underlying investigation.” He added, “At some point—a point we passed long ago—the secrecy itself becomes a threat to our democracy.” 

Another 2007 study, this one by the Congressional Research Service examining the federal government’s data mining practices, gave a hint at the NSA’s data dragnet. It cited a statistic from the Web page (now removed) for the NSA’s Advanced Research and Development Activity (A.R.D.A): “Some intelligence data sources grow at a rate of four petabytes per month now,” the study said, “and the rate of growth is increasing.” As noted in the opening of this book, in a year at that rate, the database would hold at least 48 petabytes, the equivalent of nearly one billion four-door filing cabinets full of documents. It would also be equal to about twenty-four trillion pages of text. 
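The arithmetic behind these equivalences is easy to verify. The sketch below assumes roughly 2 KB of plain text per page, an assumption not stated in the study:

```python
PB = 10**15  # bytes per petabyte (decimal convention)

monthly_growth = 4 * PB              # "four petabytes per month"
yearly_total = 12 * monthly_growth   # one year at that rate

bytes_per_page = 2_000               # assumption: ~2 KB of text per page
pages = yearly_total // bytes_per_page

print(yearly_total // PB)       # 48 petabytes per year
print(pages // 10**12)          # 24 trillion pages
```

At that page size, the "nearly one billion filing cabinets" figure implies on the order of twenty-four thousand pages per cabinet, which is in the right range for a full four-drawer cabinet.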

Eric Haseltine noted in 2004 that even the NSA’s enormous computer power has trouble keeping up with the flow. “We can either be drowned by it or we can get on our surfboard and surf it and let it propel us. And, of course, that’s what we’re trying to do.” 

According to a University of California, Berkeley, study that measured data trends around the globe, the NSA does a lot of surfing. In 2002, there were 1.1 billion telephone lines in the world producing close to 3,785 billion minutes—equivalent to 15 exabytes of data. At the same time, there were also 1.14 billion mobile cellular phones producing over 600 billion wireless minutes, or another 2.3 exabytes. Then there’s the Internet, which in 2002 contained about 32 petabytes of data and had about 667 million users who sent and received about 532,897 terabytes of information, including 440,606 terabytes of e-mail. 
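As a consistency check on the Berkeley figures, the landline and mobile numbers imply nearly the same underlying data rate, roughly 4 megabytes per minute of telephone audio:

```python
EB = 10**18  # bytes per exabyte

# Rates implied by the 2002 Berkeley study's own figures.
landline_rate = 15 * EB / 3_785e9   # bytes per landline minute
mobile_rate = 2.3 * EB / 600e9      # bytes per mobile minute

print(round(landline_rate / 1e6, 2))  # ≈ 3.96 MB per minute
print(round(mobile_rate / 1e6, 2))    # ≈ 3.83 MB per minute
```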

To analyze such amounts of information flowing into the agency’s rapidly filling databases, the NSA and A.R.D.A came up with a number of T.I.A-like exploitation systems including one called Novel Intelligence from Massive Data (N.I.M.D). The program focused on the development of data mining and analysis tools to be used in working with enormous quantities of information. “Novel Intelligence” refers to a potential key piece of a puzzle that had not previously been known. “Massive Data” is measured either by size—one petabyte and above—or by complexity, such as multimedia, audio, maps, graphics, video, spoken text, equations, and chemical formulas, or a combination of all of these jumbled together. 

At the heart of N.I.M.D is a piece of software called the Glass Box that sits on analysts’ workstations and captures much of their online research process—the searches, results, downloads, documents viewed, and locations where data is sent. Based on the data captured in the Glass Box, models are created to automate and improve upon the analysts’ techniques. Similar analytic functions can then be automated and implemented on vast bodies of data. The ultimate goal would be to have, in essence, robotic analysis “of streaming petabytes of data”—such as that flowing across the Microsoft servers or through AT&T’s OC-192 pipes. This “data triage” would then make “decisions about which data to store, which to elevate for immediate analysis, and which to delete without further attention.” If fully implemented on U.S. communications and data links, it would create a society where everyone’s words and actions would be screened by secret surveillance machines programmed to watch-list anyone who matches a complex algorithm created by a secret agency. 
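Stripped to its essentials, "data triage" is a routing decision driven by a relevance score. A toy sketch of the concept (the thresholds and labels are invented for illustration, not N.I.M.D's):

```python
def triage(score):
    """Route one intercepted record by a relevance score in [0, 1].

    Illustrative only: the thresholds below are invented, not N.I.M.D's.
    """
    if score >= 0.9:
        return "elevate"   # surface for immediate analyst attention
    if score >= 0.3:
        return "store"     # retain for possible later retrieval
    return "delete"        # discard without further attention

print(triage(0.95), triage(0.5), triage(0.1))
```

The hard part, of course, is not the routing but producing the score: that is what the Glass Box models of analysts' behavior are meant to automate.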

In the same way that the NSA is drowning in useless data, it is also unable to keep its head above water in analyzing voice communications. Despite decades of research, the agency has still not perfected the capability to effectively spot key words or phrases in voice telephone conversations. There are just too many and they go by too fast. Even at the agency’s Middle East listening post in Georgia, where the hunt for Osama bin Laden was priority one, the eavesdropping was still conducted the old-fashioned way—analysts such as Adrienne Kinne would manually listen to each call. There were far more calls, however, than there were analysts to listen. 
Nevertheless, the science of telephonic word spotting is progressing both within the agency and in the outside world. Among the leading companies in the field is Natural Speech Communication (N.S.C), which, like Verint, Narus, and NICE, is a company based in Israel, the eavesdropping capital of the world. Founded by Ami Moyal, a participant in the Wiretappers’ Ball, the company has sold its eavesdropping products to a number of unidentified Western intelligence services. “The NSC Spotter is currently deployed in several agencies around the world,” says the company. According to Moyal, “We don’t pretend that we can compete against the U.S. National Security Agency, but we have a supplementary product.” Like the other Israeli bugging companies, N.S.C also has extremely close ties to Israeli intelligence. Among the five members of the company’s board of directors is Shabtai Shavit, who served as head of Mossad from 1989 until 1996, and since then has been an adviser to the Israeli National Security Council and to the subcommittee on intelligence of the Knesset. 

“NSC’s technology is a fascinating technology that can upgrade intelligence and monitoring systems all over the world,” said Shavit. “NSC has a unique solution for the analysis of huge amounts of audio data in real  time for security operations that depends on immediate response. Also, word-spotting technology has big potential in additional markets dealing with large quantities of audio and video data. I believe that the right use of this technology will create a big change in the way audio is analyzed and mapped these days and will enable the full utilization of valuable information hidden in audio data.” 

According to the company, N.S.C’s keyword-spotting technology has the capability to monitor in real time an enormous number of phone calls. “With increasing volumes of audio streams that require monitoring,” says the company, “keyword spotting is the only way to address the need of handling hundreds of thousands of calls per day. K.W.S can assist human agents to focus on the most relevant calls thereby optimizing the monitoring process. This frees the agents from working on irrelevant material, leading to better utilization of human resources . . . These organizations are inundated with a huge amount of audio sources that require constant monitoring. Since there is such a large amount of data, using only human resources is not an option. K.W.S technology enables these organizations to scan and prioritize the audio material so that the most significant conversations are handled first.” 

For the NSA, a particularly appealing feature of the NSC keyword spotting software is its availability in a variety of Arabic dialects. “This included recording a large, representative database of Arab speakers of the Levantine dialect,” says the company, “spoken by Israeli Arabs, Jordanians, Lebanese, and Palestinians. A particular problem was collecting the colloquial spoken form of the language as used in everyday speech, and not the classical standard forms found in read speech.” 

Another company deeply involved in targeting phone calls, and closely linked to the NSA, is Nexidia Inc. But rather than conducting real-time word spotting on multiple voice communications channels, Nexidia specializes in analyzing, at enormous speed, the content of calls already recorded. According to the company, it can search through phone calls “169,000 to 548,000 times faster than real time.” Thus, says Nexidia, “the technology can render over eight thousand hours of audio data searchable per day.” Among the company’s first customers was the NSA, and sitting on the board of directors is the NSA’s former director Ken Minihan. 
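At the quoted speedups, the scale of the company's claim is easy to check: scanning eight thousand hours of recorded audio would take only minutes of machine time.

```python
def search_seconds(audio_hours, speedup):
    """Wall-clock seconds to scan recorded audio at a given multiple of real time."""
    return audio_hours * 3600 / speedup

# Eight thousand hours of audio at Nexidia's quoted low- and high-end speedups.
print(round(search_seconds(8_000, 169_000)))  # ~170 seconds
print(round(search_seconds(8_000, 548_000)))  # ~53 seconds
```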

Also, among the grants and contracts awarded to the company’s founder, Mark A. Clements, were several from the NSA, including one titled “Analysis of Whispered Speech,” which he worked on from January 2000 until August 2003. From the title, the NSA might have had George Orwell’s classic dystopian novel 1984 in mind. In his book, Orwell wrote, “Any sound that Winston made, above the level of a very low whisper, would be picked up by it . . . You had to live—did live, from habit that became instinct—in the assumption that every sound you made was overheard and, except in darkness, every movement scrutinized.” Mindful of the limitations of Orwell’s Big Brother, the NSA is apparently determined to prevent even low whispers from escaping its microphones.

Trailblazer 
Well beyond word spotting, NSA is also developing another tool that Orwell’s Thought Police might have found useful—an artificial intelligence system designed to know what people are thinking. With the entire Internet and thousands of databases for a brain, the device will be able to respond almost instantaneously to complex questions posed by intelligence analysts. As more and more data is collected—through phone calls, credit card receipts, social networks like Facebook and MySpace, GPS tracks, cell phone geo-location, Internet searches, Amazon book purchases, even E-Z Pass toll records—it may one day be possible to know not just where people are and what they are doing, but what and how they think. The system is so potentially intrusive that at least one researcher has quit, citing concerns over the dangers in placing such a powerful weapon in the hands of a top-secret agency with little accountability. 

Known as Aquaint, which stands for “Advanced QUestion Answering for INTelligence,” the project was run for many years by John Prange, an NSA scientist at the Advanced Research and Development Activity. Headquartered in Room 12A69 in the NSA’s Research and Engineering Building at 1 National Business Park, ARDA was set up by the agency to serve as a sort of intelligence-community DARPA, the place where John Poindexter’s infamous Total Information Awareness project was born. Later named the Disruptive Technology Office, ARDA has now morphed into the Intelligence Advanced Research Projects Activity (IARPA). 
A sort of national laboratory for eavesdropping and other spycraft, IARPA will move into its new 120,000-square-foot home in 2009. The building will be part of the new M Square Research Park in College Park, Maryland. A mammoth two-million-square-foot, 128-acre complex, it is operated in collaboration with the University of Maryland. “Their budget is classified, but I understand it’s very well funded,” said Brian Darmody, the University of Maryland’s assistant vice president of research and economic development, referring to IARPA. “They’ll be in their own building here, and they’re going to grow. Their mission is expanding.” 

If IARPA is the spy world’s DARPA, Aquaint may be the reincarnation of TIA. After a briefing by Hayden, Cheney, and Tenet on some of the NSA’s data mining programs in July 2003, Senator Jay Rockefeller IV, the vice chairman of the Senate Intelligence Committee, wrote a concerned letter to Cheney. “As I reflected on the meeting today,” he said, “John Poindexter’s TIA project sprung to mind, exacerbating my concern regarding the direction the administration is moving with regard to security, technology, and surveillance.” 

The original goal of Aquaint, which dates back to the 1990s, was simply to develop a sophisticated method of picking the right needles out of a vast haystack of information and coming up with the answer to a question. As with TIA, many universities were invited to contribute brainpower to the project. But in the aftermath of the attacks on 9/11, with the creation of the secret warrantless eavesdropping program and the buildup of massive databases, the project began taking on a more urgent tone. 

In a 2004 pilot project, a mass of data was gathered from news stories taken from the New York Times, the AP news wire, and the English portion of the Chinese Xinhua news wire covering 1998 to 2000. Then, thirteen U.S. military intelligence analysts searched the data and came up with a number of scenarios based on the material. Finally, using those scenarios, an NSA analyst developed fifty topics, and in each of those topics created a series of questions for Aquaint’s computerized brain to answer. “Will the Japanese use force to defend the Senkakus?” was one. “What types of disputes or conflicts between the PLA [People’s Liberation Army] and Hong Kong residents have been reported?” was another. And “Who were the participants in this spy ring, and how are they related to each other?” was a third. Since then, the NSA has attempted to build both on the complexity of the system—more essay-like answers rather than yes or no—and on attacking greater volumes of data.

“The technology behaves like a robot, understanding and answering complex questions,” said one former Aquaint researcher. “Think of 2001: A Space Odyssey and the most memorable character, HAL 9000, having a conversation with David. We are essentially building this system. We are building HAL.” A naturalized U.S. citizen who received her PhD from Columbia, the researcher worked on the program for several years but eventually left due to moral concerns. “The system can answer the question, ‘What does X think about Y?’ ” she said. “Working for the government is great, but I don’t like looking into other people’s secrets. I am interested in helping people and helping physicians and patients for the quality of people’s lives.” The researcher now focuses on developing similar search techniques for the medical community.
A super smart search engine, capable of answering complex questions such as “What were the major issues in the last ten presidential elections?” would be very useful for the public. But that same capability in the hands of an agency like the NSA—absolutely secret, often above the law, resistant to oversight, and with access to petabytes of private information about Americans—could be a privacy and civil liberties nightmare. “We must not forget that the ultimate goal is to transfer research results into operational use,” said Aquaint project leader John Prange, in charge of information exploitation for IARPA. 

Once up and running, the database of old newspapers could quickly be expanded to include an inland sea of personal information scooped up by the agency’s warrantless data suction hoses. Unregulated, the agency could ask it to determine which Americans might likely pose a security risk—or have sympathies toward a particular cause, such as the antiwar movement, as was done during the 1960s and 1970s. The Aquaint robo-spy might then base its decision on the types of books a person purchased online, or chat room talk, or websites visited—or a similar combination of data. Such a system would have an enormous chilling effect on everyone’s everyday activities—what will the Aquaint computer think if I buy this book, or go to that website, or make this comment? Will I be suspected of being a terrorist or a spy or a subversive? 

Collecting information, however, has always been far less of a problem for the NSA than understanding it, and that means knowing the language. To expand its linguistic capabilities, the agency established another new organization, the Center for Advanced Study of Language (CASL), and housed it in a building near IARPA at the M Square Research Park. But far from simply learning the meaning of foreign words, CASL, like Aquaint, attempts to find ways to get into someone’s mind and understand what they’re thinking. One area of study attempts to determine whether someone is lying simply by watching their behavior and listening to them speak. According to one CASL document, “Many deception cues are difficult to identify, particularly when they are subtle, such as changes in verb tense or extremely brief facial expressions. CASL researchers are studying these cues in detail with advanced measurement and statistical analysis techniques in order to recommend ways to identify deceptive cue combinations.” 

Another area of focus explores the “growing need to work with foreign text that is incomplete,” such as partly deciphered messages or a corrupted hard drive or the intercept of only one side of a conversation. The center is thus attempting to find ways to prod the agency’s cipher-brains to fill in the missing blanks. “In response,” says the report, “CASL’s cognitive neuroscience team has been studying the cognitive basis of working memory’s capacity for filling in incomplete areas of text. They have made significant headway in this research by using a powerful high-density electroencephalogram (EEG) machine acquired in 2006.” The effort is apparently directed at discovering what parts of the brain are used when very good cryptanalysts are able to guess correctly the missing words and phrases in a message. 

Like something out of a B-grade sci-fi movie, CASL is even trying to turn dull minds into creative geniuses by training employees to control their own brain waves: “The cognitive neuroscience team has also been researching divergent thinking: creative, innovative and flexible thinking valuable for language work. They are exploring ways to improve divergent thinking using the EEG and neurobiological feedback. A change in brain-wave activity is believed to be critical for generating creative ideas, so the team trains its subjects to change their brain-wave activity.” 

Now that the NSA has begun undertaking remote assassinations, CASL is also attempting to find ways to better identify who exactly is speaking before the CIA blows him or her up with a Hellfire missile, as they did with al-Harethi and his companions in Yemen. “CASL researchers,” says the report, “are applying sociolinguistic knowledge to speaker recognition and identification technology. The team developed a protocol for conducting a forensic exam to bring in insights from phonetics, sociolinguistics, speech analysis and culture. In addition, the team is working on a sociolinguistic ontology, or an organized system for representing the social variables—race, gender, age, etc.—that interact with linguistic variation.” 
Aquaint, Novel Intelligence from Massive Data, Glass Box, cognitive neuroscience research, brain-wave control, speaker recognition, and many more projects are all part of Trailblazer, the code name for the NSA’s rapid push to modernize its eavesdropping operations in a digital, cellular, fiber-optic world. Hayden had originally picked Trailblazer over the rival system Thinthread, which would have given the agency a greater ability to trace the origins and destinations of phone calls and e-mail. Unfortunately, Trailblazer, launched in 2000, started out badly and only got worse.

The first contracts, worth $197 million, went to a little-known software company that only eighteen months earlier was operating out of the owner’s basement. The company, Conquest, was founded in 1989 by Norman G. Snyder, a former agency employee, in the basement of his Severna Park, Maryland, house. It had the advantage of being close to the Denny’s restaurant in Laurel, where the company’s executives held their weekly meetings. “Five or ten years ago NSA would have never chosen a company like Conquest,” said Snyder, who later moved the company to the agency’s National Business Park. 

Later, many of the NSA’s giants—SAIC, Boeing, Computer Sciences Corporation, IBM, Litton—came on board. But Trailblazer was plagued from the start by huge cost overruns and long delays, and things never got better. Hayden indicated that one of the key problems was that they were eavesdropping on far more information than they could ever process. “We’ve had pretty good success with the front end in terms of collection,” he said. “The more success you have with regard to collection, the more you’re swimming in an ocean of data. So what Trailblazer was essentially designed to do was to help us deal with masses of information and to turn it into usable things for American decision makers. There is no other element out there in American society that is dealing with volumes of data in this dimension.” 

Hayden and his corporate partners quickly realized they were no longer swimming but drowning in that data ocean as the cost overruns began mounting. When the agency’s inspector general looked at the problem, he found “inadequate management and oversight” of private contractors and overpayment for the work that was done. “The costs were greater than anticipated, to the tune of, I would say, in the hundreds of millions,” Hayden acknowledged. “The slippages were actually more dramatic than the costs. As we slipped, the costs were pushed to the right. But we underestimated the costs by, I would say, a couple to several hundred million in terms of the costs. Again, it was what we actually encountered doing this. It was just far more difficult than anyone anticipated.” Hayden also said that the agency tried to do too much too fast. “We learned,” he said, “that we don’t profit by trying to do moon shots, by trying to take the great leap forward, that we can do a lot better with incremental improvement, spiral development.”

Turbulence 
Upon becoming director in August 2005, Lieutenant General Keith Alexander decided to learn from Hayden’s mistakes and take a much more piecemeal approach to the problem of the three troublesome Vs of signals intelligence—volume, velocity, and variety. Rather than one unified theory of Sigint, as Trailblazer was intended to be, Alexander focused more on mastering the individual pieces of the system. “I think the way to do it efficiently is smaller steps, more rapidly done, rather than try to take one big jump and make it all the way across,” he said, “in terms of how you handle data, how you visualize that data and how we jump from industrial-age analysis to the information-age analysis that our country needs.” 

“The new idea of Trailblazer, the follow-on to Trailblazer, the big pie-in-the-sky super secret follow-on is now called Turbulence,” said one senior official familiar with the program. Soon after it was established, Turbulence lived up to its name as Congress began raising questions. “NSA’s transformation program, Trailblazer, has been terminated because of severe management problems, and its successor, Turbulence, is experiencing the same management deficiencies that have plagued the NSA since at least the end of the Cold War,” said one document prepared by the Senate Armed Services Committee in March 2007. 

A month later, Alexander received the results of an internal survey that appeared remarkably similar to a nearly identical study carried out when Hayden first arrived at the agency eight years earlier. “What we need is fundamental change in the way we manage NSA and what we expect of management and ourselves,” said the task force, which was led by George “Dennis” Bartko, the agency’s deputy chief of cryptanalysis. The agency lacked a “unity of purpose,” was facing an “identity crisis,” and failed to produce a “fundamental management culture change.” The twenty-eight-page classified document referred repeatedly to a lack of direction and cohesion among both management and the workforce. “We do not trust our peers [coworkers] to deliver,” it said. “Fragmentation has undermined corporate [NSA management] trust. Lack of trust is on display in NSA organizational structures [and] behaviors across the Enterprise.” 

Among their solutions, the twenty-four members of the panel recommended that the agency “decide upon a common purpose, develop plans and strategies aligned with that purpose, manage all of our resources, and tie rewards to successful execution of our plans.” Bartko, in a separate column he wrote in an agency publication, pointed out the seeming lack of progress from the earlier, Hayden-era report. “If these recommendations were made before, what’s different this time?” he asked rhetorically, adding, “Now is the time” for change. “It has to be. The Nation is depending on us not only today, but tomorrow as well.” 

But most troubling was the lack of oversight. “There is no clear measurement and no accountability for execution performance,” said the task force. That may have been a factor in another report that measured morale within the intelligence community. The survey found that only 46 percent of senior managers within the intelligence community were satisfied with the “policies and practices of your senior leaders,” and only 43 percent of NSA managers. 

Ironically, despite the call for accountability, the room within the agency reserved for the Government Accountability Office, the congressional watchdog agency, remains vacant. “We still actually do have space at the NSA,” said Comptroller General David M. Walker, the head of the GAO. “We just don’t use it and the reason we don’t use it is we’re not getting any requests from Congress, you know. So I don’t want to have people sitting out there twiddling their thumbs.” 

At the same time that the NSA is becoming less accountable, it is becoming more and more depended upon—due in large part to the lack of useful human intelligence coming from the CIA. Now just one agency among many in the intelligence community, the CIA’s lackluster performance became starkly clear when it was forced to shutter nearly all of its multimillion-dollar front companies throughout Europe because they were not producing any useful intelligence. The fronts, posing as investment banks and other companies, were to serve as cover for clandestine service officers attempting to develop sources and information. But instead of intelligence the front companies only produced large bills, leading to the closure of ten out of a dozen offices. Critics saw the failure as just one more example of an agency out of touch with the times. “I don’t believe the intelligence community has made the fundamental shift in how it operates to adapt to the different targets that are out there,” said Republican congressman Peter Hoekstra of Michigan, the number two person on the House Intelligence Committee and normally a strong defender of the agency. Considering the CIA’s failures leading up to the attacks on 9/11, its bumbling on the weapons of mass destruction question leading to the war in Iraq, and now its lack of credible human intelligence on terrorism despite billions being added to its budget, the agency was quickly becoming more of a liability than an asset. As a result, Bush and Cheney began turning instead to the NSA and Turbulence to lead both the intelligence war and the cyber war. “Bush told Alexander that he wanted ‘a Manhattan Project’ on this,” said the senior official with knowledge of the program. Bush as well as Cheney, who had become very close to Alexander, pushed the NSA chief to go hard on the offensive. 
Not only is Alexander the country’s top eavesdropper as director of the NSA, he is also the nation’s hacker in chief as commander of the little-known Joint Functional Component Command for Network Warfare (JFCC-NW). A highly secret element of the U.S. Strategic Command, it is America’s cyber war center, located at the NSA. While the Air Force also runs a cyber operations center at Lackland Air Force Base in Texas, the Air Force Information Warfare Center, its focus is largely defensive. At the NSA, the emphasis is on penetration, exploitation, and attack. “They have had some pretty good success in terms of monitoring networks and going in and collecting and going in and leaving things behind,” said the official. 

In addition to implants designed to covertly tap into networks, the things left behind could also include virulent strains of software viruses and logic bombs that remain dormant until a predetermined time. Once they come to life, they destroy a computer’s data from the inside. Shortly after he retired as director of the NSA, Mike McConnell, now the director of national intelligence, said he knew of more than a dozen people who could “do major damage” to a nation by mounting a computer attack with just a few weeks’ preparation. 

Aware of the NSA’s increasing involvement in cyber warfare, in March 2008 Russian president Vladimir Putin signed several executive orders designed to protect secrets on government computer networks from attack by restricting connections between international and domestic computer networks. Similar to a practice long employed by U.S. intelligence agencies, the measures restrict the ability of computers with access to “state or official secrets” to connect with networks that travel outside of the country. The decree stipulates that all “information systems, information and telecommunications networks, and computer equipment used to store, process or transmit information that contains state secrets or information from a state agency that contains official secrets,” may not operate on networks connected to others that travel outside Russia’s borders. 

The NSA’s heavy involvement in cyber warfare dates back to 1996, when then CIA director John Deutch announced plans to create a “cyber war” center at the NSA. “The electron,” Deutch warned, “is the ultimate precision-guided weapon.” The Information Operations Technology Center was created at the NSA in 1999 and became the leading organization for network exploitation and attack. Then in July 2002, President Bush signed a top-secret order directing the national security community, including the NSA, to develop, for the first time, rules and policies governing how the United States would launch cyber attacks against foreign computer networks. 

Known as National Security Presidential Directive 16, the order allows the president to launch a secret preemptive cyber war against any number of foreign countries, from China to Pakistan. “I think the presidential directive on information warfare is prima facie evidence of how seriously the government does take cyber warfare,” said John Arquilla, an associate professor of defense analysis at the Naval Postgraduate School and an expert on unconventional warfare. “It also marks a shift away from a far more prudential approach to information warfare. In the last administration, there was a great concern about using techniques of cyber warfare that would then be emulated by others, and, by suggesting to the world that the Americans think this is a legitimate form of warfare, others might want to begin doing this as well. There was a great deal of concern about that.” As the most cyber-connected country in the world, the U.S. has more to lose by starting an endless cyber war than any other nation. 

The NSA sends its “global network exploitation analysts” to train at the agency’s Network Exploitation and Target Development Boot Camp. Then, at the National Cryptologic School, they take such courses as “Ultimate Web Hacking” and “Ultimate Web Hacking Advanced.” Many of the cyber warriors are outsourced from the agency’s major contractors lining its National Business Park. 

“Turbulence is working much better,” said a knowledgeable official in 2008. “Trailblazer they tried to start off too comprehensively. What they’re doing with Turbulence is they’re starting out with little test programs and trying to take those and see where they go and expand on them. If they work, expand them, if they don’t work, shit can them. Spend small amounts of money on certain ideas, see if they work, if they don’t work, forget it; if they do work, move on to the next idea. And they try to expand those things out through the system. With Trailblazer, they tried to design a comprehensive system from day one. Alexander’s thing is don’t start with the big concept, start with little ideas, see how they work and see if you can sustain them.” Most of the new Turbulence projects, he said, deal with network attacks. “They are mainly more ways of automating things to go into computers, burrow into computers, and then confuse the computer once you get it going. More sophisticated ways to do that kind of thing.” 

By moving into the world of cyber war, the NSA has crossed another dangerous threshold. Corrupting or destroying another nation’s data network is considered by most countries an act of war. And in a world where all networks are intertwined like a ball of string, once a well-disguised virus is set loose on one system, it may quickly spread to others, including those in the U.S. Like warrantless eavesdropping and mega-data mining, it is a legal and technical landscape virtually unexplored by Congress and society. 

But if the NSA is light-years ahead of the laws of the United States, it still must obey the laws of physics—although it is coming close to getting around those laws also. According to an internal study, in order for the agency to be able to handle the enormous amounts of data projected in the near future, its computers will have to accelerate enormously—to petaflop speed, a quadrillion mathematical operations a second, long the Mount Everest of computing. With such a capability, the agency would likely be able to search through much of the world’s telecommunications and computer networks looking for keywords on a real-time basis. But as silicon chips reach their finite limit in capacity, and as the supercomputer industry gives way to massively parallel computing, the agency is looking for ways to reinvent the computerized wheel. 

In the spring of 1976 the first Cray-1 rolled out of the Cray Research production plant in Chippewa Falls, Wisconsin, and directly into the basement of the NSA. A second was quietly delivered to the NSA’s secret think tank, the Communications Research Division of the Institute for Defense Analysis at Princeton University. With a random-access semiconductor memory capable of transferring up to 320 million words per second, or the equivalent of about twenty-five hundred three-hundred-page books, the computer could not have been a disappointment. And when it was hooked up to the computer’s specialized input-output subsystem, the machine could accommodate up to forty-eight disc storage units, which could hold a total of almost thirty billion words, each no farther away than eighty millionths of a second. 

By the mid-to-late 1980s, the pace of supercomputer development was barely giving the NSA enough time to boot up its newest Cray mega-machine before a new one was wheeled into its basement “flophouse.” But as the demand grew for faster—and cheaper—machines in the 1990s, universities and high-end companies turned to massively parallel computers containing a thousand or more processors, each as powerful as a traditional minicomputer. The shift meant trouble for Cray as the world turned to subcompacts, with fewer and fewer takers for its supercharged Rolls-Royces. 

Following the worst financial year of its life, in which it was forced to cut nearly a quarter of its employees, and facing an uncertain future, Cray Research called it quits. It was acquired by Silicon Graphics Inc.—later known simply as SGI—a Mountain View, California, manufacturer of powerful, high-performance workstations, the sort of machines that became Cray’s greatest competitor. 

As the supercomputer business began crashing, worries increased at the NSA. Massively parallel processing might have been a good solution for some high-end commercial businesses, but it was insufficient for the NSA’s specialized needs. “High-end computing systems don’t scale well when they’re put in clusters, and they tend to be fragile, with a lot of reliability issues,” said Steve Scott, chief technology officer at Cray. According to a Pentagon report on supercomputing and the NSA, “Large supercomputers have always been the only way to solve some really big ‘capability’ problems.” These massive number crunchers, known as vector computers, were the engines that powered the agency’s unique code-breaking machines—machines that stripped away the tightly welded steel that encased the secret intercepted messages flowing into the NSA. As a result, for decades the agency had quietly underwritten a large portion of the supercomputer industry. 

The nervousness at the NSA increased substantially in 1999 as SGI appeared to be on the verge of going belly up while still under contract to build the agency’s newest supercomputer, the Cray SV2. At the Pentagon, a special task force of the Defense Science Board was convened to look into pumping cash into the company to keep the SV2—and NSA code-breaking—alive. 

“The Task Force concluded that there is a significant need for high performance computers that provide extremely fast access to extremely large global memories. Such computers support a crucial national cryptanalysis capability,” said the study. “The vector supercomputing portion of the capability segment of the high performance technical computing market is at a critical juncture as far as U.S. national security interests are concerned. If the current Cray SV2 development slips its schedule or is unsuccessful, this vector market will be lost to the U.S. with the result that only foreign [Japanese] sources will be available for obtaining this critical computing capability . . . While the Task Force considers the development of the SV2 to be a very high-risk venture, we believe the DoD should continue to pursue its development because the potential payoff is so great—two orders of magnitude improvement—and the required investment is reasonable.” 

The decision to underwrite the SV2 was welcomed at the NSA with a collective sigh of relief. “The United States is committed to maintaining and building on its long-held position as the global leader in supercomputing,” said the NSA’s chief scientist, George Cotter. “These powerful computers are absolutely essential to U.S. national security interests. To that end, the U.S. government is committing significant support to SGI’s Cray SV2 program.” The new system was expected to dramatically extend the capability of the NSA’s supercomputers with exceptional memory bandwidth, interconnections, and vector-processing capabilities. Its peak speed was estimated to be in the tens of teraflops (trillions of calculations per second), faster than any supercomputer in existence. 
In 2000, SGI finally threw in the towel and sold Cray Research to the Seattle-based Tera Computer. In a sense, Cray had come full circle, ending up in the hands of another maverick with a dream of building the fastest machine on earth. This time it was Tera’s founder and chief scientist, Burton J. Smith, a large, rumpled man who had stunned many in the field by building a machine that in 1997 set a world speed record for sorting integer numbers. The rebirth of what was now called Cray Inc. was good news for the NSA. The agency was said to have played a quiet role in making the deal happen “because it wants at least one U.S. company to build state-of-the-art supercomputers with capabilities beyond the needs of most business customers.” Work would thus continue on the NSA’s SV2 with a delivery date scheduled for 2002. 

Another major Cray customer, not surprisingly, was Australia’s Defense Signals Directorate, that country’s counterpart of the NSA. A Cray document bluntly stated the DSD’s mission: the organization, it said, “filters all telephone conversations, fax calls and data transmissions, including e-mail.” 

Following the attacks on 9/11, with the NSA increasing its data intake exponentially, Hayden began looking beyond the SV2, rechristened the Cray X1. What he now needed was a new customized system capable of much greater bandwidth and able to process the Nile Rivers of data gushing in from the NSA’s front-end collection facilities both in the U.S. and around the world. He also wanted a system that would be a hybrid, combining the best of both parallel and vector processing. The answer was a colossal Cray machine code-named the Black Widow. Made up of sixteen tall cabinets crammed with thousands of processors, the computer was painted jet black with a splash of red. In September 2003 Hayden gave his approval for the system, for which the NSA was paying $17.5 million—about the size of the agency’s entire budget in its early years. 

According to an Office of Science and Technology document, the Black Widow system “will provide outstanding global memory bandwidth across all processor configurations and will be scalable to hundreds of teraflops. Should be the most powerful commercially available system in the world at that time.” Also called the Cray XT5h, the Black Widow was targeted to scale to 32,000 processors, versus 4,096 for the X1, and employ new multistreaming processors (MSPs) allowing it to achieve the enormous speeds. But while the sixteen closet-sized cabinets were to roll into the agency’s Tordella supercomputer building in 2006, by early 2008 the agency was still waiting for the truck to arrive. Cray hoped to have the Black Widow in place sometime that year. 

Then in 2010, the NSA expects delivery of the Cray X-3, known as Cascade. Funded with $250 million from DARPA, it will likely be the most expensive computer ever created, and the fastest—designed to break the petaflop barrier with a sustained speed of more than a quadrillion calculations a second. It had been a long struggle. In 1971, the agency’s CDC 7600 broke the megaflop barrier and fifteen years later, in 1986, its Cray-2 cracked the gigaflop limit. Then in 1997 its Intel ASCI Red crossed the teraflop line. 

Finally, in 2008, a military supercomputer called Roadrunner reached the petaflop milestone. The $133 million computer, built by scientists at IBM and Los Alamos National Laboratory, will be used to solve problems related to nuclear weapons. But if history is any judge, it is likely that the NSA will also get its own Roadrunner. If so, it will have to again increase its power supply; the machine uses up about three megawatts of power, about what a large shopping mall consumes. According to Thomas D’Agostino, the administrator of the National Nuclear Security Administration, the amount of calculation the Roadrunner can do in a day is the equivalent of everybody on the planet—six billion people—using hand calculators to perform calculations twenty-four hours a day, seven days a week, for forty-six years. But while Roadrunner hit 1.026 quadrillion calculations a second, what the NSA needs is a computer that will operate at that speed or above constantly, and that is what they hope Black Widow and Cascade will do. 
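As a rough check on D’Agostino’s comparison, the stated figures can be tested in a few lines of Python. The hand-calculator speed is implied rather than given in the source, so the sketch simply solves for it:

```python
# Sanity check of the Roadrunner comparison: 1.026 petaflops for one day
# versus six billion people with hand calculators for forty-six years.
ROADRUNNER_FLOPS = 1.026e15                    # calculations per second
ops_per_day = ROADRUNNER_FLOPS * 86_400        # one day of Roadrunner

people = 6_000_000_000
person_seconds = people * 46 * 365 * 86_400    # 46 years, around the clock

# Implied speed of each hand calculator for the two totals to match
implied_rate = ops_per_day / person_seconds
print(f"{implied_rate:.1f} calculations per person per second")
```

The two figures line up if each person punches in roughly ten calculations a second, so the comparison assumes fast fingers but is in the right ballpark.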

But for the NSA, the petaflop barrier may be only a brief way station. The agency has quietly made it known within the Pentagon that by 2018 it will need a computer capable of exaflop speed—one quintillion (1,000,000,000,000,000,000) operations a second. To build such a machine for both the NSA and the Department of Energy, a new computer research center was launched in 2008. Known as the Institute for Advanced Architectures, the facility is run jointly by Sandia and Oak Ridge national laboratories. “We are faced with some problems for which petaflop supercomputers will not be fast enough,” said the Sandia National Laboratory computer architect Doug Doerfler. “That’s why we need to start designing an architecture now for exaflop-caliber computing.” Among those potential problems, according to Sandia’s Sudip Dosanjh, is power consumption. “An exaflop supercomputer might need 100 megawatts of power, which is a significant portion of a power plant,” he said. “We need to do some research to get that down. Otherwise no one will be able to power one.” After exaflops come zettaflops (a billion trillion) and yottaflops (a trillion trillion) and beyond that, the numbers haven’t yet been named. 
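The ladder of barriers the chapter traces, from the megaflop of 1971 through the named prefixes beyond exaflop, is simply successive powers of a thousand; a small Python table makes the spacing explicit:

```python
# Supercomputing speed barriers mentioned in the text, as powers of ten.
flop_barriers = {
    "megaflop":  1e6,    # NSA's CDC 7600, 1971
    "gigaflop":  1e9,    # Cray-2, 1986
    "teraflop":  1e12,   # Intel ASCI Red, 1997
    "petaflop":  1e15,   # Roadrunner, 2008
    "exaflop":   1e18,   # one quintillion operations a second
    "zettaflop": 1e21,   # a billion trillion
    "yottaflop": 1e24,   # a trillion trillion
}
# An exaflop machine is a thousand times faster than Roadrunner's petaflop.
speedup = flop_barriers["exaflop"] / flop_barriers["petaflop"]
```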

With its secret intercept rooms, its sprawling data farms, and its race for exaflop speeds, the NSA is akin to Jorge Luis Borges’s “Library of Babel,” a place where the collection of information is both infinite and at the same time monstrous, where the entire world’s knowledge is stored, but not a single word understood. In this “labyrinth of letters,” Borges wrote, “there are leagues of senseless cacophonies, verbal jumbles and incoherences.”

Abyss 
Like a pint-size brain surrounded by a heavily protected, half-million-square-foot body, a diminutive Dell computer in the basement of the National Counterterrorism Center is at the core of the Bush administration’s war on terror. Contained on its tape drive is “the watch list”—the group of people, both American and foreign, thought to pose a threat to the nation. At one time the list could be contained on a small 3x5 card with a great deal of space left over. Today it has grown to more than half a million names, and it is expanding by the thousands every month. Known as the Terrorist Identities Datamart Environment, or TIDE, it is the last stop for the thousands of names vacuumed up in the NSA’s warrantless eavesdropping program as well as its other eavesdropping operations. 

“This is the list that the Do Not Fly list comes from,” said one senior intelligence official concerned about the integrity of the system. “When that data comes in, it goes out to about six different watch lists. They’re all drawn from that central database. It is an Oracle database sitting in a Unix operating system. In a nutshell, NCTC is functionally a huge data warehouse. The only thing that makes NCTC worth anything is the database, the TIDE database. This is the most important data since 9/11. If you screw this up, we know they’re out there, we know they’re operating, we know they’re trying to get back in. The data is buried in this database.” 

Nevertheless, he said, the system is a disaster. The database is incompatible with both the NSA and the CIA systems. Despite the ocean of data collected by the NSA, he pointed out, “there really are no interfaces now so even if they want to send every bit of signal intelligence they have, we don’t have the database structure that can match up the records. There are point-to-point interfaces between NSA and CIA. It doesn’t exist from NSA to NCTC. That’s the problem with data in the intelligence field—there is no leadership right now.” 

The problem, he said, goes back years. “Prior to ODNI [Office of the Director of National Intelligence] there was no organization that would say, ‘All of you guys have to play together electronically.’ There were all these memorandums of agreement that were one-off. The CIA director would meet with NSA and they’d do a handshake, and NSA would meet with DIA [Defense Intelligence Agency] and they’d do a handshake. So if you have sixteen of these major collection systems out there, you can just see how many of these memorandums of agreement exist today. There should have been a data architect . . . It’s the worst technical screw up I’ve ever seen . . . The brains of the U.S. intelligence community reside in that building out there. The lights are on but nobody’s home.” 

When CIA employees around the world write intelligence reports, they send one copy to the CIA’s main computer database, code-named Quantum Leap, which is located on a secure floor in an office building in Reston, Virginia. Another copy goes to the NCTC. But because of the incompatibility, at NCTC the reports must be printed off the computer, manually reviewed, and then physically typed into the TIDE database. “The investment in it’s been a couple of hundred million dollars,” said the senior intelligence official. “Not so much the software and even the machines, but it’s all of the people. The CIA had a budget of about a hundred million a year just converting documents to get it in there—basically cables out of the field. The transmission of those documents to NCTC was by hand. They literally had no way to connect the two networks, so they’d print out a big stack of documents and they’d get reentered in the system. Then they had teams of dozens of analysts going through the cables . . . They sit there and read them and highlight things with yellow highlighters and then they go to a data entry team. And then it goes into an Oracle database, a relational database. And in that kind of database you can’t do a lot of connecting the dots.” 

The official also had great concerns about the civil liberties dangers of the massive database. “The core group is about 40,000, which is the hard-core, identified,” he said. “When you go out at two degrees or three degrees, meaning friends, family, business associates, it grows to almost 120,000. When you go out four degrees, you’re upwards of 400,000. Four degrees is—I know you, you live in the building, and it so happens that there is a business in that building that allows me to connect the owner of that business to another group of another cell. So it’s really just using this technology to establish these connections.” 
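The “degrees” the official describes correspond to hops in a contact graph, and the growth from 40,000 names to 400,000 is what a breadth-first expansion of such a graph produces. A minimal sketch of that expansion, using an invented toy graph (none of these identifiers come from the source):

```python
from collections import deque

def within_k_degrees(graph, core, k):
    """Return everyone reachable from the core set in at most k hops."""
    seen = set(core)
    frontier = deque((person, 0) for person in core)
    while frontier:
        person, depth = frontier.popleft()
        if depth == k:
            continue  # don't expand past k degrees of separation
        for contact in graph.get(person, ()):
            if contact not in seen:
                seen.add(contact)
                frontier.append((contact, depth + 1))
    return seen

# Toy graph: A is on the core list; B shares a building with A; B's
# landlord C also rents to D, linking A to D only at three degrees.
graph = {"A": {"B"}, "B": {"A", "C"}, "C": {"B", "D"}, "D": {"C"}}
print(sorted(within_k_degrees(graph, {"A"}, 2)))  # ['A', 'B', 'C']
```

Each additional degree can multiply the set severalfold, which is exactly the 40,000 to 120,000 to 400,000 progression the official cites.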

Because of changes in the law, the rules changed at the NCTC and U.S. names no longer had to be removed. “Before the FISA thing came down you would get U.S. citizens and they would have to be flagged and then they’re removed,” he said. “When the Patriot Act started it didn’t matter. Before that if someone was a U.S. citizen whether they were hanging out with Saddam Hussein in Sudan or not, you were required by law to delete their record in that database. You could not have U.S. citizens in a collection database. The Patriot Act said if someone’s a person of interest and has a known affiliation to a suspected group, you can track them when the initial encounter occurred outside the U.S. If it occurred inside the U.S., you immediately had to turn it over to the FBI.” 

The official said the NSA ran a test with the NCTC in order to see whether it would be possible to match the NSA’s enormous database of phone numbers—acquired from the phone companies—with the NCTC list of names. The test was apparently part of the warrantless eavesdropping program. “We ran a pilot where we ran the data and connected it with cell phone records. So we knew these people in the U.S. and they got a whole bunch of cell phone records—matched the names to numbers. Pretty much we know every cell phone number in the world. But the cell phone numbers allowed us to connect them to calls inside. So all of a sudden we had a rich pattern of connectivity. So we have some guy living in Frankfurt—he makes a lot of calls to six or seven people in Chicago all the time. Bingo, now you’re able to notify the FBI that, hey, you know those two guys you were looking for? Here’s their address. 
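The pilot the official describes amounts to joining a list of watch-listed numbers against call records and flagging repeat contacts inside the U.S. A hedged sketch in Python, with invented numbers (the record format and the threshold are assumptions, not details from the source):

```python
from collections import Counter

# Hypothetical inputs: a watch-listed foreign cell number matched to a
# name, and call records as (calling_number, called_number) pairs.
watch_list = {"+49-69-5550101": "subject in Frankfurt"}

call_records = [
    ("+49-69-5550101", "+1-312-5550142"),
    ("+49-69-5550101", "+1-312-5550142"),
    ("+49-69-5550101", "+1-312-5550187"),
    ("+49-69-5550101", "+1-212-5550199"),
]

# Count how often each watch-listed caller reaches each inside-U.S. number.
links = Counter(
    (caller, called)
    for caller, called in call_records
    if caller in watch_list and called.startswith("+1-")
)

# Repeat contacts (two or more calls) are the "rich pattern of connectivity."
frequent = [pair for pair, n in links.items() if n >= 2]
```

The same join run the other way, starting from a known number and asking the carrier for only the matching records, is the cheaper approach the official describes next.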

“It’s what NSA’s been doing since 9/11,” the official continued. “They’re just sweeping the stuff up. Now you don’t have to put in sweeper rooms to collect this stuff; in many cases you can just go to the phone company and say, ‘Give me only those records associated with outbound calls to this number in Frankfurt.’ You get the same data. Frankly, that’s a much better way. All of the telephone company equipment has been standardized since about ’86. The telecommunications act. Prior to ’86 every switch had its own peculiar data format.” 

But the law and policy, he said, have not kept pace with the technological developments. “They could be snooping on just about anything right now and not be accountable and be able to hold up their hands and go, ‘Our system doesn’t track that,’ ” he said, “when in many cases the system does but the code is so convoluted you could never know it. What concerned me is that I started to realize the linkage between what they were trying to do with the technology and what was going on up on the policy and the legal level with law. You can’t build these systems without safeguards and controls and they don’t have any of that in place right now.” 

Rather than focusing on legal, policy, and civil liberties issues, the NCTC is focusing its attention on building a bigger database—this one code-named Railhead—which will absorb TIDE. “The metaphor was that the Railhead program would be this intersection, this railhead, where all these data interfaces would converge into the equivalent of a railhead in a train network,” said the senior intelligence official. “It’s the largest program at NCTC, and Railhead and TIDE are about to be fused. Railhead is about to eat the TIDE database and when it does that, the TIDE database will just cease to exist.” 

So loose are the criteria for being tossed into the vast sea of names that in 2007, over twenty-seven thousand names were removed, for a variety of unnamed reasons, because they should not have been in there. How many other innocent people remain on the list is unknown, but with upwards of a thousand new names a day being added, the number is likely substantial. Unlike a bad credit report, there is no way for anyone to ever know they are in the system—and few ways out of it. 

More than three decades ago, when the NSA posed a fraction of the privacy threat it poses today with the Internet, digital communications, and mass storage, Senator Frank Church, the first chairman of the Senate Intelligence Committee, investigated the NSA and issued a stark warning: 

That capability at any time could be turned around on the American people and no American would have any privacy left, such [is] the capability to monitor everything: telephone conversations, telegrams, it doesn’t matter. There would be no place to hide. If this government ever became a tyranny, if a dictator ever took charge in this country, the technological capacity that the intelligence community has given the government could enable it to impose total tyranny, and there would be no way to fight back, because the most careful effort to combine together in resistance to the government, no matter how privately it was done, is within the reach of the government to know. Such is the capability of this technology. 

There is now the capacity to make tyranny total in America. Only law ensures that we never fall into that abyss—the abyss from which there is no return.

http://www.bookarmor.com/_files/TSF.pdf




FAIR USE NOTICE


THIS SITE CONTAINS COPYRIGHTED MATERIAL THE USE OF WHICH HAS NOT ALWAYS BEEN SPECIFICALLY AUTHORIZED BY THE COPYRIGHT OWNER. AS A JOURNALIST, I AM MAKING SUCH MATERIAL AVAILABLE IN MY EFFORTS TO ADVANCE UNDERSTANDING OF ARTISTIC, CULTURAL, HISTORIC, RELIGIOUS AND POLITICAL ISSUES. I BELIEVE THIS CONSTITUTES A 'FAIR USE' OF ANY SUCH COPYRIGHTED MATERIAL AS PROVIDED FOR IN SECTION 107 OF THE US COPYRIGHT LAW.

IN ACCORDANCE WITH TITLE 17 U.S.C. SECTION 107, THE MATERIAL ON THIS SITE IS DISTRIBUTED WITHOUT PROFIT TO THOSE WHO HAVE EXPRESSED A PRIOR INTEREST IN RECEIVING THE INCLUDED INFORMATION FOR RESEARCH AND EDUCATIONAL PURPOSES. COPYRIGHTED MATERIAL CAN BE REMOVED ON THE REQUEST OF THE OWNER.
