WAR JUNE 19, 2013
In July 2011, an Army combat team known as the Arctic Wolves moved into the Panjwai district of Kandahar province, where the Taliban was born and where Osama bin Laden is said to have planned the 9/11 attacks. The area was all but evacuated—it was not yet poppy-growing season, and Panjwai’s residents had gone to nearby cities to find work. For two months, the Arctic Wolves went about their business of clearing the territory of weapons and establishing a combat outpost without suffering casualties.
But as the Wolves continued to patrol the area in their brand-new, supposedly bomb-resistant armored vehicles, they could feel eyes watching them. Insurgents were starting to move into the houses abandoned by the villagers. During this season of uneasy quiet, a 21-year-old Army specialist from Wichita, Kansas, named James Burnett called home. Burnett had enlisted while he was still in high school, and he intended to marry his fiancée and become a cop once he returned to Wichita. Those plans seemed distant now.
“I’m in a bad place,” the soldier told his stepmother. “I’m scared. Pray for me.”
Burnett and the rest of the Arctic Wolves should have had a keen sense of where the enemy was and what it would do next. The Army had developed a sophisticated data platform called the Distributed Common Ground System, or DCGS-A (pronounced “d-sigs A”) for that exact purpose. The multibillion-dollar “system of systems” was built to gather, analyze, and share information from a multitude of sensors and human intelligence sources so that an Army commander could immediately assess the threat in his brigade’s environment. It would reveal the enemy’s history in the territory, the danger zones, names, faces. That was the concept, anyway.
In reality, the intelligence data platform had proved all but useless to the analysts supporting the Arctic Wolves. “We had no basis of understanding how the enemy was operating,” one of them later told me. He explained: “We’re using DCGS-A, and my team was pretty solidly experienced with a lot of these tools. And the command generals basically wanted updates every couple of hours—what was the threat, where there were additional high-threat areas. And we were nowhere near close to turning around any intelligence with the tools we had.”
On November 12, all hell broke loose. One of Burnett’s fellow Arctic Wolves was on foot patrol when a roadside bomb exploded and killed him. The next day, another improvised explosive device (IED) detonated under one of the brigade’s armored vehicles, killing a sergeant. And on November 16, the Stryker vehicle carrying Burnett ran over a pressure-plate IED and burst into flames, killing him and another specialist.
Back in Kandahar, the intelligence team “went into crisis mode,” the analyst recalled. “Since I’d seen other types of software in the contractor world, I went online and started sending e-mails to different analytical tool companies and saying, ‘What tool do you have that could be used in a counter-IED scenario?’ And the company that responded was Palantir.”
The Palo Alto–based data-integration firm happened to have a field rep in Afghanistan already, and he quickly arranged to visit the Kandahar base. While the hundred or so DCGS-A analysts continued with their labors, he fired up his company’s software.
What transpired next was flabbergasting, recalls the intelligence analyst: “We had spent probably a day and a half trying to make a map using DCGS-A. And in my three hours with Palantir, he was able to show ten times more information—breaking it down into charts, showing patterns. We could see a rotation pattern of where [the insurgents] were moving southwest to northeast across Panjwai district. We started to see some connections where there’d been four other unsuccessful attacks with the same type of device in this area that we hadn’t seen before ... This was my sixth combat deployment, and I’d never been able to pull that level of detail together, certainly not that fast. I was sitting there like, holy shit.”
The chief intelligence officer for the 82nd Airborne overseeing all Army activity in southern Afghanistan, Lieutenant Colonel Michelle A. Schmidt, agreed with her analysts: Palantir was mission-critical, something they had to have. The request put in by Schmidt’s command in late November 2011 didn’t mince words. Relying on the more cumbersome Army product, DCGS-A, “translates into operational opportunities missed and unnecessary risk to the force.” By contrast, Palantir actually worked and would probably save lives. Its price tag to the 82nd was $2.5 million—a fraction of what the Army had been spending on its new Stryker vehicles and, for that matter, its vaunted data platform. For soldiers like the Arctic Wolves, who were literally walking through minefields, spending loose change on software to protect thousands of young American warriors would seem to be a no-brainer.
But Schmidt’s request for Palantir was denied. Instead, the Army sent over more DCGS-A equipment and personnel to Kandahar. The consequence, as one internal memo from the 82nd would state, was that the troops on the ground continued to “struggle with an ineffective intel system while we’re in the middle of a heavy fight taking casualties.” This was not the first time the Army brass had rejected a unit’s urgent request for Palantir—and, owing to the Pentagon’s stubborn refusal to concede that there might be a better way than their own, it would not be the last.
America’s struggle to understand, anticipate, and outsmart its enemies has been rigorously discussed since 9/11. Where the story becomes turgid—where we stop paying attention—is on the decidedly unsexy but crucial question of how that intelligence, once gathered, is organized and made available in a way that will allow it to actually be put to use. Where the days following the September 11 attacks found us painfully groping for on-the-ground assets in Afghanistan, today our intel units have gobs of information they have no idea what to do with, such as thousands of detainee cell-phone numbers that have yet to be analyzed. The immortal Donald Rumsfeld litany of “known unknowns” leaves off the most infuriating conundrum of all—namely, that there are also unknown knowns: disjointed facts that languish in a warehouse or in the ether, unreachable to the cops or colonels who urgently need them but aren’t aware that such facts are already there for the asking.
A longtime high-level Pentagon intelligence analyst walked me through the reasons why the government tended to do a lousy job of producing military intelligence tools. An ambitious Army intelligence commander might want to impress his superiors by coming up with some new way of graphing intel. The procurers at the Pentagon go through the motions of ordering up the visionary new graphing software, secure in the knowledge that the commander will soon be promoted and out of their hair. The idea gets kicked over to some government agency’s development shop, which issues a design contract to one of the big defense contractors within the military-industrial complex that has developed a tight relationship with the Pentagon and has “no incentive to deliver something on time and on cost,” said the veteran analyst. At no point would anyone second-guess the ambitious commander. The agency bureaucrats were, he told me, “amateurs, essentially, who are incentivized to make their customers, the government requester, happy—to the point where they engineer things that won’t work. They accept everything as a requirement, and they judge every requirement to be equal. You say to them, ‘I want something with a bell that’ll go off whenever something interesting comes up.’ So they spent five million dollars on that, and no one asks, ‘Why the fuck would you want a bell?’”
The mindless perpetuation of dubious Pentagon technologies is especially infuriating in a time when all government agencies, including the Department of Defense (DOD), are expected to do more with less. And the ethical implications take on a darker hue when the supposed overseers stand to benefit from keeping things cozy between industry and the military. (Witness, most recently, former Raytheon lobbyist Bill Lynn. President Barack Obama waived ethics rules in 2009 so that Lynn could become his deputy secretary of defense in charge of procuring weapons systems like those made by Raytheon. Lynn held this position until late 2011, when he left the administration to become CEO of yet another weapons-technology contractor, DRS.) Still, the Pentagon’s lumbering insularity puts more than just money and morals at risk. After all, hundreds of uniformed men in Humvees lost their lives to IED attacks in Iraq from late 2003 until early 2007 while DOD procurers ignored urgent requests for more bomb-resistant armored vehicles. And now the multiyear saga between Army intelligence officials and a Silicon Valley innovator raises a question of existential proportions: Are we losing our edge in the war on terror because a few military bureaucrats don’t want to admit that they were given a mission and failed?
“If there’s a thing that keeps me and my leadership awake,” Lieutenant General Mary Legere told me in her office one afternoon last month, “it’s that nobody’s sons or daughters come back injured—or don’t come back—because we failed to get intelligence to them.”
Legere is the Army’s top intelligence officer, affable and nerdy, a lanky marathon runner and three-star general with more than 30 years of military service. Her office, known as the G-2, oversees the DCGS-A program. She knows its history intimately. Promoted as a post-9/11 remedy for the failure of agencies to share intel (which in turn had allowed the hijackers to evade detection), DCGS-A was jointly developed by a who’s who of giant military contractors—Northrop Grumman, Raytheon, General Dynamics, and Lockheed Martin, among others—and was first tested on the Korean peninsula in 2003, before being rolled out for use in Iraq and Afghanistan. It now supports 1.1 million Army troops stationed in the Pacific, South America, Europe, and Asia. And its intelligence data can be accessed by 25,000 Army analysts using the platform’s various formatting tools, including commercially designed mapping (Google Earth), geospatial (ArcGIS), and link-analysis (IBM i2 Analyst’s Notebook) software. “It’s actually the thing that underpins all our decisions and informs our weapons platforms,” Legere said with evident pride. “We’re now confident in the data and its richness.”
Only DCGS-A has access to every layer of intel, she said. Without all data at a brigade’s disposal, “I don’t know that the one piece of data that wasn’t ingested won’t make a difference,” she warned. But the general’s contentions are problematic. For starters, DCGS-A doesn’t have its own exclusive data—it’s simply a kind of high-powered search engine that reaches into the intelligence community’s various sources of information. And according to the intel analyst in Kandahar who experienced difficulties with the system: “It was quantity versus quality. A lot of it is the same reports in like ten different databases.” The intel provided by DCGS-A wasn’t just duplicative, he added—it was also, contrary to Legere’s assertion, incomplete: “A source we actually wanted, the National Intelligence Service biometric data, wasn’t in DCGS-A.”
Though Legere implicitly views DCGS-A’s mission as one that’s so vital that a price cannot be put on it, she also insisted to me that the Army product really is not a more expensive option than a commercial alternative. This claim would be more plausible if Legere’s own office had not estimated the cost of DCGS-A as being more than $2.3 billion so far. As a Senate military aide who has studied the matter puts it, the system has repeatedly shown “an inability to do anything on time or on budget.” (More bluntly, a House staffer refers to DCGS-A as “the new nine-hundred-dollar toilet seat.”) Though Legere vigorously denies that DCGS-A has continually been developed behind schedule, a person who works on DCGS-A explained to me how the Army obfuscates the matter: “They’ll say, ‘OK, we delivered Increment One on time.’ But what they delivered has only forty percent of the requirement that it was supposed to have when it got started. The rest of the requirement gets slipped into Increment Two.”
“We could talk for a week on the missteps over the program’s life,” the engineer who works on DCGS-A told me. DCGS-A frequently locks up, causing information to be lost. (Defense News reported last August that the DCGS-A screens suddenly went black during a joint military exercise in South Korea while attempting to track simulated North Korean troop movements.) And one Army intelligence officer recounted: “As I was being deployed to Afghanistan, I was given a DCGS-A system with a twenty-two-thousand-dollar laptop. When I got to my unit at a company level, it did not function. I couldn’t pull data off the map—I ended up just using the Internet, literally using Google Earth to plan my operations.” He added: “I was later told, when I complained about DCGS-A, that it was my problem, even though I’d been through training and was a certified user. Probably seventy to eighty percent of my course instructors said to us, ‘We understand that it doesn’t work, that you don’t like it. But find a way to make it work.’”
In other words: The Army product costs too much, isn’t inclusive, is prone to crashing, and doesn’t do what it’s supposed to do. What’s not to love?
Still, the military bureaucrats deemed DCGS-A to be the Army’s “program of record” in 2005—a position it contentedly held for the next four years, until Palantir offered its services.
Palantir is the Palo Alto–based brainchild of PayPal’s creators, and it was partially funded by In-Q-Tel, a nonprofit venture-capital firm created by the CIA to invest in intelligence technologies. (In-Q-Tel had also provided seed money for Keyhole, a mapping-software company founded in 2001 that was later sold to Google, which renamed its product Google Earth.) With offices in places like Washington, London, Santa Monica, and Singapore, Palantir’s reach is global, and the fraud-detection software it has developed was used by the White House to root out abusers of the federal stimulus program.
Palantir’s fundamental raison d’être is similar to that of DCGS-A: to provide a data platform that integrates information quickly so as to facilitate decision-making. But because Palantir isn’t a Frankenstein’s monster of numerous software languages and cumbersome requirements that only a super-user is equipped to navigate, it has accrued as many admirers as DCGS-A has detractors. Says the DCGS-A engineer with noticeable envy: “The people who know and understand the two systems understand that the difference is in the database. I try not to use hyperbole, but Palantir really has a game-changing capability in the way you can connect types of data. The database for DCGS-A is really 1990s, very static, versus Palantir, which is much more itemized and flexible.” Palantir has become one of the largest private tech companies in the world, with a client list of big-data users ranging from Citigroup to the NYPD—not to mention the FBI, CIA, Special Operations, Marines, and America’s military counterparts in the United Kingdom, Australia, and Canada. Mark Bowden reported in his book The Finish that Palantir’s data analysis assisted in the raid on Osama bin Laden.
But the biggest of all military services, the U.S. Army, has viewed Palantir with hostility from the start. In January 2009, the company made its pitch to the Army and was subsequently invited to have its product evaluated at the Joint Intelligence Laboratory in Suffolk, Virginia. The day before the scheduled evaluation, the G-2 office informed Palantir by e-mail that the meeting was canceled, without giving an explanation. In fact, the G-2 office—both before and after Legere took charge in 2012—led an effort to block every Army request for Palantir, even when it came from the Army’s director of intelligence in Afghanistan, Major General Michael Flynn, who in a 2010 memo condemned DCGS-A’s haplessness, which Flynn said “translates into operational opportunities missed and lives lost.”
On the morning of November 30, 2011, a rather large and unusual meeting took place at Centcom headquarters at MacDill Air Force Base in Tampa. Among the 20 or so in attendance were two representatives from Palantir and several adamant defenders of DCGS-A—chief among them, Lynn Schnurr, the chief information officer for the G-2 office. Chairing was Centcom’s J-2, Brigadier General Robert P. Ashley Jr., though the meeting was hardly his idea. Three U.S. senators—Republican Kay Bailey Hutchison and Democrats Dianne Feinstein and Tom Carper—had written Centcom’s commander, expressing concern that the Army wasn’t making use of available commercial intel tools. Instead, said the letter, the Army’s in-house systems “have been funded in the billions of dollars over more than a decade and have yet to meet the requirements of the users.” The Centcom brass knew what this meant: DCGS-A needed to make nice with Palantir.
Despite the pissy letter from Capitol Hill, the DCGS-A advocates showed up to the November 2011 Centcom meeting with nostrils flared. Palantir, Schnurr’s team claimed, did far less than DCGS-A. It operated on only one network, it was proprietary, it was noncompliant, it lacked essential data, and its clients were only trial users. As an internal Centcom document summarizing the meeting would characterize it, “Lots of bad blood here—and one side or the other was definitely lying.”
Recalls a participant not employed by either side: “The Palantir guys had an answer for every objection. But the Army guys were rude on every level, would interrupt when the Palantir guys were speaking, and I was embarrassed as a service member to see how unprofessional they were. Some of them were flat-out lying, and the Palantir guys knew it—but as contractors, they had to have a measured response. At one point, it came out that they’d refused to provide Palantir with some of the government-derived codes—so of course it’s not gonna communicate fully. What are we, in eighth grade here? It’s utterly juvenile. Brigadier General Ashley put his head in his hands—couldn’t believe this crap.”
The meeting resolved nothing. The Army, this neutral participant went on to say, “decided to fight it. They’ve put so much stinking money in DCGS-A over the years and still had a dog. And jobs are at stake now: If you’re one of the mediocre-to-poor engineers working for the U.S. Army and you’ve developed the all-source tools within DCGS-A and have produced a Studebaker, and here comes this Ferrari that costs less than your Studebaker, it’s an existential crisis! Of course you’re threatened!”
Threatened, and also annoyed. The letter from the senators that had compelled the Centcom meeting was one of several that the Army had received from Congress. An alpha dog in the tech world, Palantir was not used to rolling over to anyone, not even the government. And so instead of meekly showing gratitude for a few scraps of business, the company had sent lobbyists to Capitol Hill. The lobbyists in turn were telling legislators and their staffers that the Army was wasting money and risking the lives of its soldiers.
This was no way to make friends. The time-tested way to ingratiate oneself with DOD bureaucrats was there for Palantir to see, courtesy of America’s giant military contractors, who of course have whole squadrons of lobbyists. Take Raytheon, whose vice president for technology strategies was Heidi Shyu—until last year, when she was appointed the Army’s assistant secretary in charge of acquisitions. Or take SAIC, whose former vice president, Dr. Russell Richardson, is now one of the Army’s architects of a future iteration of DCGS-A. Or take General Dynamics Information Technology, which recently hired G-2 Chief Information Officer Lynn Schnurr. Each of these mega-firms has been among DCGS-A’s principal subcontractors for years, despite the system’s continued failings and cost overruns. As Republican Representative Jason Chaffetz told me, with specific reference to Schnurr’s lateral move into the private sector, “If you’re nice to a contractor, your payday will come.”
“It’s a complete scam,” said Representative Duncan Hunter as he plopped down onto a couch in his office in the Cannon Building one afternoon in April. The 36-year-old former Marine had just finished using his service connections to access the Cloud, which the G-2 office has touted as a dazzling DCGS-A update, one that would instantly link up the data between U.S. command centers and those out in the field. “For all of Afghanistan, it’s got a total of sixty-six persons of interest. You would think thousands. So we pull up—” and Hunter mentioned a suspected terrorist. “No known relationships. No known IED guys. No link saying he’s done anything.”
Agitated, the tall California Republican sucked on an electronic cigarette, as he’d recently stopped smoking the real things. “We’re asking five hundred million dollars for this?” he asked. “It’s supposed to be like this big cloud portal, so that anybody can access it. But nobody does—because it doesn’t work! It’s like opening PowerPoint or whatever and clicking on everything and nothing works. They’re just buttons, made to look like Palantir’s buttons.”
When I asked General Legere about Hunter’s experience, she insisted that the congressman had only looked at “an experimental cloud” called UX and that they had “already shifted” to a new one known as Red Disk. In fact, Red Disk won’t be operable until the end of the year, while the UX software system remains the version of the Cloud being used at the ground intelligence center based at Fort Bragg—as well as at the other Cloud center, based at Bagram Airfield in Afghanistan. Except that the latter has been offline for months now. Meaning, there is no synchronization between the centers. Meaning, by definition, that the Cloud is not a cloud.
As a former data programmer during his college years in San Diego, Hunter understands not only arcane software terminology, but also the mindset of its creators: “I used to code ’til like four in the morning, because I enjoyed it—and that’s what these Silicon Valley and Palo Alto guys do. You don’t get that at the Pentagon. You just don’t.” As a Marine deployed to IED-laden Fallujah in 2004, he yearned for an intel system like Palantir that would “be able to piece together all of the thousands of different connections between people and places and draw logical conclusions out of all that with the click of a mouse.” As the son of a veteran congressman, he could see how Duncan Hunter Sr. had to bypass the normal acquisitions system to get his son’s fellow soldiers the sniper scopes they needed—with the result that “the logistics guys were pissed, because it meant more work for them.” And now, as a congressman himself since 2009, Hunter has also butted heads with Pentagon middle-managers by using his influence to aid troops on the battlefield. Having heard about the deadly Stryker attacks in November 2011 and the 82nd Airborne’s urgent pleas for Palantir being met with three months of obstinate pushback from General Legere’s G-2 office, Hunter personally appealed to Army Chief of Staff General Raymond Odierno and thereby got the unit what it wanted.
Hunter is a lone congressman, however. And while DCGS-A has managed to achieve the near-impossible feat in Washington of uniting numerous politicians on both sides of the aisle against it—from liberal Houston Representative Sheila Jackson Lee to moderate Senator Claire McCaskill to Tea Party Representative Mike Pompeo of Kansas—bureaucracies can display a unique form of agility when confronted with a whiff of their own mortality. They can charm, befuddle, deceive—and thereby endure. By 2012, DCGS-A’s big subcontractors like General Dynamics began showing up to Capitol Hill with a two-page “Congressional Engagement Strategy” with such misleading talking points as, “DCGS-A provides a suite of analytical tools while Palantir provides a small subset of DCGS-A capabilities.”
Still, when medal-studded commanders like Odierno and Legere look a legislator in the eyes (as both generals have done in the past several weeks) and declare that DCGS-A is working splendidly while Palantir by comparison is little more than an app, few have the nerve—and expertise—to call bullshit. (Hunter did so to Odierno, during a House Armed Services hearing in April; the exchange went viral on YouTube.) It’s far easier for a beleaguered, jargon-deficient congressional staffer to advise the boss that, OK, let’s just trust the military brass and move on to something else.
As one influential Democratic staffer admitted after receiving a briefing with DCGS-A personnel to explain the Cloud’s persistent ineffectiveness: “They’re really good at saying, ‘We’re just one step from full roll-out,’ and then you’ll find out six months later they’re delayed. They get into super acronym mode, and it becomes kind of a confusing game.” A Senate military aide whose boss has become deeply interested in the DCGS-A versus Palantir saga confessed to me that nothing in his expertise covered the writing of software codes. And when a Senate Appropriations Committee staffer was given a tour of the DCGS-A facility in Aberdeen, Maryland, this past February, she was shown a demo of the system’s latest version. The system crashed during the demonstration and had to be rebooted, according to another eyewitness, but the Senate staffer appeared not to notice.
For my part, General Legere momentarily succeeded in bewildering me with an arsenal of jargon. Palantir, she said, can’t link up with Army data sources because “they have a different ontology for tagging.” (While this is true, Palantir’s ontology, according to the military users of Palantir with whom I spoke, in no way makes the system less open, only more efficient.) Over and over, Legere damningly referred to Palantir as “proprietary,” a dirty yet meaningless word that also applies to the Army’s industry partners IBM and Google, neither of which has sold off its intellectual property to the government. She insisted that the few Army units that had requested Palantir only used it “for link analysis and visualization.” (The military users later told me this was laughably untrue. “As soon as someone says that, I know they don’t know what they’re talking about,” the intel analyst from the 82nd Airborne told me. “The link analysis in Palantir isn’t better, honestly. What makes Palantir powerful is you can find and connect all this data with all these other parts of the system.”) And she spoke ominously of Palantir’s “data latency gaps.” This does appear to be a fair criticism: Palantir’s architecture is built in a way that slightly slows data’s path from its repository to a user. But the Marines and Special Operations don’t seem particularly hamstrung by this liability. By contrast, says one of the Marine officials who was involved in the service’s acquisition of Palantir and counts himself as an admirer, “We took a look at DCGS-A and talked to soldiers who’d used it, and it was panned.”
The Army’s latest gambit has been to exhibit a show of conciliation by entering into a “cooperative research agreement” with Palantir. Duncan Hunter is unimpressed. “As we speak, DCGS-A is making Palantir integrate into their shitty system,” he told me. “This is like Google having to integrate into Microsoft Access. It’s totally backwards.” In the meantime, a total of nine Army brigades have been permitted to use Palantir. The other 129 are stuck with DCGS-A—which, just this past November, was described by the Army’s own evaluation office as “not operationally effective, not operationally suitable and not operationally survivable against cyber threats.” The G-2 office headed by General Legere has routinely responded to criticisms of DCGS-A by blaming the victim—telling the users who can’t make sense of the system, You need more training. “I don’t think Legere has seen it in use in the field, except by people trying to convince her how good it is,” said the analyst from the 82nd.
Still, Legere, the marathoner, knows that time is on the side of bureaucrats, who have every reason to believe that politicians like Hunter will eventually turn their attention elsewhere. By 2014, the United States will cease combat operations in Afghanistan—turning away from the land of IEDs and low-intensity conflicts that require the kind of skilled instant mapping of enemy tactics that DCGS-A seems incapable of providing. Instead, as one veteran Senate military aide suggested to me, “The services are already pivoting to Asia, back to their happy place where they can refocus on traditional forms of warfare.”
But as our tragic misadventure in Iraq proved, even invading a country that is equipped with inferior conventional weapons doesn’t guarantee anything like victory. Protracted asymmetrical warfare waged by un-uniformed insurgents—this is the new normal American troops can expect to face. Without intelligence data that is “rich” not only in quantity but also in accessibility, we don’t know what we know, and the enemy regains the edge. When I asked the aforementioned Army intelligence officer how he would feel about going into Syria or Iran with DCGS-A, his reply was immediate.
“That terrifies me more than you know,” he said.
Robert Draper is a contributing writer for The New York Times Magazine and a correspondent for GQ. His most recent book is Do Not Ask What Good We Do: Inside the U.S. House of Representatives.