Kindle Notes & Highlights
Read between October 24 – November 2, 2025
The Grand Challenge, as the race was called, would give $1 million to the winning robot creator. As Tether prepared to travel to California to kick off the competition, he had in the back of his mind the realization that the robotic car race, if it went well, might save DARPA from the circling critics. If it did not go well, DARPA’s future was at stake.
Before he became the head of DARPA, or even an engineer, Tether worked full-time for a period as a Fuller Brush salesman, going door-to-door selling personal care products. “I always say that [was] the best education I ever had,” Tether later told an interviewer hired by DARPA.
DARPA’s Grand Challenge was the brainchild not of a DARPA scientist but of the agency’s onetime chief legal counsel, Richard Dunn, who had been inventing creative mechanisms to evade bureaucracy. Whether it was finding ways to hire employees on special contracts or circumventing normal government procedures to work with small companies, he had become something of a one-man “fix it” shop for DARPA’s red-tape cutting.
Shortly before Dunn retired from DARPA in 2000, he persuaded Congress to give DARPA authority for “incentive prizes,” although there was no specification of the type of contest.
In the end, however, an air force colonel hired to manage the logistics told him there was no way to shut down a highway so close to Los Angeles, even at night. Instead, the colonel suggested Barstow, a dying town in the California desert dominated by desert shrubs and methamphetamine labs. Shutting down a road in Barstow would not be too difficult.
The race favorite, a Humvee, made it the farthest, logging 7.32 miles before stopping, its “belly straddling the outer edge of a drop-off, front wheels spinning freely, on fire,” as the magazine Popular Science reported, headlining news of the race as “Debacle in the Desert.”
The New York Times reported in 2005 that DARPA had slashed its computer science funding for academics; by 2004, it had dropped to $123 million, down from $214 million in 2001. Tether defended the cuts, saying that he had not seen any fresh ideas from computer science departments. “The message of the complaints seems to be that the computer science community did good work in the past and, therefore, is entitled to be funded at the levels to which it has become accustomed,” Tether shot back when faced with criticism.
In 2002, he suddenly ended DARPA’s four-decade-long relationship with the JASONs, the independent scientific advisory group, by taking away their funding. Tether never publicly commented on what precisely led to his decision to sever ties, but according to several accounts the conflict was over the group’s membership, which some Pentagon officials perceived as being weighted toward older physicists.
Back in the 1970s, Donchin might drop in at DARPA to see Lawrence about brain-driven computers, but he would also poke his head in Licklider’s office or chat with other program managers with similar interests. DARPA back then was an open office building, at least for its unclassified projects, and visiting one research manager was an invitation to chat and meet with other officials and swap ideas. “When I came to see Dylan [Schmorrow], there were security people in the lobby,” Donchin said. “I couldn’t speak to anybody about anything except Dylan Schmorrow. It was an amazing transformation.”
The video, inspired by the Star Trek holodeck, opened with lingering shots of groundbreaking scientists: Charles Darwin, father of evolution theory; B. F. Skinner, famous for operant conditioning; and Hans Berger, inventor of electroencephalography. It then flashed to DARPA’s Dylan Schmorrow, credited as the father of augmented cognition.
Picking up these sorts of signals required careful controls and knowledge of the equipment, but the researcher simply stomped his foot when he wanted the cursor to move, introducing a deliberate “artifact,” or error. (Ironically, this was the same method that Uri Geller, who claimed to have psychic powers, was accused of using three decades earlier.) “It clearly was fake, and it wasn’t subtle at all,” Gevins said.
Reviewing the 2003 experiment, Mary Cummings, a human-computer interface expert, noted that even the published results indicated that none of the signs of “overload” that the researchers were testing were consistent across all three variations of the test.
As a research program, augmented cognition was a great idea, Cummings maintained. The problem with the program, she said, was that researchers were being asked to show concrete results in an area that was still basic science. “Where DARPA started to fall overboard is when they started to try and make it applied, ready for some sort of operational results,” she said. The allure of science fiction, without the checks and balances of rigorous science, had led the promising field of augmented cognition down a rabbit hole.
Back in the 1980s, DARPA, as part of the Strategic Computing Initiative, had funded an autonomous land vehicle, dubbed the “smart truck,” which the historian Alex Roland described as a “large ungainly, box-shaped monster.” Instead of a windshield, the front of the vehicle sported a “large Cyclopean eye” that housed the robot’s sensors. It looked more 1950s camp science fiction than Terminator, but the exterior was not important. What mattered were the rows of computers stacked inside the fiberglass shell of the truck and the algorithms that were supposed to make sense of the outside world. Those…
The Roomba was made by iRobot, a company that also produced military robots, including its flagship PackBot, which had been developed with DARPA funding in the 1990s. The PackBot showed up in Afghanistan in 2002 to help clear caves and was not particularly effective (the robots lost communications and got stuck). The robot soon found a higher calling in explosive ordnance disposal. Eventually, thousands of modified PackBots were sent to Iraq and Afghanistan to help defuse roadside bombs.
For example, DARPA sponsored Boston Dynamics to build LittleDog, a four-legged vehicle (which actually looked more like a bug than a canine) that was designed to travel on rough terrain. LittleDog was followed by BigDog, a larger version that could carry supplies for troops, like a robotic mule. While tech blogs and popular magazines often called the headless BigDog a “war robot,” it was actually more appropriately a lab robot.
Vision, or lack of it, is what had flummoxed DARPA’s 1980s-era smart truck in Pittsburgh’s Schenley Park. Twenty years later, DARPA was still trying to solve the fundamental problem of providing robots with the ability to process what they see and navigate around obstacles.
That became Jackel’s inspiration for a new program called Learning Applied to Ground Vehicles, or LAGR, focusing on machine learning. Rather than having to identify each specific object, the LAGR robots would learn by experience how to navigate the terrain, mapping out a path in the distance.
The program ended up enabling robots to extend their effective vision out to a hundred meters. “We never got to the point where they were as good as the dogs, but they were a whole lot better than when they started,” said Jackel.
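The core idea behind LAGR — classify distant terrain by generalizing from what the robot experienced up close, rather than recognizing each specific object — can be illustrated with a minimal sketch. The feature vectors, labels, and nearest-centroid classifier below are purely illustrative assumptions, not the program's actual algorithms, which were far more sophisticated:

```python
# Illustrative sketch of the LAGR idea: learn far-field traversability from
# near-field experience. Feature vectors stand in for the appearance (e.g.,
# average color) of a terrain patch; labels come from what the robot
# "experienced" at close range (drove over it, or bumped into it).

def train_centroids(samples):
    """Average the features of each label seen at close range."""
    sums, counts = {}, {}
    for features, label in samples:
        s = sums.setdefault(label, [0.0] * len(features))
        for i, v in enumerate(features):
            s[i] += v
        counts[label] = counts.get(label, 0) + 1
    return {label: [v / counts[label] for v in s] for label, s in sums.items()}

def classify(centroids, features):
    """Label a distant patch by its nearest class centroid."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: dist(centroids[label], features))

# Near-field experience: greenish patches were drivable grass, gray ones rock.
experience = [
    ([0.2, 0.8, 0.2], "traversable"),
    ([0.3, 0.7, 0.3], "traversable"),
    ([0.5, 0.5, 0.5], "obstacle"),
    ([0.6, 0.6, 0.6], "obstacle"),
]
centroids = train_centroids(experience)
print(classify(centroids, [0.25, 0.75, 0.25]))  # a grassy-looking far patch
# → traversable
```

The payoff of this approach is the extended horizon the text describes: appearance can be sensed a hundred meters out, even where stereo range data gives no direct measure of traversability.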
That application of machine learning is what Thrun and his team had been practicing in the desert. “It was our secret weapon,” he told a reporter from The New Yorker.
The Stanford team took home the $2 million jackpot. In all, five vehicles crossed the finish line, compared with none in the first event. What exactly enabled the winning teams to pull ahead of the others is hard to pinpoint. All the teams learned from studying the experience of the first Grand Challenge, according to Jackel, and knew what to expect the second time around. But it is impossible to ignore that the winning teams, Stanford and Carnegie Mellon, had received significant DARPA support for their robotics programs over the years.
There were benefits to incentive prizes, but they should not replace funded research, he argued. In the first two competitions, people had to fund themselves or find corporate sponsorship. Jackel was concerned about the long-term implications of such competitions for research and the survival of institutions that support research. “At some place, money had to flow into the system,” he said.
Jackel was a refugee from Bell Labs, the storied research and development division of the Bell Telephone Company. Ma Bell, as the monopoly was affectionately called, operated its lab as a quasi-academic institution, allowing its scientists to work with a large degree of independence. The scientists were encouraged to work on problems facing the telecommunications industry, but their research was judged by its scientific merit, not by the dollar figure their innovations generated. “Basically, the U.S. population funded Bell Labs through their phone bills,” Jackel said.
That worked well as long as Bell had a monopoly on telecommunications, the way the Pentagon has a monopoly on running the military. When the telephone monopoly was broken up, the lab was downsized, and its autonomy all but eliminated.
The contests cost much more than a $1 or $2 million prize; DARPA also had to pay for logistics, which was the most expensive part of the competition. And yet no money went to research. “It’s not self-sustaining,” Jackel said. “You can do it based on something that already exists, but if all we did was have challenges, then at some point we’d just stagnate.”
The Grand Challenge was about the future of DARPA more than about robots. In 2003, in the midst of the Total Information Awareness imbroglio, the agency had been a hairbreadth away from congressional intervention that would have permanently ended its independence.
The Grand Challenge did more than restore the agency’s image. Tether would soon be presiding over the largest expansion of DARPA’s budget since the agency’s creation.
The Grand Challenge might have saved DARPA, or at least the agency’s image, but it had no immediate effect on the wars in Afghanistan and Iraq, nor was it intended to, because robotic vehicles that could go beyond a racecourse were still years in the future.
The Phraselator was held up in Washington as a grand success, but Zemach had a different assessment: “It sucked.” On patrol in the Afghan village, Zemach held up the Phraselator, which looked more like a Star Trek tricorder than a universal translator. The device spit out a few sentences in the local language. The Phraselator had just said it was going to ask some questions and instructed the man being addressed to raise one hand for yes and two hands for no. The first question was whether the man understood this. The Afghan smiled and raised one hand. The next question was whether there were…
One navy tester expressed frustration that the Phraselator, even after five tries, failed to translate a simple question like “Do you speak English?” instead rendering it into phrases like “Follow me,” “Drop it,” and “Can you walk?”
Most of the preloaded phrases in the Phraselator were either yes or no questions, asking about the presence of foreign fighters, or direct orders, such as telling people to put up their hands. What troops usually needed were simple instructions to help defuse a potential confrontation when clearing villages. “You are effectively the invading army,” Zemach said. “You are going into a man’s home in front of his family with weapons and going through his stuff. It’s emasculating.”
In fact, DARPA’s funding of natural language processing in the first decade of the twenty-first century did have one major success. DARPA funded wide-ranging artificial intelligence research under a program called Personalized Assistant That Learns, which sponsored work at SRI International. The military was not interested in the work, and the DARPA program was terminated, but SRI International spun off the technology as a company called Siri, which was eventually bought by Apple and incorporated into the iPhone.
That view reflected DARPA’s image in the early twenty-first century: a great science fiction agency, but not a place that the Pentagon turned to during wartime.
Outside the Washington Beltway, what had cemented DARPA’s reputation for innovation was not necessarily drones or stealth aircraft but the Internet. The agency’s most important creation had ensured DARPA’s place in history, even if it had emerged from a tiny effort four decades prior. Whether the DARPA of 2008 was capable of producing the types of innovation that had emerged in 1968—when Robert Taylor had published plans for the ARPANET—was not something that was widely debated.
DARPA also hired a video production company to interview all of the living former directors for a brief promotional video as part of the anniversary celebration. The unedited interviews, which were only released after a Freedom of Information Act lawsuit, offered insights into how much the agency had changed over the past few decades.
In vastly simplified form, Tether had hit on the fundamental problem of the war on terror: there was no way to win the war with technology.
More immediately successful was the Grand Challenge. In 2012, Google debuted its driverless car, based on the work by Sebastian Thrun, the Stanford professor who led the winning team in the 2005 competition. The unmanned car competition had done exactly what DARPA had hoped it would do: take a bold technical goal, and prove it was possible. If the Orteig Prize ushered in the modern era of transatlantic aviation, then the Grand Challenge can rightfully take credit for the dawn of autonomous cars. It was Tether’s greatest legacy, even if it did nothing for Iraq and Afghanistan.
He offered one additional thought: “We can’t go and kill them all, you know.”
A sign inside the establishment in Jalalabad read simply, “If you supply data, you will get beer.” The idea was that anyone—or any foreigner, because Afghans were not allowed—could upload data on a one-terabyte hard drive kept at the bar, located in the Taj Mahal Guest House. In exchange, they would get free beer courtesy of the Synergy Strike Force, the informal name of the group of American civilians who ran the establishment.
The writing that had the “greatest influence on Petraeus’s thinking,” according to the journalist Fred Kaplan, was a counterinsurgency book by David Galula, the French officer sponsored by DARPA in the early 1960s under Project AGILE. Petraeus lifted Galula out of decades of obscurity, dusting off his writing and incorporating elements of it in a new counterinsurgency manual.
And so DARPA’s first deployment to a war zone since Vietnam began with a group of well-intentioned hacktivists trading beer for data at Afghanistan’s only tiki bar.
When she started making the rounds in the Washington Beltway as DARPA director, her choice of attire—short skirts, stiletto heels, and leather jackets—generated as much buzz as her credentials.
“There is a time and a place for daydreaming. But it is not at DARPA,” she told Congress. “DARPA is not the place of dreamlike musings or fantasies, not a place for self-indulging in wishes and hopes.”
“You know, Peter, I don’t think you should take over IPTO,” she said, just as he was dropping her off after dinner. “You should just start a new office.” Dugan did not say what the new office would do, other than it should be a “pure expression of what DARPA could be.” Lee had no idea what that meant.
Instead of plotting speed traps, he imagined a Trapster-like application that could track potential bomb attacks in Afghanistan. Crowdsourced data was allowing millions of people to monitor events in real time.
The fellows proposed having teams compete to locate red weather balloons that DARPA would release across the United States. Lee was not sure about the idea: having people hunt for balloons sounded a little odd, even for DARPA, but Dugan encouraged him. “That idea might be stupid, but that’s what you came up with yesterday, so you’re going to execute,” he recalled her telling him.
In the end, it took only nine hours for a team from MIT to win. They beat the competitors by using a sliding scale of financial incentives that rewarded not just those who spotted balloons but those who recruited others who successfully spotted balloons. Alex “Sandy” Pentland, an MIT computer science professor who headed the winning team, called the task “trivial.”
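MIT's recruitment scheme reportedly paid the finder of a balloon a base reward, the finder's recruiter half that, the recruiter's recruiter half again, and so on up the chain. A minimal sketch of that payout rule, with illustrative names and a hypothetical $2,000 base amount:

```python
# Sketch of a recruitment-chain payout: the balloon's finder gets the base
# reward, each person up the recruitment chain gets half of the previous
# payout. Names and the base amount are illustrative.

def chain_payouts(recruiter_of, finder, base=2000.0):
    """Walk up the recruitment chain, halving the reward at each step."""
    payouts, reward, person = {}, base, finder
    while person is not None:
        payouts[person] = payouts.get(person, 0.0) + reward
        reward /= 2
        person = recruiter_of.get(person)  # None once we reach the chain's root
    return payouts

recruiter_of = {"carol": "bob", "bob": "alice"}  # alice recruited bob, who recruited carol
print(chain_payouts(recruiter_of, "carol"))
# → {'carol': 2000.0, 'bob': 1000.0, 'alice': 500.0}
```

Because the halving forms a geometric series, the total paid out per balloon can never exceed twice the base reward, no matter how long the chain grows — which is what made it safe to reward recruiting at all, and why recruiting spread so fast.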
Military personnel expressed surprise to see her. “You’re from DARPA,” she recalled their general reaction. “We call you when we have three- to five-year problems.”
When Dugan got back to Washington, D.C., she assembled the office directors and their deputies and gave them a month to come up with ideas for technologies DARPA could contribute immediately to the war in Afghanistan.
One, called More Noses, was a plan to send several hundred dogs outfitted with sensors and GPS trackers.

