Introducing America's future fighting machines
A small gray helicopter was perched on the runway, its rotors beating slowly against the shroud of fog and rain blowing in from the Chesapeake Bay. Visibility was poor, but visibility did not matter. The helicopter had no windows, no doors, and, for that matter, no pilot. Its elliptical fuselage looked as if it had been carved out of wood and sanded smooth of detail. It hovered above the runway for a moment, swung its blind face toward the bay, and then dissolved into the mist.
The helicopter was the first among a dozen unmanned aerial vehicles (UAVs) scheduled to fly during the annual Association for Unmanned Vehicle Systems International conference in Baltimore. The live demonstration area at Webster Field, a naval air facility located seventy miles south of Washington, D.C., was laid out along the lines of a carnival midway. Big defense contractors and small engineering firms exhibited the latest military robots under white tents staked out alongside an auxiliary runway. Armed soldiers kept watch from towers and strolled through the throng of military officers and industry reps. I took a seat among rows of metal chairs arrayed in front of a giant video screen, which displayed a live feed from the helicopter's surveillance camera. There was little to see except clouds, so the announcer attempted to liven things up.
"Yesterday we saw some boats out there," he said, with an aggressive enthusiasm better suited to a monster-truck rally. "They didn't know they were being targeted by one of the newest UAVs!" Next, two technicians from AeroVironment, Inc., jogged onto the airfield and knelt in the wet grass to assemble what appeared to be a remote-controlled airplane. One of them raised it over his shoulder, leaned back, and threw it into the air like a javelin. The airplane -- called the Raven -- climbed straight up, stalled, dipped alarmingly toward the ground, and then leveled off at two hundred feet, its tiny electric motor buzzing like a mosquito. The screen switched to show the Raven's video feed: a bird's-eye view of the airstrip, at one end of which a large American flag flapped limply on a rope strung between two portable cranes next to an inflatable Scud missile launcher.
"A lot of the principles we use here are taken from the model industry," an AeroVironment spokesman told the announcer as the Raven looped around the field. The U.S. military has purchased more than 3,000 Ravens, many of which have been deployed in Iraq and Afghanistan, but apparently none of the military officers present had ever seen one land. At the end of the Raven's second flight, the crowd went silent as the tiny plane plummeted from the sky and careered into the ground, tearing off its wings. The technicians scrambled to the crash site, stuck the wings back on, and held the Raven triumphantly above their heads.
"It's designed that way," the spokesman explained.
"Hey, if you can't fix it with duct tape," the announcer said, "it's not worth fixing, am I right?"
Other teams took the field to demonstrate their company's UAVs. The sheer variety of aircraft and their launching methods -- planes were slung from catapults and bungee cords, shot from pneumatic guns and the backs of pickup trucks, or simply tossed by hand into the air -- testified to the prodigious growth in demand for military robots since the terrorist attacks of September 11, 2001, and the subsequent "global war on terrorism." In his opening conference remarks, Rear Admiral Timothy Heely compared the embryonic UAV market with aviation in the first decades of the twentieth century, when the Wright brothers built planes in their workshop and dirigibles carried passengers. "It's all out there," he said. "You don't want to throw anything away."
It started to drizzle again. The military officers sought refuge under a catered VIP tent decorated with red, white, and blue bunting while the rest of us scattered in all directions. I headed to the unmanned ground vehicle (UGV) tent located at the far end of the runway. The tent's interior was dim; the air, sticky and hot. Tables stocked with brochures and laptops lined the vinyl walls. Robots rested unevenly on the grass. This was the first year UGVs were allowed demonstration time at the conference, and company reps were eager to show what their robots could do. A rep from iRobot, maker of the popular Roomba robotic vacuum cleaner, flipped open a shiny metal briefcase that contained an LCD monitor and a control panel studded with switches and buttons for operating the PackBot, a "man-packable" tracked robot not much bigger than a telephone book. Hundreds of PackBots have already been deployed in Iraq.
"If you can operate a Game Boy, you're good," the rep said.
A Raytheon engineer fired up an orange robot that looked like a track loader used in excavation. The only difference was a solid black box containing a radio receiver on top of the cage where the human driver normally sat. It rumbled out of the tent onto the airfield, followed by a camera crew.
"It's a Bobcat," the announcer shouted. "It's a biiig Bobcat!"
The Bobcat rolled up to a steel garbage bin containing a "simulated Improvised Explosive Device," hoisted it into the air with a set of pincers, and crumpled it like a soda can. A Raytheon spokesman listed all the things the tricked-out Bobcat could do, such as breach walls.
"You could also crush things like a car if you wanted to," he added.
"I never thought of crushing something," the announcer said. "But yeah, this would do very nicely."
After the Bobcat had dispatched the mangled garbage bin and returned to the tent, I asked a Raytheon engineer if the company had thought about arming it with machine guns. "Forget the machine guns," he said dismissively. "We're going lasers."
Military robots are nothing new. During World War II, the Germans sent small, remote-controlled bombs on tank treads across front lines, and the United States experimented with unmanned aircraft, packing tons of high explosives into conventional bombers piloted from the air by radio (one bomber exploded soon after takeoff, killing Joseph Kennedy's eldest son, and the experiment was eventually shelved). But in a war decided by the maneuver of vast armies across whole continents, robots were a peculiar sideshow.
The practice of warfare has changed dramatically in the past sixty years. Since Vietnam, the American military machine has been governed by two parallel and complementary trends: an aversion to casualties and a heavy reliance on technology. The Gulf War reinforced the belief that technology can replace human soldiers on the battlefield, and the "Black Hawk down" incident in Somalia made this belief an article of faith. Today, any new weapon worth its procurement contract is customarily referred to as a "force multiplier," which can be translated as doing more damage with fewer people. Weaponized robots are the ultimate force multiplier, and every branch of the military has increased spending on new unmanned systems.
At $145 billion, the Army's Future Combat Systems (FCS) is the costliest weapons program in history, and in some ways the most visionary as well. The individual soldier is still central to the FCS concept, but he has been reconfigured as a sort of plug-and-play warrior, a node in what is envisioned as a sprawling network of robots, manned vehicles, ground sensors, satellites, and command centers. In theory, each node will exchange real-time information with the network, allowing the entire system to accommodate sudden changes in the "battle space." The fog of war would become a relic of the past, like the musket, swept away by crystalline streams of encrypted data. The enemy would not be killed so much as deleted.
FCS calls for seven new unmanned systems. It's not clear how much autonomy each system will be allowed. According to Unmanned Effects (UFX): Taking the Human Out of the Loop, a 2003 study commissioned by the U.S. Joint Forces Command, advances in artificial intelligence and automatic target recognition will give robots the ability to hunt down and kill the enemy with limited human supervision by 2015. As the study's title suggests, humans are the weakest link in the robot's "kill chain" -- the sequence of events that occurs from the moment an enemy target is detected to its destruction.
At Webster Field, the latest link in the military's increasingly automated kill chain was on display: the Special Weapons Observation Reconnaissance Detection System, or SWORDS. I squatted down to take a closer look at it. Despite its theatrical name, SWORDS was remarkably plain, consisting of two thick rubber treads, stubby antennae, and a platform mounted with a camera and an M240 machine gun -- all painted black. The robot is manufactured by a company named Foster-Miller, whose chief representative at the show was Bob Quinn, a slope-shouldered, balding man with bright blue eyes. Bob helped his engineer to get SWORDS ready for a quick demo. Secretary of the Army Francis Harvey, the VIP of VIPs, was coming through the UGV tent for a tour.
"The real demonstration is when you're actually firing these things," Bob lamented. Unfortunately, live fire was forbidden at Webster Field, and Bob had arrived too late to schedule a formal demonstration. At another conference two months before, he had been free to drive SWORDS around all day long. "I was going into the different booths and displays, pointing my gun, moving it up and down like the sign of the cross. People were going like this" -- he jumped back and held up his hands in surrender -- "then they would follow the robot back to me because they had no idea where I was. And that's the exact purpose of an urban combat capability like this."
Sunlight flooded into the tent as Secretary Harvey parted the canopy, flanked by two lanky Rangers in fatigues and berets. Bob ran his hand over his scalp and smoothed his shirt. It was sweltering inside the tent now. Beneath the brim of his tan baseball cap, Secretary Harvey's face was bright red and beaded with sweat. He nodded politely, leaning into the verbal barrage of specifications and payloads and mission packages the reps threw at him. When he got to SWORDS, he clasped his hands behind his back and stared down at the robot as if it were a small child. Someone from his entourage excitedly explained the various weapons it could carry.
Bob had orchestrated enough dog-and-pony shows to know that technology doesn't always impress men of Secretary Harvey's age and position. "We don't have it in the field yet," Bob interrupted, going on to say that SWORDS wasn't part of any official procurement plan. It was a direct result of a "bootstrap effort" by real soldiers at Picatinny Arsenal in New Jersey who were trying to solve real problems for their comrades in the field. "And soldiers love it," he added.
On the long bus ride back to Baltimore, I sat behind Master Sergeant Mike Gomez, a Marine UAV pilot. "All we are are battery-powered forward observers," he joked. Mike was biased against autonomous robots that could fire weapons or drop bombs with minimal, if any, human intervention. There were too many things that could go wrong, and innocent people could be killed as a result. At the same time, he wasn't opposed to machines that were "going to save Marines, save time, save manpower, save lives."
It wasn't the first time that day I'd heard this odd contradiction, and over the next three days I'd hear it again and again. It was as if everyone had rehearsed the same set of talking points. Robots will take soldiers out of harm's way. Robots will save lives. Allow robots to pull the trigger? No way, it'll never happen. But wasn't the logical outcome of all this fancy technology an autonomous robot force, no humans required save for those few sitting in darkened control rooms half a world away? Wasn't the best way to save lives -- American lives, at least -- to take humans off the battlefield altogether? Mike stared out the bus window at the passing traffic.
"I don't think that you can ever take him out," he said, his breath fogging the tinted glass. "What happens to every major civilization? At some point they civilize themselves right out of warriors. You've got sheep and you've got wolves. You've got to have enough wolves around to protect your sheep, or else somebody else's wolves are going to take them out."
Coming from a career soldier, Mike's views of war and humanity were understandably romantic. To him, bad wolves weren't the real threat. It was the idea that civilization might be able to get along without wolves, good or bad, or that wolves could be made of titanium and silicon. What would happen to the warrior spirit then?
Scores of scale-model UAVs dangled on wires from the ceiling of the exhibit hall at the Baltimore Convention Center, rotating lazily in currents of air-conditioning. Models jutted into the aisles, their wings canted in attitudes of flight. Company reps blew packing dust off cluster bombs and electronic equipment. They put out bowls of candy and trinkets. Everywhere I looked I saw ghostly black-and-white images of myself, captured by dozens of infrared surveillance cameras mounted inside domed gimbals, staring back at me from closed-circuit televisions.
In addition to cameras, almost every booth featured a large plasma monitor showing a continuous video loop of robots blowing up vehicles on target ranges, or robots pepper-spraying intruders, robots climbing stairs, scurrying down sewer pipes, circling above battlefields and mountain ranges. These videos were often accompanied by a narrator's bland voice-over, muttered from a sound system that rivaled the most expensive home theater.
I sat down in the concession area to study the floor map. An engineer next to me picked at a plate of underripe melon and shook his head in awe at the long lines of people waiting for coffee. "Four or five years ago it was just booths with concept posters pinned up," he said. "Now the actual stuff is here. It's amazing."
At the fringes of the exhibit hall, I wandered through the warrens of small companies and remote military arsenals squeezed side-by-side into 10x10 booths. I followed the screeching chords of thrash metal until I stood in front of a television playing a promotional video featuring a robot called Chaos. Chaos was built by Autonomous Solutions, a private company that had been spun out of Utah State University's robotics lab. In the video, it clambered over various types of terrain, its four flipper-like tracks chewing up dirt and rocks and tree bark. The real thing was somewhat less kinetic. A Chaos prototype lay motionless on the floor in front of the television. I nudged it with my foot and asked the company's young operations manager what it was designed to do.
"Kick the pants off the PackBot," he said, glancing around nervously. "No, I'm kidding."
A few booths down I encountered a group of men gathered around a robot the size of a paperback book. Apparently, it could climb walls by virtue of a powerful centrifuge in its belly. A picture showed it stuck to a building outside a second-story window, peering over the sill. But the rep holding the remote-control box kept ramming the robot into a cloth-draped wall at the back of his booth. The robot lost traction on the loose fabric and flipped over on its back, wheels spinning. A rep from the neighboring booth volunteered use of his filing cabinet. The little robot zipped across the floor, bumped the cabinet, and, with a soft whir, climbed straight up the side. When it got to the top it extended a metal stalk bearing a tiny camera and scanned the applauding crowd.
I continued along the perimeter, trying to avoid eye contact with the reps. Since it was the first day of the show, they were fresh and alert, rocking on their heels at the edges of their booths, their eyes darting from name badge to name badge in search of potential customers. I picked up an M4 carbine resting on a table in the Chatten Associates booth. The gun's grip had been modified to simulate a computer mouse. It had two rubber keys and a thumb stick for operating a miniature radio-controlled tank sporting an assault rifle in its turret.
"You'll need this," said Kent Massey, Chatten's chief operating officer. He removed a helmet from a mannequin's head and placed it on mine. Then he adjusted the heads-up display, a postage stamp-sized LCD screen that floated in front of my right eye. The idea behind the setup was that a soldier could simultaneously keep one eye on the battlefield while piloting the robot via a video feed beamed back to his heads-up display. He never had to take his finger off the trigger.
I blinked and saw a robot's-eye view of traffic cones arranged on a fluorescent green square of artificial turf. I turned my head first to the left, then to the right. The gimbal-mounted camera in the tank mimicked the motion, swiveling left, then right. I pushed the thumb stick on the carbine's pistol grip. The tank lurched forward, knocking down a cone.
"Try not to look at the robot," Kent advised.
I turned my back to him and faced the aisle. It was difficult for me to imagine how the soldier of the future would manage both the stress of combat and the information overload that plagues the average office worker. Simply driving the tank made me dizzy, despite Kent's claims that Chatten's head-aiming system increased "situational awareness" and "operational efficiency" by 400 percent. Then again, I wasn't Army material. I was too old, too analog. As a Boeing rep would later explain to me, they were "building systems for kids that are in the seventh and eighth grades right now. They get the PDAs, the digital things, cell phones, IM."
As I crashed the tank around the obstacle course, conventioneers stopped in the aisle to determine why I was pointing a machine gun at them. I aimed the muzzle at the floor.
"The one mission that you simply cannot do without us is armed reconnaissance," Kent said over my shoulder. "Poke around a corner, clear a house . . . We lost thirty-eight guys in Fallujah in exactly those kinds of circumstances, plus a couple hundred wounded. If [the robot] gets killed, there's no letter to write home."
Robots have always been associated with dehumanization and, more explicitly, humanity's extinction. The word "robot" is derived from the Czech word for forced labor, "robota," and first appeared in Karel Capek's 1920 play, R.U.R. (Rossum's Universal Robots), which ends with the destruction of mankind.
This view of robots, popularized in such movies as the Terminator series, troubles Cliff Hudson, who at the time coordinated robotics efforts for the Department of Defense. I ran into Cliff on the second day of the show, outside Carnegie Mellon's National Robotics Engineering Center's booth. Like the scientists in R.U.R., Cliff saw robots as a benign class of mechanized serfs. Military robots will handle most of "the three Ds: dull, dangerous, dirty-type tasks," he said, such as transporting supplies, guarding checkpoints, and sniffing for bombs. The more delicate task of killing would remain in human hands.
"I liken it to the military dog," Cliff said, and brought up a briefing given the previous day by an explosive-ordnance disposal (EOD) officer who had just returned from Iraq. The highlight of the briefing was an MTV-style video montage of robots disarming IEDs. It ended with a soldier walking away from the camera, silhouetted against golden evening sunlight, his loyal robot bumping along the road at his heels. Cliff pressed his hands together. "It's that partnership, it's that team approach," he said. "It's not going to replace the soldier. It's going to be an added capability and enhancer."
Adjacent to where we stood talking in the aisle was a prototype of the Gladiator, a six-wheeled armored car about the size of a golf cart, built by Carnegie Mellon engineers for the Marines. It was one mean enhancer. The prototype was equipped with a machine gun, but missiles could be attached to it as well.
"If you see concertina wire, you send this down range," Cliff said, maintaining his theme of man/robot cooperation. "And then the Marines can come up behind it. It's a great weapon." Despite its capabilities, the Gladiator hadn't won the complete trust of the Marines. "It's a little unstable," Cliff admitted. "Most people are uncomfortable around it when the safety is removed."
Reps proffering business cards began circling around Cliff and his entourage, sweeping me aside. Jorgen Pedersen, a young engineer with thin blond hair and a goatee, watched the scene with bemused detachment, his elbows propped on the Gladiator's turret. Jorgen had written the Gladiator's fire-control software.
"How safe is this thing?" I asked him.
"We wanted it to err on the side of safety first," Jorgen said. "You can always make something more unsafe." In the early stages of the Gladiator's development, Jorgen had discovered that its communications link wasn't reliable enough to allow machine-gun bursts longer than six seconds. After six seconds, the robot would stop firing. So he reprogrammed the fire-control system with a fail-safe.
"You may have great communications here," Jorgen said, touching the Gladiator with his fingertips. "But you take one step back and you're just on the hairy edge of where this thing can communicate well."
The integrity of data links between unmanned systems and their operators is a major concern. Satellite bandwidth, already in short supply, will be stretched even further as more robots and other sophisticated electronics, such as remote sensors, are committed to the battlefield. There's also the possibility that radio signals could be jammed or hijacked by the enemy. But these problems are inherent to the current generation of teleoperated machines: robots that are controlled by humans from afar. As robots become more autonomous, fulfilling missions according to pre-programmed instructions, maintaining constant contact with human operators will be unnecessary. I asked Jorgen if robots would someday replace soldiers on the battlefield. He reiterated the need for a man in the loop.
"Maybe that's because I'm short-sighted based on my current experiences," he said. "Maybe the only way that it could happen is if there's no other people out on that field doing battle. It's just robots battling robots. At that point, it doesn't matter. We all just turn on the TV to see who's winning."
It is almost certain that robot deployment will save lives, both military and civilian. And yet the prospect of robot-on-human warfare does present serious moral and ethical, if not strictly legal, issues. Robots invite no special consideration under the laws of armed conflict, which place the burden of responsibility on humans, not weapons systems. When a laser-guided bomb kills civilians, responsibility falls on everyone involved in the kill chain, from the pilot who dropped the bomb to the commander who ordered the strike. Robots will be treated no differently. It will become vastly more difficult, however, to assign responsibility for noncombatant deaths caused by mechanical or programming failures as robots are granted greater degrees of autonomy. In this sense, robots may prove similar to low-tech cluster bombs or land mines, munitions that "do something that they're not supposed to out of the control of those who deploy them, and in doing so cause unintended death and suffering," according to Michael Byers, professor of global politics and international law at the University of British Columbia.
The moral issues are perhaps similar to those arising from the use of precision-guided munitions (PGMs). There's no doubt that PGMs greatly limit civilian casualties and collateral damage to civilian infrastructure such as hospitals, electrical grids, and water systems. But because PGM strikes are more precise than dropping sticks of iron bombs from B-52s, the civilian casualties that often result from PGM strikes are considered necessary, if horribly unfortunate, mistakes. One need look no further than the PGM barrage that accompanied the ground invasion of Iraq in 2003. "Decapitation strikes" aimed at senior Iraqi leaders pounded neighborhoods from Baghdad to Basra. Due to poor intelligence, none of the fifty known strikes succeeded in finding their targets. In four of the strikes forty-two civilians were killed, including six members of a family who had the misfortune of living next door to Saddam Hussein's half brother.
It's not difficult to imagine a similar scenario involving robots instead of PGMs. A robot armed only with a machine gun enters a house known to harbor an insurgent leader. The robot opens fire and kills a woman and her two children instead. It's later discovered that the insurgent leader moved to a different location at the last minute. Put aside any mitigating factors that might prevent a situation like this from occurring and assume that the robot did exactly what it was programmed to do. Assume the commander behind the operation acted on the latest intelligence, and that he followed the laws of armed conflict to the letter. Although the deaths of the woman and children might not violate the laws of armed conflict, they fall into a moral black hole where no one, no human anyway, is directly responsible. Had the innocents of My Lai and Haditha been slain not by errant men but by errant machines, would we know the names of these places today?
More troubling than the compromised moral calculus with which we program our killing machines is how robots reduce even further the costs, both fiscal and human, of the choice to wage war. Robots do not have to be recruited, trained, fed, or paid extra for combat duty. When they are destroyed, there are no death benefits to disburse. Shipping them off to hostile lands doesn't require the expenditure of political capital either. There will be no grieving robot mothers pitching camp outside the president's ranch gates. Robots are, quite literally, an off-the-shelf war-fighting capability -- war in a can.
This bloodless vision of future combat was best captured by a billboard I saw at the exhibition, in the General Dynamics booth. The billboard was titled "Robots as Co-Combatants," and two scenes illustrated the concept in the garish style of toy-model-box art. One featured UGVs positioned on a slope near a grove of glossy palm trees. In the distance, a group of mud-brick buildings resembling a walled compound was set against a barren mountain range. Bright red parabolas traced the trajectories of mortar shells fired into the compound from UGVs, but there were no explosions, no smoke.
The other scene was composed in the gritty vernacular of television news footage from Iraq. A squad of soldiers trotted down the cracked sidewalk of a city street, past stained concrete facades and terraces awash in glaring sunlight. A small, wingless micro-UAV hovered above the soldiers amid a tangled nest of drooping telephone lines, projecting a cone of white light that suggested an invisible sensor beam. And smack in the foreground, a UGV had maneuvered into the street, guns blazing. In both scenes, the soldiers are incidental to the action. Some don't even carry rifles. They sit in front of computer screens, fingers tapping on keyboards.
On the last day of the show, I sat in the concession area, chewing a stale pastry and scanning the list of the day's technical sessions. Most were dry, tedious affairs with such titles as "The Emerging Challenge of Loitering Attack Missiles." One session hosted by Foster-Miller, the company that manufactures the SWORDS robot, got my attention: "Weaponization of Small Unmanned Ground Vehicles." I filled my coffee cup and hustled upstairs.
I took a seat near the front of the conference room just as the lights dimmed. Hunched behind a podium, a Foster-Miller engineer began reading verbatim from a PowerPoint presentation about the history of SWORDS, ending with a dreary bullet-point list cataloguing the past achievements of the TALON robot, SWORDS's immediate predecessor.
"TALON has been used in most major, major... " The engineer faltered.
"Conflicts," someone in the audience stage-whispered. I turned to see that it was Bob Quinn. He winked at me in acknowledgment.
"Conflicts," the engineer said. He ended his portion of the talk with the same video montage that had inspired Cliff Hudson to compare robots to dogs. TALON robots were shown pulling apart tangles of wire connected to IEDs, plucking at garbage bags that had been tossed on the sides of darkened roads, extracting mortar shells hidden inside Styrofoam cups. Bob Quinn took the podium just as the final shot in the montage, that of the soldier walking down the road with his faithful TALON robot at his heels, faded on the screen behind him. The lights came up.
"The 800-pound gorilla, or the bully in the playpen, for weaponized robotics -- for all ground-based robots -- is Hollywood," Bob said. The audience stirred. Bob strolled off the dais and stood in the aisle, hands in his pockets. "It's interesting that UAVs like the Predator can fire Hellfire missiles at will without a huge interest worldwide. But when you get into weaponization of ground vehicles, our soldiers, our safety community, our nation, our world, are not ready for autonomy. In fact, it's quite the opposite."
Bob remained in the aisle, narrating a series of PowerPoint slides and video clips that showed SWORDS firing rockets and machine guns, SWORDS riding atop a Stryker vehicle, SWORDS creeping up on a target and lobbing grenades at it. His point was simple: SWORDS was no killer robot, no Terminator. It was a capable weapons platform firmly in the control of the soldiers who operated it, nothing more. When the last video clip didn't load, Bob stalled for time.
"We've found that using Hollywood on Hollywood is a good strategy to overcome some of the concerns that aren't apparent with UAVs but are very apparent with UGVs," he said. Last February a crew from the History Channel had filmed SWORDS for an episode of Mail Call, a half-hour program hosted by the inimitable R. Lee Ermey, best known for his role as the profane drill sergeant in the movie Full Metal Jacket. Ermey's scowling face suddenly appeared onscreen, accompanied by jarring rock music.
"It's a lot smarter to send this robo-soldier down a blind alley than one of our flesh-and-blood warriors," Ermey shouted. "It was developed by our troops in the field, not some suit in an office back home!"
Ermey's antic mugging was interspersed with quick cutaways of SWORDS on a firing range and interviews with EOD soldiers.
"The next time you start thinking about telling the kids to put away that video game, think again!" Ermey screamed. He jabbed his finger into the camera. "Some day they could be using those same kinds of skills to run a robot that will save their bacon!"
"That's a good way to get off the stage," Bob said. He was smiling now, soaking in the applause. "I think armed robots will save soldiers' lives. It creates an unfair fight, and that's what we want. But they will be teleoperated. The more as a community we focus on that, given the Hollywood perceptions, the better off our soldiers will be."
Downstairs in the exhibit hall, I saw that Boeing had also learned the value of Hollywood-style marketing. I had stopped by the company's booth out of a sense of obligation more than curiosity: Boeing is the lead contractor for FCS. While I was talking to Stephen Bishop, the FCS business-development manager, I noticed a familiar face appear on the laptop screen behind him.
"Is that -- MacGyver?"
Stephen nodded and stepped aside so that I could get a better view of the laptop. The face did indeed belong to Richard Dean Anderson, former star of the television series MacGyver and now the star of a five-minute promotional film produced by Boeing. Judging by the digital special effects, the film probably cost more to make than what most companies had spent on their entire exhibits. Not coincidentally, the film is set in 2014, when the first generation of FCS vehicles is scheduled for full deployment. An American convoy approaches a bridge near a snowy mountain pass somewhere in Asia, perhaps North Korea. The enemy mobilizes to cut the Americans off, but they are detected and annihilated by armed ground vehicles and UAVs.
At the center of this networked firestorm is Richard Dean Anderson, who sits inside a command vehicle, furrowing his brow and tapping a computer touchscreen. As the American forces cross the bridge, a lone enemy soldier hiding behind a boulder fires a rocket at the lead vehicle and disables it. The attack falters.
"I do not have an ID on the shooter!" a technician yells. Anderson squints grimly at his computer screen. It's the moment of truth. Does he pull back and allow the enemy time to regroup, or does he advance across the bridge, exposing his forces to enemy fire? The rousing martial soundtrack goes quiet.
"Put a 'bot on the bridge," Anderson says.
A dune-buggy-like robot darts from the column of vehicles and stops in the middle of the bridge in a heroic act of self-sacrifice. The lone enemy soldier takes the bait and fires another missile, destroying the robot and unwittingly revealing his position to a micro-UAV loitering nearby. Billions of dollars and decades of scientific research come to bear on this moment, on one man hiding behind a snow-covered boulder. He is obliterated.
"Good job," Anderson sneers. "Now let's finish this."
The film ends as American tanks pour across the bridge into enemy territory. The digitally enhanced point of view pulls back to reveal the FCS network, layer by layer, vehicle by vehicle, eighteen systems in all, until it reaches space, the network's outer shell, where a spy satellite glides by.
"Saving soldiers' lives," Stephen said, glancing at his press manager to make sure he was on message. I commended the film's production values. Stephen seemed pleased that I'd noticed.
"Three-stars and four-stars gave it a standing ovation at the Pentagon last November," he told me.
"You can't argue with MacGyver," I said. "Because it's all about saving soldiers' lives," Stephen said.
"Works for congressmen, works for senators, works for the grandmother in Nebraska."
Later that summer I visited Picatinny Arsenal, "Home of American Firepower," in New Jersey, to see a live-fire demonstration of the SWORDS robot. SWORDS was conceived at Picatinny by a small group of EOD soldiers who wanted to find a less dangerous way to "put heat on a target" inside caves in Afghanistan. Three years later, SWORDS was undergoing some final tweaks at Picatinny before being sent to Aberdeen Proving Ground for its last round of safety tests. After that, it would be ready for deployment.
"As long as you don't break my rules you'll be fine," said Sergeant Jason Mero, motioning for us to gather around him. Sgt. Mero had participated in the initial invasion of Iraq, including the assault on Saddam International Airport. He had buzzed sandy brown hair, a compact build, and the brusque authority common to non-commissioned officers. He told us exactly where we could stand, where we could set up our cameras, and assured us that he was there to help us get what we needed. Other than the "very, very loud" report of the M240 machine gun, there was little to worry about.
"The robot's not going to suddenly pivot and start shooting everybody," he said, without a hint of irony.
A crew from the Discovery Networks' Military Channel dragged their gear onto the range. They were filming a special on "War-bots," and the producer was disappointed to learn that the SWORDS robot mounted with a formidable-looking M202 grenade launcher wasn't operable. He would have to make do with the less telegenic machine-gun variant. The producer, Jonathan Gruber, wore a canvas fishing hat with the brim pulled down to the black frames of his stylish eyeglasses. Jonathan gave stage directions to Sgt. Mero, who knelt in the gravel next to SWORDS and began describing how the loading process works.
"Sergeant, if you could just look to me," Jonathan prompted. "Good. So, is a misfeed common?"
"No, not with this weapon system," Sgt. Mero said. "It's very uncommon." "My questions are cut out," Jonathan said. "So if you could repeat my question in the answer? So, you know, 'Misfeeds are not common.'"
"Mis--" Sgt. Mero cleared his throat. His face turned red. "However, misfeeds are not common with the M240 bravo."
"Okay, great. I'm all set for now, thanks."
The firing range was scraped out of the bottom of a shallow gorge, surrounded on all sides by trees and exposed limestone. Turkey vultures circled above the ridge. The weedy ground was littered with spent shell casings and scraps of scorched metal. Fifty yards from where I sat, two human silhouettes were visible through shoulder-high weeds in front of a concrete trap filled with sand. Sgt. Mero hooked a cable to SWORDS's camera, then flipped a red switch on the control box. I felt the M240's muzzle blast on my face as SWORDS lurched backward on its tracks, spilling smoking shells on the ground.
A cloud of dust billowed behind the silhouettes. Sgt. Mero fired again, then again. With each burst, recoil pushed SWORDS backward, and Sgt. Mero, staring at the video image on the control box's LCD screen, readjusted his aim. I could hear servos whining. When Sgt. Mero finished the ammunition belt, he switched off SWORDS and led us downrange to the targets.
"So, um, Sergeant?" Jonathan said. "As soon as you see our camera you can just start talking."
"As you see, the M240--"
"And Sergeant?" Jonathan interrupted. "I don't think you have to scream. You can just speak in a normal voice. We're all close to you."
"The problem with a heavy machine gun is, obviously, there's going to be a lot of spray," Sgt. Mero said, bending down to pick up one of the silhouettes that had fallen in the weeds. "Our second guy over here that we actually knocked down -- he didn't get very many bullets, but he actually got hit pretty hard."
Through the weeds I spotted the SWORDS robot squatting in the dust. My heart skipped a beat. The machine gun was pointed straight at me. I'd watched Sgt. Mero deactivate SWORDS. I saw him disconnect the cables. And the machine gun's feed tray was empty. There wasn't the slightest chance of a misfire. My fear was irrational, but I still made a wide circle around the robot when it was time to leave.
Within our lifetime, robots will give us the ability to wage war without committing ourselves to the human cost of actually fighting a war. War will become a routine, a program. The great nineteenth-century military theorist Carl von Clausewitz understood that although war may have rational goals, the conduct of war is fundamentally irrational and unpredictable. Absent fear, war cannot be called war. A better name for it would be target practice.
Back on the firing line, Sgt. Mero booted up SWORDS and began running it around the range for the benefit of the cameras. It made a tinny, rattling noise as it rumbled over the rocks. A Discovery crewman waddled close behind it, holding his camera low to the ground. He stumbled over a clump of weeds, and for a second I thought he was going to fall on his face. But he regained his balance, took a breath, and ran to catch up with the robot.
"I think I'm good," Jonathan said after the driving demonstration. "Anything else you want to add about this?"
"Yeah," Sgt. Mero said, smiling wryly. "It kicks ass. It's awesome." In repentance for this brief moment of sarcasm, Sgt. Mero squared his shoulders, looked straight into the camera, and began speaking as if he were reading from cue cards. "These things are amazing," he said breathlessly. "They don't complain, like our regular soldiers do. They don't cry. They're not scared. This robot here has no fear, which is a good supplement to the United States Army."
"That's great," Jonathan said.
** Steve Featherstone is a writer and photographer in Syracuse, New York. His last article for Harper's Magazine, "The Line Is Hot," appeared in the December 2005 issue.

* Harper's Magazine, February 2007.
Illustrations by Travis Coburn