Wired for War: The Robotics Revolution and Conflict in the 21st Century


Table of Contents

Title Page
Copyright Page
Photo Insert


Children at War



Corporate Warriors:
The Rise of the Privatized Military Industry

Published by the Penguin Group
Penguin Group (USA) Inc., 375 Hudson Street, New York, New York 10014, U.S.A. Penguin
Group (Canada), 90 Eglinton Avenue East, Suite 700, Toronto, Ontario, Canada M4P 2Y3
(a division of Pearson Penguin Canada Inc.) Penguin Books Ltd, 80 Strand, London WC2R 0RL,
England Penguin Ireland, 25 St. Stephen’s Green, Dublin 2, Ireland (a division of Penguin
Books Ltd) Penguin Books Australia Ltd, 250 Camberwell Road, Camberwell, Victoria 3124,
Australia (a division of Pearson Australia Group Pty Ltd) Penguin Books India Pvt Ltd,
11 Community Centre, Panchsheel Park, New Delhi-110 017, India Penguin Group (NZ),
67 Apollo Drive, Rosedale, North Shore 0632, New Zealand (a division of Pearson
New Zealand Ltd) Penguin Books (South Africa) (Pty) Ltd, 24 Sturdee Avenue,
Rosebank, Johannesburg 2196, South Africa


Penguin Books Ltd, Registered Offices: 80 Strand, London WC2R 0RL, England


First published in 2009 by The Penguin Press,
a member of Penguin Group (USA) Inc.


Copyright © P. W. Singer, 2009

All rights reserved


ISBN: 9781440685972




Without limiting the rights under copyright reserved above, no part of this publication may be reproduced, stored in or introduced into a retrieval system, or transmitted, in any form or by any means (electronic, mechanical, photocopying, recording or otherwise), without the prior written permission of both the copyright owner and the above publisher of this book.


The scanning, uploading, and distribution of this book via the Internet or via any other means without the permission of the publisher is illegal and punishable by law. Please purchase only authorized electronic editions and do not participate in or encourage electronic piracy of copyrightable materials. Your support of the author’s rights is appreciated.

This is your last chance. After this, there is no turning back. You take the blue pill—the story ends, you wake up in your bed and believe whatever you want to believe. You take the red pill—you stay in Wonderland and I show you how deep the rabbit-hole goes. Remember that all I am offering is the truth. Nothing more.



—Larry and Andy Wachowski, The Matrix, 1999



Those people who think they know everything are a great annoyance to those of us who do.





Because robots are frakin’ cool.

That’s the short answer to why someone would spend four years researching and writing a book on new technologies and war. The long answer is a bit more complex.

As my family will surely attest, I was a bit of an odd kid. All kids develop their hobbies and even fixations, be it baseball cards or Barbie dolls. Indeed, I have yet to meet a six-year-old boy who did not have an encyclopedic knowledge of all things dinosaur. For me growing up, it was war. I could be more polite and say military history, but it was really just war. In saying the same about his childhood, the great historian John Keegan wrote, “It is not a phrase to be written, still less spoken with any complacency.” But it is true nonetheless.

Perhaps the reason lies in the fact that the generations before me had all served in the military. They left several lifetimes’ worth of artifacts hidden around the house for me to pilfer and play with, whether it was my dad’s old military medals and unit insignia, which I would take out and pin to my soccer jersey, or the model of the F-4 Phantom jet fighter that my uncle had flown over Vietnam, which I would run up and down the stairs on its missions to bomb Legoland.

But the greatest treasure trove of all was at my grandparents’ house. My grandfather passed away when I was six, too young to remember him as much more than the kindly man whom we would visit at the nursing home. But I think he may have influenced this aspect of me the most.

Chalmers Rankin Carr, forever just “Granddaddy” to me, was a U.S. Navy captain who served in World War II. Like all those from what we now call “the Greatest Generation,” he was one of the giants who saved the world. Almost every family gathering would include some tale from his or my grandmother’s (“Maw Maw” to us grandkids) experiences at war or on the home front.

It’s almost a cliché to say, but the one that stands out is the Pearl Harbor story; although, as with all things in my family, it comes with a twist. On December 7, 1941, my grandfather was serving in the Pacific Fleet on a navy transport ship. For three months after the Pearl Harbor attack, the family didn’t hear any word from him and feared the worst. When his ship finally came back to port (it had actually sailed out of Pearl Harbor just two days before the attack), he immediately called home to tell his wife (my grandmother) and the rest of his family that he was okay. There were only two problems: he had called collect, and that side of my family is Scotch-Irish. No one would accept the charges. While my grandfather cursed the phone operator’s ear off, in the way that only a sailor can, on the other end the family explained to the operator that since he was calling, he must be alive. So there was no reason to waste money on such a luxury as a long-distance phone call.

Granddaddy’s study was filled with volume after volume of great books, on everything from the history of the U.S. Navy to biographies of Civil War generals. I would often sneak off to this room, pull out one of the volumes, and lose myself in the past. These books shaped me then and stay with me now. One of my most prized possessions is an original-edition 1939 Jane’s Fighting Ships that my grandfather received as a gift from a Royal Navy officer, for being part of the crew that shipped a Lend-Lease destroyer to the Brits. As I type these very words, it peers down at me from the shelf above my computer.

My reading fare quickly diverged from that of the other kids at Myers Park Elementary School. A typical afternoon reading was less likely to be exploring how Encyclopedia Brown, Boy Detective, cracked The Case of the Missing Roller Skates than how Audie Murphy, the youngest soldier ever to win the Medal of Honor, went, as he wrote in his autobiography, To Hell and Back. War soon morphed over into the imaginary world that surrounds all kids like a bubble. Other kids went to Narnia, I went to Normandy. While it may have looked like a normal Diamondback dirt bike, my bicycle was the only one in the neighborhood that mounted twin .50-caliber machine guns on the handlebars, to shoot down any marauding Japanese Zeros that dared to ambush me on my way to school each morning. I still remember my mother yelling at me for digging a five-foot-deep foxhole in our backyard when I was ten years old. She clearly failed to understand the importance of setting up a proper line of defense.

I certainly can’t claim to have been a normal kid, but in my defense, you also have to remember the context. To be so focused on war was somewhat easier in that period. It was the Reagan era and the cold war had heated back up. The Russians wouldn’t come to our Olympics and we wouldn’t go to theirs, the military was cool again, and we had no questions about whether we were the good guys. Most important, as a young Patrick Swayze and Charlie Sheen taught us in Red Dawn, not only were the Commies poised to parachute right into our schools, but it was likely us kids who would have to beat them back.

What I find interesting, and a sign of the power of Hollywood’s marketing machine, is that usually some artifact from science fiction is in the background of these memories, intertwined with the history. For example, when I think back to my childhood bedroom, there are the model warships from my grandfather’s era lined up on display, but also Luke, Leia, Han, and Chewbacca peeking up from my Star Wars bedsheets.

As most of science fiction involved some good guy battling some bad guy in a world far, far away, the two memes of my fantasy world went together fairly well. In short, your author was the kind of little boy to whom a stick was not a mere piece of wood, but the makings of a machine gun or a lightsaber that could save the world from both Hitler and Darth Vader.


I look back on these memories with some embarrassment, but also guilt. Of course, even then, I knew that people die in war and many soldiers didn’t come home, but they were always only the buddy of the hero, oddly enough usually from Brooklyn in most World War II movies. The reality of war had no way of sinking in.

It was not until years later that I truly understood the human costs of war. I remember crossing a jury-rigged bridge into Mostar, a town in Bosnia that saw some of the worst fighting in the Yugoslav civil war. I was there as part of a fact-finding mission on the UN peacekeeping operation. Weeks of back-and-forth fighting had turned block after block of factories and apartments on the riverfront into a mass of hollowed-out hulks. The pictures of World War II’s Stalingrad in an old book on my grandfather’s shelf had sprung up to surround and encompass me. The books never had any smell other than dust, but here, even well after the battles, a burnt, fetid scent still hung in the air. Down the river were the remnants of an elegant 500-year-old bridge, which had been blasted to pieces by Serb artillery. The people, though, were the ones who drove it home. “Haunted” is the only adjective I can think of to describe the faces of the refugees.

The standout memory, though, was of a local provincial governor we met with. A man alleged to have orchestrated mass killing and ethnic cleansing campaigns for which he would soon after be indicted, he sat at an immense wooden desk, ominously framed by two nationalist paramilitary (and hence illegal) flags. But he banally talked about his plans to build up the tourism industry after the war. He explained that the war had destroyed many of the factories and cleaned out whole villages. So on the positive side, the rivers were now clear and teeming with fish. Forget the war crimes or the refugees, he argued, if only the United States and United Nations would wise up and give him money, the package tourists would be there in a matter of weeks.

This paradox between the “good” wars that I had fought in my youth and the seamy underside of war in the twenty-first century has since been the thread running through my writing. During that same trip, I met my first private military contractors, a set of former U.S. Army officers, who were working in Sarajevo for a private company. Their firm wasn’t selling widgets or even weapons, but rather the very military skills of the soldiers themselves. This contradiction between our ideal of military service and the reality of a booming new industry of private companies leasing out soldiers for hire became the subject of my first book, Corporate Warriors: The Rise of the Privatized Military Industry. During the research, I was struck by another breakdown of the traditional model of who was supposed to be at war. In West Africa, the main foes of these new private soldiers were rebel bands, mostly made up of children. Many of these tiny soldiers had been abducted from their schools and homes. For me as a child, war had merely been a matter of play; for these children, war was the only way to survive. My next book, Children at War, tried to tell their story, in a way that didn’t just tug at heartstrings, but also explained the causes and effects of child soldiers, such that we might finally act to end this terrible practice.

This contradiction of war as we imagine it to be, versus how it really is, isn’t just the matter of a young boy growing up and putting his lightsaber away. It is part of something bigger that has haunted humanity from its very start.

One of the original sins of our species is its inability to live at peace. From the very beginning of human history, conflicts over food, territory, riches, power, and prestige have been constant. The earliest forms of human organization were clans that first united for hunting, but soon also for fighting with other clans over the best hunting grounds. The story of the dawn of civilization is a story of war, as these clans transformed into larger tribes and then to city-states and empires. War was both a cause and effect of broader social change. From war sprung the very first specializations of labor, the resulting stratification into economic classes, and the creation of politics itself.

The result is that much of what is written in human history is simply a history of warfare. It is a history that often shames us. And it should. War is not merely human destruction, but the most extreme of horror and waste wrapped together. Our great religions view war as perhaps the ultimate transgression. In the Bible, for example, King David was prohibited from building his holy Temple, because, as God told him, “You are a warrior who has shed blood” (1 Chronicles 28). The ancient prophets’ ideal vision of the future is a time when we “will learn warfare no more” (Isaiah 2:4). As one religious scholar put it, “War is a sign of disobedience and sinfulness. War is not intended by God. All human beings are made in the image of God and they are precious and unique.”

The same disdain for war was held by our great intellectuals. Thucydides, the founder of both the study of history as well as the science of international relations, described war as a punishment springing from man’s hubris. It is our arrogance chastised. Two thousand years later, Freud similarly described it as emanating from our Thanatos, the part of our psyche that lives out evil.

Yet for such a supposed abomination, we sure do seem to be obsessed with war. From architecture to the arts, war’s horrors have fed the heights of human creativity. Many of our great works of literature, arts, and science either are inspired by war or are reactions to it, from the founding epics of literature like Gilgamesh and the Iliad to the great painters of surrealism to the very origins of the fields of chemistry and physics.

War, then, appears in many more guises than the waste of human destruction that we know it to be. War has been described as a testing ground for nobility, the only true place where man’s “arête” (excellence) could be won. In the Iliad, the master narrative for all of Western literature, for example, “fighting is where man will win glory.” From Herodotus to Hegel, war is described as a test of people’s vitality and even one culture’s way of life versus another. War is thus often portrayed in our great books as a teacher—a cruel teacher who reveals both our strengths and faults. Virtues are taught through stories of war from Homer to Shakespeare, while evils to avoid are drawn out by war in stories ranging from Aeschylus to Naipaul.

War is granted credit for all sorts of great social change. Democracy came from the phalanx and citizen rowers of the ancient Greeks, while the story of modern-day civil rights would not be the same without Rosie the Riveter or the African American soldiers of the Red Ball Express in World War II.

War, then, is depicted as immoral, yet humanity has always found out-clauses to explain its necessity and celebration. The same religions that see violence as a sin also licensed wars of crusade and jihad. And it is equally the case in politics. We repeatedly urge war as the means to either spread or defeat whatever ideology is in vogue at the time, be it enlightenment, imperialism, communism, fascism, democracy, or even simply “to end all wars.”

This paradox continues in American politics today. Avoidance of war has been a traditional tenet of our foreign policy. Yet we have been at war for most of our nation’s history and many of our greatest heroes are warriors. We are simultaneously leaders of weapons development, being the creator of the atomic bomb, and the founders of arms control, which seeks its ban.

We are repulsed by the idea of war, and yet entranced by it. In my mind, there are two core reasons for humankind’s almost obsessive-compulsive disorder. The first is that war brings out the most powerful emotions that define what it is to be human. Bravery, honor, love, leadership, pity, selflessness, comradeship, commitment, charity, sacrifice, hate, fear, and loss all find their definitive expressions in the fires of war. They reach their ultimate highs and lows, and, in so doing, war is almost addictive to human culture. As William James put it, “The horror is the fascination. War is the strong life; it is life in extremis.”

The other reason that war so consumes us is that for all humanity’s advancement, we just can’t seem to get away from it. After nearly every war, we cite the immense lessons we learned that will prevent that calamity from repeating itself. We say over and over, “Never again.” Yet the reality is “ever again.”


If humanity’s fascination comes down to how war reveals its best and worst qualities, this book comes from wrestling with a new contradiction in warfare that humanity finds itself hurtling toward. We embrace war, but don’t like to look to its future, including now one of the most fundamental changes ever in war.

Mine is a generation that, as one analyst put it, is “producing more history than it can consume.” With all the focus on orange alerts and Iraq, it is tough to take a step back and notice some of the tidal shifts we are living through. For example, in my lifetime, computers went from an oddity to omnipresent. I still remember when my dad first took me to the local science museum at the age of eight to see what a computer looked like. You could only communicate with it through an obtuse “Basic” language, a sort of evil technologic shorthand. As I recall, the only useful thing we could do with that early computer (I think it was a Texas Instruments) was design a smiley face made out of hundreds of the letter M, which we then printed on one of those old spool printers whose paper you had to tear off at the edges. Today, the last thing my wife does before she goes to bed is check her e-mail on a handheld computer linked up wirelessly to a shared global server, all the while brushing her teeth. In the blink of an eye for history, something revolutionary happened.

Computers seem overwhelming enough, but over the last few years I became more and more convinced that my generation was living through something perhaps even more momentous. From the robot vacuum cleaner that patrols my floors to the unmanned planes that my friends in the air force use to patrol Iraq, humanity has started to engineer technologies that are fundamentally different from all before. Our creations are now acting in and upon the world without us.

I certainly cannot claim to be the only one to see these changes. Bill Gates, the world’s richest man and perhaps most responsible for the spread of computers, for instance, describes robotics today as being where the computer industry was around 1980, poised to change the way we think about what technology can do for us. “As I look at the trends that are now starting to converge, I can envision a future in which robotic devices will become a nearly ubiquitous part of our day-to-day lives. . . . We may be on the verge of a new era, when the PC will get up off the desktop and allow us to see, hear, touch and manipulate objects in places where we are not physically present.”

By the end of 2007, a United Nations report found that there were 4.1 million robots around the world working in people’s homes (as vacuum cleaners and the like). That is, there were more robots than the entire human population of Ireland. The same study found that this “personal” robotics industry had a current market value around $17 billion. What is more important than the raw numbers is the trajectory of the growth. In 2004, the number of personal robots in the world was estimated at 2 million. By 2007, it had doubled. Another 7 million were expected to be bought by the end of 2008.

Looking forward, many see the numbers expanding at higher rates. By 2010, one technology research group predicts there will be 55.5 million personal robots in the world. This would be just the start. Indeed, in South Korea (human population 49 million), the Ministry of Information and Communication has announced plans to put a robot in every home by 2013. Here in the United States, it will likely take a little longer. One industry leader projects 2014 as the year by which 10 percent of the American population will have some form of personal robot in their household.

Robots are also showing up at work, from the more than forty-five hundred drones doing crop-dusting on Japanese farms to what many males would perhaps find the most disturbing example of mechanized outsourcing, the robot that handles security access at the offices of Victoria’s Secret. Indeed, assembly-line factory robotics is an $8 billion a year industry, growing at a 39 percent pace in the United States. This is not good news for everyone, of course, as it has put many blue-collar workers out of work, most notably at carmakers. Roughly one of every ten workers in automobile manufacturing is now a robot, and Toyota has announced a plan to eventually automate all its factories.

These trends project an industry that many analysts believe is poised for a breakout. Future Horizons, a technology research group based in Kent, England, describes how “the electronics industry is on the cusp of a robotics wave.” Many even think that by 2025, the robotics industry might rival the automobile and computer industries in both dollars and jobs. BusinessWeek summed up the future of the industry as “A Robotics Gold Mine.”

To put it another way, the robots that had once only populated my action figure collection were now becoming all too real. Science fiction appeared to be turning into science reality.


If robots were starting to appear in almost every aspect of life, I began to wonder how they would matter for war and politics. It was with some trepidation that I made such a seeming leap of reason. Indeed, people have long peered into the future and then gotten it completely and utterly wrong. My favorite example took place on October 9, 1903, when the New York Times predicted that “the flying machine which will really fly might be evolved by the combined and continuous efforts of mathematicians and mechanicians in from one million to ten million years.” That same day, two brothers who owned a bicycle shop in Ohio started assembling the very first airplane, which would fly just a few weeks later.

Similarly botched predictions frequently happen in the military field. General Giulio Douhet, the commander of Italy’s air force in World War I, is perhaps the most infamous. In 1921, he wrote a best-selling book called The Command of the Air, which argued that the invention of airplanes made all other parts of the military obsolete and unnecessary. Needless to say, this would be news both to my granddaddy, who sailed out to another world war just twenty years later, and to the soldiers slogging through the sand and dust of Iraq and Afghanistan today.

The result is yet another paradox. It is completely normal to look forward into the future in realms like science, business, or even the weather. But forecasts of the future and, even more important, serious explorations of the changes that might result from such a future are generally avoided in the study of war. People play it safe and the gatekeepers of the field often try to knock down anything that feels too unfamiliar.

My own first experience with this was when I began my research on private military firms. A senior professor thereupon informed me that I would do well to quit graduate school and instead “go become a screenwriter in Hollywood,” for thinking to waste his time on such a fiction as companies providing soldiers for hire. I still wonder how he squares this worldview with the 180,000 private military contractors now deployed in Iraq. A similar thing happened when I first presented my early research on the problem of child soldiers. A professor at Harvard University told me that she didn’t believe child soldiers existed and that I was “making it up.” Today there are some 300,000 children at war around the globe, fighting in three out of every four wars.

The irony is that while we accept change in other realms, we resist trying to research and understand change in the study of war. For example, the very real fear about what the environment will look like as far away as 2050 has driven individuals, governments, and companies alike to begin (belatedly) changing their practices. Yet we seem willing to stay oblivious to the changes that will come well before then for war, even though, just like the changes in global climate, we can already see the outlines of the transformation under way.

Each time I perused the Sharper Image catalog or read a report mentioning a drone taking out a terrorist camp in Afghanistan, I felt myself living at the time of the most important weapons development since the atomic bomb. One could even argue that the rise of these digital warriors is more significant, in that robotics alters not merely the lethality of war, but the very identity of who fights it. The end of humans’ monopoly on war surely seemed something momentous, which historians would talk about centuries from now, if humankind is so lucky to still be around.

Yet for something so seemingly important, no one was talking about it. Time and again, I was struck by this disconnect. For example, as I describe later in the book, I once went to a major conference in Washington, D.C., on “the revolution in military affairs.” The speakers included many of the most notable scholars in the field, as well as several key political and military leaders. And yet, over the course of several hours of pontificating on what was supposedly new and exciting in security issues today, not one mention was made of these new technologies, not even a single word.

Another time I arrived at the airport, facing a long flight but having forgotten a book to read on the plane. So I picked up one of those potboiler paperback novels at the bookstore. It turned out to be a courtroom drama about the suspicious murder of a beautiful scientist. About halfway through the flight, I read one of the characters describe the scientist’s work. “Genetics, nanotechnology, and robotics.... It has the capacity to replace the NBCs of the last century—nuclear, biological and chemical. In its own way the potential is much more insidious. There is always a downside. The other side of the coin of progress. Some people don’t want to take the chance. You can see why. The question is: How do you stop it? How do you put the genie of knowledge back in the bottle?”

And with that, I thought, a fictional character in a cheesy crime novel had just put more thought into the future of war than pretty much the entire set of real-world university political science departments, think tanks, and the foundations that fund them.

This lack of study began to disturb and fascinate me more and more. A failure to research, understand, and weigh the changes going on around us could only lead to bad results for our politics and policies. Yet some of the most important changes in the wars of today and tomorrow were either not talked about at all or merely tossed aside, as one military expert put it, to the categories of “science fiction and futurism.”

And that didn’t seem right. I also began to fear that while all this change was incredibly exciting, it was also somewhat terrifying. We seemed to be repeating past cycles of only dealing with a huge change after the fact, when the genie was already out of the bottle. And thus somewhere along the way from reading military books in my grandfather’s study to playing with lightsabers in the backyard, I decided that the serious questions that surround robotics in war and what happens when humankind’s monopoly over it is broken were worthy of study. Wired for War is the result.


When we look back at history, one thing that stands out is how the truly momentous events were often missed. When Gutenberg invented the printing press, no one held a parade. Likewise, when Hitler decided to give up his painting career, no one thought to convince him to give selling crappy watercolors just one more try, in lieu of a bid for world domination. This is all the more difficult in our present chaotic news environment of talk shows, blogs, webcasts, and so forth. As one writer put it, “The true watersheds in human affairs are seldom spotted amid the tumult of headlines broadcast on the hour.”

If my growing sense was that we are in the midst of something important, maybe even a revolution in warfare and technology that will literally transform human history, then my aim in the book’s research quickly became to capture that incredible moment. Imagine, I thought, if we had been able to wrestle with the great changes that atomic bombs brought to politics while they were being invented, rather than waiting to puzzle our way through their implications years later. Moreover, as I set off on my research path, I quickly learned that what was impossible in 1945 is possible now. This revolution is not occurring in secret desert test facilities, but playing out right in front of us.

My goal then became to write a book built on careful scholarship, resting on hard-core research, not speculation or exaggeration. I hoped to offer not only an entry point into this exciting and unnerving change, but also a 360-degree view of what was going on, a resource that could prove useful to leaders and public alike, both today and tomorrow. And yet I also hoped to bring readers the same sense of wonder and amazement that originally drove me on this journey.

If you haven’t noticed by now, this will be a book somewhat different from the normal look at either war or technology. It’s a product of who I am and the forces that shape me. I am the kid who played with Transformers who now consults for the military. I am a scholar who studied under Sam Huntington, one of the most distinguished political scientists of the twentieth century, and yet I am shamefully addicted to watching The Real World. The eloquence and brilliance of authors like John Keegan and Jared Diamond inspire me, and yet the writer I read most religiously is Bill Simmons, the irreverent “Sports Guy” columnist for ESPN, who blogs on the finer points of the NBA Draft and The Bachelor dating show.

Fortunately, the topic I am wrestling with is located where warfare, history, politics, science, business, technology, and popular culture all come together. So unlike most books on war or politics, this one isn’t aimed at just one audience. The issues of robotics and war are so compelling and important that people with all sorts of interests and backgrounds can and should dig into them. In other words, you will find references of both sorts. There will be references of the scholarly type, pointing to the sources of the data, and there will be references of the pop culture type, pointing to parallels and lessons in mass media.

As such, the research for this book involved a melding of methodologies. I spent nearly four years seeking out anything and everything useful that I could find on the topic, whatever the source. I checked out musty old history books that hadn’t left the library in years. I scoured the last twenty years of each U.S. military branch’s professional journals, printing table-high stacks of any article relevant to the topics of war, technology, leadership, and change. I searched the online archives of all the major technology journals. Indeed, I even spent several wonderful days cruising through the “Wookieepedia,” the Internet hub for all things Star Wars.

The comedian Stephen Colbert famously said, “I don’t trust books. They’re all fact, no heart.” So I also made sure to interview anyone I could find who brought an important or unique perspective to the issues. I sought out the ideas of robot scientists and weapons developers, professors and journalists, human rights activists and science fiction writers, as well as the men and women now using these new technologies to fight. Rank mattered less to me than what they could bring to the issue. I quizzed four-star generals and secretaries of the army, navy, and air force, as well as nineteen-year-old specialists at the bottom of the chain of command. I met with pilots of robotic drones, who had never left the United States, and special operations soldiers just back from missions in Iraq and Afghanistan. Robots have no one nationality, so my interviews also became quite international. I gathered the views of everyone from German army officers and an Indian news editor to a set of Iraqi insurgents. Where possible, these individuals are identified, but sometimes the interviewees chose not to be named, which I have respected in the citations.

Sometimes these interviews would take place in person and sometimes via e-mail or phone. The research took me from robot factories and military bases around the world to one interview with an Arab general, conducted from the backseat of his BMW 7 series luxury car, discussing robot strike scenarios as we tooled about town. A hotel conference room oddly enough turned out to be the most dangerous of all these research locales. While I was observing a meeting of robotics developers and their military counterparts, a rogue robot tried to run me over. A demonstration model brought in by one of the developers, the little bugger was programmed to patrol around the room but avoid contacting any humans. But it just kept on coming and coming and nearly broke my foot. In my mind, it makes me a bit like the nerdy version of John Connor, hunted down by a machine obviously sent to keep you from reading about our robotic future.


As you work your way through the chapters, then, you will likely note a few things. Some admittedly may be a bit different from the traditional book that originates at a public policy “think tank.”

The first is that there is a mix of traditional hard data (numbers, statistics, and the like) as well as a heavy dose of untraditional anecdotes. In turn, the chapters tend to weave together scores of “characters,” rather than following just one throughout the book. My research and interviews brought in an amazing and colorful cast of people at work in this field. In the translation of research to book, I didn’t want to lose this human side. The reason is that, ironically for a topic on nonhuman changes in war, the stories and personalities are what tell us the most about where we are today, as well as where we are heading tomorrow. Or as one scientist described it, “These robots are extensions of us.”

The use of vignettes, personalities, and anecdotes also has a methodological rationale. It is not just a more effective way of giving readers a true “feel” of what is going on and capturing our historic moment in time. It is also an echo of the strategy used by ethnographers, who collect individual stories and anecdotes to discern broader trends and conclusions. Indeed, so much of what I learned, as well as how we tend to communicate to each other, came via storytelling that it seems only fitting to share many of these stories in the context of the issues. After all, one story may be an anecdote, but a collection of them is data.

Second, the book deals with the future and thus has to be, in part, predictive or conceptual. As the earlier examples showed, this is no easy task. The prognostications of nonscientists often fail because they don’t pay close attention to what is technically feasible and what is not. In turn, scientists’ predictions tend to overstate the positive, especially when it comes to war. Franklin, Edison, Nobel, and Einstein, for example, all thought that their inventions would end war. They knew the science, but not the social science. Both groups tend to disregard how social change can intervene and intertwine with technology, yielding not one definite future, but rather many possible futures.

Researchers have found that these three problems can be diminished by relying on actual facts rather than hopes or fears, having a firm technical and social science footing, making conclusions built on sound reasoning, and being sure not to ignore the doubts of the skeptics. This book follows those lessons. For instance, you’ll read here only about technologies either operating now or already at the prototype stage. I steer clear of the imaginary ones fueled by Klingon power packs, dragon’s blood, or the hormones of teenage wizards.

Third, for a book supposedly on the future, there is a lot of history. My sense is not that history repeats itself, but that there are patterns and lessons that we can draw from, a key way to ground any look forward. There will be much change in the future of war, but also much continuity.

Fourth, nothing in this book is classified information. I only include what is available in the public domain. Of course, a few times in the course of the research I would ask some soldier or scientist about a secretive project or document and they would say, “How did you find out about that? I can’t even talk about it!” “Google” was all too often my answer, which says a lot both about security and about what AI search programs bring to modern research.

Fifth, the book makes many allusions to popular culture, not something you normally find in a research work on war, politics, or science. Some references are obvious and some are not (and thus the first reader to send a complete list of them to me at www.pwsinger.com will receive a signed copy of the book and a Burger King Transformers collectible). It is also, as far as I know, the first book to come out of a think tank with a recommended music playlist, designed to get into the vibe of the research results, also available on the Web site.

The reason for this different approach is not simply to break the mold, or rather mould, of scholarly style or to give heart attacks to the old guard with my generation’s manner of thinking and writing, even on important issues like war. Rather, as much as we pointy-headed scholars hate to admit, this is how people process information most efficiently. Humankind has long best understood and digested things that are new by flavoring them with stories of personal experience (“There was this one time, in band camp, where we ...”) as well as by allusions to what is already culturally familiar, especially icons, symbols, and metaphors (“It’s just like when ...”). And, whether we like it or not, our twenty-first-century folklore is that of the popular movies, TV shows, music, gadgets, and books that shaped us growing up.

You have now been dutifully warned of what may come. I hope you find the results of this journey simultaneously interesting, educational, and maybe even a bit scary. In other words: frakin’ cool.





We are building the bridge to the future while standing on it.





There was little to warn of the danger ahead. The Iraqi insurgent had laid his ambush with great cunning. Hidden along the side of the road, the bomb looked like any other piece of trash or scrap metal. American soldiers call these jury-rigged bombs “IEDs,” official shorthand for improvised explosive devices. The team hunting for the IED is called an Explosive Ordnance Disposal, or EOD, team; they are the military’s bomb squads.

Before Iraq, the EOD teams were not much valued by either the troops in the field or their senior leaders. They usually deployed to battlefields only after the fighting was done, to defuse any old weapons caches or unexploded ammunition that might be found. It was dangerous work, but not the kind that gained the EODs much acclaim. But in Iraq, the IED quickly became the insurgents’ primary way of lashing back at U.S. forces. In the first year of the fighting, there were 5,607 roadside bomb attacks. By 2006, the insurgents were averaging nearly 2,500 a month.

Cheap and easy to make, IEDs took a grievous toll, becoming the leading cause of casualties among American troops as well as Iraqi civilians. They also limited the ability of U.S. forces to move about safely and carry out their missions, such that the commanding general quickly determined that among all the myriad problems in Iraq, “IEDs are my number one threat.” In response, the Pentagon soon was spending more than $6.1 billion to counter IEDs in Iraq.

The EOD teams were tasked with defeating this threat, roving about the battlefield to find and defuse the IEDs before they could explode and kill. The teams went from afterthought to, as one journalist put it, “one of the most important assignments on the battlefield.” In a typical tour in Iraq, each team would go on more than six hundred IED calls, defusing or safely exploding about two IEDs a day. Perhaps the best sign of how critical the EOD teams became is that the insurgents began offering a rumored $50,000 bounty for killing an EOD team member.

Unfortunately, this particular mission would not end well. By the time the soldier had advanced close enough to see the telltale explosive wires of an IED, it was too late. There was no time to defuse the bomb and no time to escape. The IED erupted in a wave of flame.

Depending on how much explosive the insurgent has packed into an IED, a soldier must be as far as fifty yards away to escape death and even as much as half a mile away to escape injury from the blast and bomb fragments. Even if you are not hit, the pressure from the blast can break your limbs. This soldier, though, had been right on top of the bomb. Shards of metal shrapnel flew in every direction at bullet speed. As the flames and debris cleared and the rest of the team advanced, they found little left of the soldier. Hearts in their throats, they loaded the remains onto a helicopter, which took them back to the base camp near Baghdad International Airport.

Writing home after such an incident may be the toughest job for a leader. That night, the team’s commander, a U.S. Navy chief petty officer, did his sad duty. The effect of this explosion had been particularly tough on his unit. They had lost their most fearless and technically savvy soldier. More important, they had also lost a valued member of the team, a soldier who had saved the others’ lives many times over. The soldier had always taken the most dangerous roles, scouting ahead for IEDs and ambushes. Despite this, the other soldiers in the unit had never once heard a complaint.

This being a war in the age of instant communication, there was no knock on the door of some farmhouse in Iowa, as is always the case in the old war movies. Instead, the chief’s letter was sent via e-mail. In his condolences, the chief noted the soldier’s bravery and sacrifice. He apologized for his inability to change what had happened. But he also expressed his thanks and talked up the silver lining to the tragedy. As the chief wrote, “When a robot dies, you don’t have to write a letter to its mother.”


The destination of the e-mail was a gray, concrete, two-story office building located in a drab corporate park just outside Boston. On the corner of the building is the sign for the soldier’s maker, a company called iRobot.

The complex is an outgrowth of the Burlington Shopping Mall, so across the street are a Men’s Wearhouse discount suit store and a Macaroni Grill, a faux Italian restaurant chain known less for its pasta than the fact that it allows you to crayon on your tablecloth. It may seem an odd cradle for the future of war, but then again, no one standing outside a bicycle shop in Dayton, Ohio, some hundred years back thought, “Ah yes, this must be the home of a new era of package tourism, lost luggage, and strategic bombing.”

The inside of iRobot is just like any other office building, with drab colors on the walls and mind-numbing rows of cubicles filled with staff punching away at their keyboards. The difference at iRobot is that the obligatory corporate boardroom doubles as a small museum of robots and every so often a loud thump comes from a robot crashing into the wall. When I arrive for my visit, some of the employees are testing out a tracked robot, driving it down the hallway with a jury-rigged Xbox video game controller. Think Office Space crossed with Asimov.

iRobot was founded in 1990 by three MIT computer geeks, Colin Angle, the CEO, Helen Greiner, the chairman of the board, and Rodney Brooks, their former professor, who doubled as chief technical officer. Brooks was already considered one of the world’s leading experts on robotics and artificial intelligence, Greiner would eventually be named one of “America’s Best Leaders” by U.S. News & World Report, and Angle’s work would become so influential that his undergrad thesis paper ended up at the Smithsonian. iRobot, though, was no sure thing at the start. There was no real market for robots, the company’s first home was Angle’s living room, and their CEO’s only previous job had been as a summer camp counselor.

iRobot the company took its name from I, Robot the Isaac Asimov science fiction novel (later made into a Will Smith movie). Asimov laid out a vision in which humans of the future share the world with robots. His fictional robots not only carry out mundane chores but also make life-and-death decisions.

The real-world firm started out slowly with some small-scale government contracts and several attempts at robotic toys. Its first robot was Genghis, a tiny bot designed to scramble across the surfaces of other planets for NASA. On the toy front, it tried to sell a doll that laughed when tickled and a robot dinosaur, a velociraptor inspired by the movie Jurassic Park. None of their products made a splash. “We were the longest overnight success story ever,” said Greiner.

In time, iRobot developed two products that would make their mark on the world. The first is Roomba, the first mass-marketed robotic vacuum cleaner. Roomba is a disc-shaped robot thirteen inches in diameter and just over three inches high. It is basically a large Frisbee that roams about the floor, automatically vacuuming it clean. Roomba’s sensors figure out the size and shape of your room, and with the push of the “clean” button it goes to work. Indeed, Roomba is smart enough to avoid falling down the stairs and even knows how to return to its charger when the power is running low. Roomba actually evolved from Fetch, a robot that the company designed in 1997 for the U.S. Air Force. Fetch cleaned up cluster bomblets from airfields; Roomba cleans up dust bunnies under sofas. Released in 2002, Roomba became a media darling, appearing in everything from the Sharper Image catalog to the Today show, and soon was one of the most sought-after gadgets to give at Christmas.

iRobot’s other breakout product was PackBot, the “soldier” blown up by that IED in Iraq. PackBot came out of a contract from the Defense Advanced Research Projects Agency, or DARPA, in 1998. Weighing forty-two pounds and costing just under $150,000, PackBot is about the size of a lawn mower. It is typically controlled via remote control, although it can drive itself, including even backtracking to wherever it started its mission. PackBot moves using four “flippers,” essentially tank treads that can rotate on one axis. These allow PackBot not only to roll forward and backward like regular tank tracks, but also to flip its tracks up and down to climb stairs, rumble over rocks, squeeze down twisting tunnels, and even swim in under six feet of water. The tracks are made of a hard rubberlike polymer that iRobot patented. They are specially designed to be used on any surface, ranging from the mud of a battlefield to the tiled floor of an office building.

The designers at iRobot look at their robots as “platforms.” PackBot has eight separate payload bays and hookups that allow its users to swap in whatever they need: mine detector, chemical and biological weapons sensor, or just extra power packs. The EOD version of the PackBot that served in Iraq comes with an extendable arm on top that mounts both a head, containing a high-powered zoom camera, and a clawlike gripper. Soldiers use these to drive up to IEDs, peer at them closely, and then, using the gripper, disassemble the bomb, all from a safe distance.

PackBot made its operational debut on the fateful day of September 11, 2001. With all air traffic grounded after the destruction of the World Trade Center, engineers from iRobot loaded their robots into cars and drove down to help in the rescue and recovery efforts at Ground Zero. A New York Times article entitled “Agile in a Crisis, Robots Show Their Mettle” described them as “rescuers [that] are unaffected by the carnage, dust and smoke that envelop the remains of the World Trade Center. They are immune to the fatigue and heartbreak that hang in the air.”

Soon after, PackBot went to war. As U.S. forces deployed to Afghanistan, troops came across massive cave complexes that had to be scouted out, but were often booby-trapped. The only specialized tools the troops had were flashlights, and they had to crawl through the caves on hands and knees. Usually, the GIs would send their local Afghan allies down into the caves first, but as one soldier put it, “We began to run out of Afghans.” iRobot was then asked by the Pentagon to send help. Just six weeks later, PackBots made their debut in a cave complex near the village of Nazaraht, in the heart of Taliban territory. iRobot was now at war.

With both the Roomba and PackBot becoming hits (the first test robots sent to Afghanistan were so popular with the troops, they wouldn’t let the company take them back), the business that had started in a living room took off. In the next five years, the company’s revenue and profits grew by a factor of ten. By 2007, more than three million Roombas had been sold at over seven thousand retail stores. On the military side, the war robot business grew by as much as 60 percent a year, culminating in a $286 million Pentagon contract in 2008 to supply as many as three thousand more machines. The PackBot was in such demand that the space reserved for it in iRobot’s museum was empty when I visited the offices. The display model had been deployed to Iraq.

With these successes under its belt, iRobot was ready for the big time. It entered the stock market, with its IPO underwritten by two of the most prestigious investment houses in the world, Morgan Stanley and J.P. Morgan. On the first day of trading, iRobot’s public value hit $620 million. At the market’s close, a PackBot rang the bell at the New York Stock Exchange, the first robot ever to do so.


iRobot’s business model splits its sales effort between a consumer division that targets robots for the home and a government and industrial robots division that mainly targets the military. The military business currently makes up about a third of revenue, but market analysts are “really excited by it” and predict it will soon become about half the company’s revenue. iRobot also has a vibrant research team led by Andrew Bennett, who was part of the team that raced to New York on 9/11. This group lays the groundwork for future advances, and has some fifty patents either approved or pending.

This split in iRobot’s customer base can make for some amusement. iRobot may be the only company that sells at both Pentagon trade shows and Linens ’n Things. In the customer testimonials section of its Web site, the chief’s letter about his robot in Iraq is just below one from “Janine,” a housewife from Connecticut. While he talked about how his robot saved lives in battle, she thanked the company because “I have four boys and two cats and this little ‘robot’ keeps my rugs and hardwood floors dirt and hair free!”

The firm plans to continue to advance the frontiers of cleaning floors and fighting wars. It has followed up the Roomba with Scooba, which washes and scrubs floors, and the Dirt Dog, a heavy-duty cleaner designed for sucking up nuts and bolts in workshops and off factory floors. The online advertisements for the robots tell potential buyers, “You’ve done enough; leave the cleaning to a robot.”

“One of our challenges is getting people who aren’t familiar with the product and who haven’t really thought about robots being real before to give it a shot,” says Colin Angle, iRobot’s CEO. “This is all very new stuff. We’re continually trying to find new ways of helping people get over the skepticism to really imagine how robots in their lives could really be helpful.” Indeed, with less than 1 percent of U.S. households owning cleaning robots, “the demographics of our purchasers suggest we’re just scratching the surface of what’s possible,” says Angle. iRobot recently launched a multimillion-dollar advertising campaign called “I Love Robots” that shows people talking about their robots and the work they do.

On the military side, iRobot has similar dreams of growth. It has new and improved versions of the PackBot as well as a host of plans to convert any type of vehicle into a robot, be it a car or ship, using a universal control unit that you plug into the engine and steering wheel. One new robot that iRobot’s designers are especially excited to show off is the Warrior. Weighing about 250 pounds, the Warrior is essentially a PackBot on steroids. It has the same basic design, but is about five times bigger. Warrior can sustain a four-minute-mile pace for five hours, while carrying 100 pounds. Yet it is agile enough to fit through a doorway and go up stairs. iRobot built the robot, even though, as one designer put it, “there are no clear buyers yet ... we don’t know yet just who will use it.” The firm is essentially using the Field of Dreams model: if they build it, the buyers will come.

Warrior is really just a mobile platform, with a USB port on top. USB ports are the universal connectors used to plug anything into a computer, from your mouse to a printer. With the USB on Warrior, users can hook up whatever they want to their robot, whether it be sensors, a gun and a TV camera for battle, or an iPod and loudspeakers for a mobile rave party. The long-term strategy then is that other companies will focus on the plug-in market, while iRobot corners the market for the robotic platforms. What Microsoft did for the software industry, iRobot hopes to do for the robotics industry.

With this kind of grand vision, its rapid growth, and immense financial backing, iRobot may well be on its way to becoming the Ford or GE of the twenty-first century. Indeed, Asimov’s book that inspired its name tells the fictional history of a small company, “U.S. Robotics,” that becomes the largest corporation in the world within fifty years of its founding.

It sure sounds exciting, but also comes with a catch. iRobot, the company, may well be ignoring the warnings of I, Robot the book. Isaac Asimov is remembered not merely for his vision of the future, but also for his “Three Laws of Robotics” that supposedly guided robots’ development in his fictional world. The laws are so simple, yet so complex in their implications, that ethicists now teach them at colleges in the real world. Asimov’s first and most fundamental law is: “A robot may not harm a human being, or, through inaction, allow a human being to come to harm.”

It is hard to square the fictional rules with the present reality of a company at war. Some argue that Asimov would definitely not approve of the latest plug-in accessory for the PackBot, a shotgun. The folks at the company think such people are missing the point; the firm is leading a thrilling technological revolution. When Helen Greiner is asked how Asimov might react to iRobot, she responds, “I think he would think it’s cool as hell.”


Just a twenty-minute drive from iRobot’s offices outside the Burlington Mall is an old industrial park in Waltham, Massachusetts. Here, in a complex of brown concrete-block buildings dating back to the 1950s, is the headquarters of the Foster-Miller company.

Like iRobot, Foster-Miller was founded by MIT graduates. Eugene Foster and Al Miller were engineers who shared an office at MIT and did consulting work on the side. After graduating, Al Miller left, never to be heard from again. His office replacements were Charles Kojabashian and Edward Nahikian. Foreign-sounding last names weren’t a big selling point in 1955, and so the trio continued the work under the name of Foster-Miller Associates. A year later, Foster-Miller opened its shops in Waltham.

Foster-Miller makes the PackBot’s primary competitor, the Talon, which first hit the market in 2000. The Talon looks like a small tank, driven by two treads that run its length. Weighing just over a hundred pounds, it is a bit bigger than the PackBot. It too has a retractable arm with a gripper, but mounts its main sensors on a separate antennalike “mast” sticking up from the body and carrying a zoom camera. The Talon can reach speeds of about 5.5 mph, the equivalent of a decent jog on a treadmill, a pace it can maintain for five hours.

Like the PackBot, the Talon helped sift through the wreckage at Ground Zero and soon after deployed to Afghanistan. And like iRobot, Foster-Miller has boomed, doubling the number of robots it sells every year for the last four years. The company received an initial $65 million in orders for Talons in the first two years of the insurgency in Iraq. By 2008, there were close to two thousand Talons in the field and the firm had won a $400 million contract to supply another two thousand. Under an additional $20 million repair and spare parts contract, the company also operates a “robot hospital” in Baghdad. Foster-Miller now makes some fifty to sixty Talons a month, and repairs another hundred damaged systems.

The similarities between the two firms end there. iRobot was started by researchers, focused on invention. iRobot’s facilities are mostly cubicles in a large office building, as it outsources much of the manufacture of its robots to factories in the Midwest and China. In its lobby, the name of each visitor is displayed on a flat-screen television mounted on the wall of the reception area.

Foster-Miller was founded by engineers, focused on the practical end. Foster-Miller’s headquarters are a complex of more than two hundred thousand square feet of offices, labs, and machine shops. It makes most of its products on site. In its lobby, visitors are greeted by an elderly receptionist, who then announces their arrival on one of those old intercom microphones that I last saw at my elementary school.

In the back of the Foster-Miller complex is a large warehouse that you enter from an employee parking lot. In the corner, men tinker at various high tables loaded with machinery. A large American flag hangs from the ceiling. It all looks like Santa’s workshop for robots, crossed with a car company advertisement.

When I toured Foster-Miller’s shop in the autumn of 2006, more than twenty-five Talons were lined up on the floor. In one row were shiny new Talons ready for shipment to Iraq. In a second row were dinged-up robots back from the war for repair, their arms a bit mangled and burn scars on various parts. I noticed some scorched paper stuck to one of the bots. Edward Godere, a vice president at Foster-Miller, explained, “The soldiers have started taping Playboy centerfolds to the side of the robots. It’s the twenty-first-century version of the pin-up art on the bomber planes during World War II.”

The two companies also see the world quite differently. iRobot, as its research team leader Andrew Bennett puts it, “is all about robotics.” Still a research firm at heart, it has little interest in other industrial sectors and turns down opportunities that it sees as “boring.” As one of the researchers put it, “We don’t build Buicks.”

Helen Greiner goes even further. “These robots are on a mission and so are we: to bring robots into the mainstream. . . . We can make robots to do a better job than humans in some cases.” The result is a fairly unique corporate mission statement: “Have Fun, Make Money, Build Cool Stuff, Deliver a Great Product, and Change the World.”

Whereas iRobot just works on robotics, Foster-Miller makes everything from armor for tanks to air conditioners for gold mines. At Foster-Miller, the motto is, “We engineer ideas into reality,” and there is no interest in trying to change the world via inventions. For example, Foster-Miller’s Talon gets its night vision from simply slipping a soldier’s night vision goggles over the robot’s camera and drives on tank treads that originally came from a snowmobile. By contrast, iRobot’s PackBot drives on specially designed tracks that took nine months to develop and originally came with so much artificial intelligence software that the army actually asked iRobot to make PackBot dumber by stripping out some of the programs.

Foster-Miller is also noticeably more comfortable in its relationship with the Pentagon than iRobot seems to be. It is, as its vice president Bob Quinn puts it, “a defense firm at heart,” with roughly 90 percent of its business defense and security related. Or, as one Foster-Miller executive put it frankly, “We’re industrialists looking for needs to meet. You gotta follow the money.”

As the market takes off, however, Foster-Miller is finding more and more of its business identity in the robotics sector. Moreover, that robotics work is bleeding over into its other jobs. For example, the company has a long history of engineering for the navy. The navy wants to reduce the number of personnel on its ships because one fewer sailor on board saves $150,000 a year in operating costs. So Foster-Miller came up with a design for an “automated galley” that would go into the latest warship. The system starts with sailors ordering their meal in advance by computer. A management system then allocates the food onboard to their preferences and the meal is transferred from storage by a robot. It is then cooked largely by automation and sent down via a “hot food” robot to a service station, where each sailor picks it up.

In looking at the potential futures of the two firms, it is also notable that they have vastly different ownership structures. iRobot is publicly held, meaning anybody with an online account can buy a slice of its future. Many believe this will drive it more and more toward widening its role in consumer products to balance out the defense growth. By contrast, Foster-Miller is privately held and expresses no interest in consumer robotics. Indeed, in 2004 it was bought by QinetiQ for $163 million. QinetiQ is a multibillion-dollar partnership between the Defence Evaluation and Research Agency (the British government’s defense labs, which were privatized in 2001) and the Carlyle Group.

Carlyle is one of those quietly influential firms that conspiracy theorists love. It is the only large private equity firm located in Washington, D.C., and oversees some $44 billion in equity capital. Its members and advisers include former secretary of state James A. Baker III, former secretary of defense Frank C. Carlucci (who was also the college wrestling partner of then secretary of defense Donald Rumsfeld), former White House budget chief Richard Darman, former British prime minister John Major, and former president George H. W. Bush. This “who’s who” is certainly enough fodder for the conspiracy theorists. Raising eyebrows even further, the bin Laden family was one of the Carlyle Group’s investors. They made out pretty well; the Wall Street Journal reported the family got 40 percent annual returns on its investments in Carlyle. In one of those stranger-than-fiction moments, on the very morning the hijacked planes smashed into the World Trade Center, the Carlyle Group was holding its annual investor conference, with Shafiq bin Laden, the brother of Osama bin Laden, in attendance.

The two companies feel a keen sense of competition with each other, and given how close together they sit, tensions are certainly there. At iRobot, researchers describe their rivals as thinking, “We hear that robots are trendy, so let’s do that.” At Foster-Miller, they retort, “We don’t just do robots and we don’t suck dirt.”

The two companies have even become locked in a bit of a marketing war. If robots were pickup trucks, Foster-Miller would represent the Ford model, stressing how the Talon is “Built Tough.” Its promotional materials describe the Talon as “The Soldier’s Choice.” They repeatedly mention its ruggedness, and even make a point to highlight an e-mail from a marine in Iraq, who wrote of his unit’s Talon, “I wouldn’t use anything else over here.”

The executives at Foster-Miller love to recount tales of how the Talon has proven it “can take a punch and stay in the fight.” One Talon was riding in the back of a Humvee while the truck was crossing a bridge. The unit was ambushed and an explosion blew the Talon into the river. After the battle ended, the soldiers found the damaged control unit and drove the Talon right out of the river. Another Talon serving with the marines was once hit by three rounds from a .50-caliber heavy machine gun (meaning the robot was actually a victim of friendly fire), but still kept working. The repair facility in Waltham has even worked on one Talon that was blown up on three separate occasions, each time just giving it new arms and cameras.

The iRobot team bristles at the idea that their systems are “agile but fragile.” They insist that the PackBot is tough too, but being more science-oriented, cite various statistics on how it can survive a 400 g-force hit, what they describe as the equivalent of being tossed out of a hovering helicopter onto a concrete floor. They are most proud of the fact that their robots have a 95 percent out-of-the-box reliability rate, higher than any others in the marketplace, meaning that when the soldiers get them in the field, they can trust the robot will work as designed.

Beneath all the differences and rancor that divide the companies, they are similar in one telling way. The hallways and cubicles of both their offices are covered with pictures and thank-you letters from soldiers in the field. A typical note from an EOD soldier reads, “This little guy saved our butts on many occasions.”


For all its talk of eschewing new inventions in favor of simple solutions, Foster-Miller is where matters get even more revolutionary. Just down from the Talon repair workshop sits what Time magazine called one of the “most amazing inventions of the year.” In technology circles, new products that change the rules of the game, such as what the iPod did to portable music players, are called “killer applications.” Foster-Miller’s new product gives this phrase a literal meaning.

Like the PackBot, the Talon comes in all sorts of different versions, including EOD, reconnaissance, and a hazmat (hazardous materials) robot. The real killer app, though, is its SWORDS version. This robot’s name comes from the acronym for Special Weapons Observation Reconnaissance Detection System. SWORDS is the first armed robot designed to roam the battlefield.

The SWORDS is basically the Talon’s pissed-off big brother, with its gripping arm replaced with a gun mount. Akin to a Transformers toy made just for soldiers, SWORDS is armed with the user’s choice of weaponry. The robot’s mount can carry pretty much any weapon that weighs under three hundred pounds, ranging from an M-16 rifle and .50-caliber machine gun to a 40mm grenade launcher or an antitank rocket launcher. In less than a minute, the human soldier flips two levers and locks his favorite weapon into the mount. The SWORDS can’t reload itself, but it can carry two hundred rounds of ammunition for the light machine guns, three hundred rounds for the heavy machine guns, six grenades, or four rockets. One report on SWORDS declares that “with this increased firepower, soldiers and their robots will be able to wreak absolute havoc on the battlefield.”

Unlike the PackBot, SWORDS has very limited intelligence on its own, and is remote-controlled from afar by either radio or a spooled-out fiber optic wire. The control unit comes in a suitcase that weighs about thirty pounds. It opens up to reveal a video screen, a handful of buttons, and two joysticks that the soldier uses to drive the SWORDS and fire its weapons. At the time of my visit, Foster-Miller was exploring replacing the controller with a Nintendo Game Boy-style controller, hooked up to virtual reality goggles.

The operator sees what SWORDS sees through five cameras mounted on the robot: a target acquisition scope linked to the weapon, a 360-degree camera that can pan and tilt, a wide-angle zoom camera mounted on the mast, as well as front and rear drive cameras. With these various views, the operator can not only see as if they had eyes in the back of their head, but can also see farther than was previously possible when shooting a gun. As one soldier put it, “You can read people’s nametags at 300 to 400 meters, whereas the human eye can’t pick that up. You can see the expression on his face, what weapons he is carrying. You can even see if his [weapon’s] selector lever is on fire or on safe.” The cameras can also see in night vision, meaning the enemy can be fired on at any hour and in any climate. This capability has gained added appeal in current operations; during the 2003 invasion of Iraq, three days of sandstorms shut down U.S. forces.

The inspiration for the SWORDS is generally credited not to a scientist, but to a soldier, army sergeant first class David Platt. Platt first used the Talon while sifting through the wreckage at the World Trade Center and later at EOD tasks. His thinking behind giving the robot a gun was fairly straightforward: “It’s small. It’s quiet, and it goes where people don’t want to be.”

In keeping with Foster-Miller’s philosophy, converting the Talon to SWORDS was a “bootstrap development process.” It only took six months and less than $3 million to make the first prototype. As one of the developers put it, “It is important to stress that not everything has to be super high tech. You can integrate existing componentry and create a revolutionary capability.” Guided by that ethic, these lethal little gunslingers cost just $230,000.

Napoleon once said, “There are but two powers in the world, the sword and the mind. In the long run, the sword is always beaten by the mind.” The invention of the SWORDS might one day invalidate his statement. In an early test of its guns, the robot hit the bull’s-eye of a target in seventy out of seventy tries. In a test of its rockets, it hit the target sixty-two out of sixty-two times. In a test of its antitank rockets, it hit the target sixteen out of sixteen times. A former navy sniper summed up its “pinpoint precision” as “nasty.”

The robot’s zoom lens not only extends the shooter’s sight, but matches it exactly to the weapon’s. Rather than trying to align their eyes in exact symmetry with the gun in their hand, it is as if the soldier’s eagle eye was the gun. The weapon also isn’t cradled in the soldier’s arms, moving slightly with each breath or heartbeat. Instead, it is locked into a stable platform. As army staff sergeant Santiago Tordillos says, “It eliminates the majority of shooting errors you would have.”

The robot can be set to fire either one bullet at a time or in bursts of eight bullets. Since it is a precisely timed machine pulling the trigger, the “one shot” mode means that any weapon, even a machine gun, can be turned into a sniper rifle. Finally, it makes no difference to the robot whether it is at the shooting range or in the middle of a firefight; the situation does not affect its accuracy. “The SWORDS doesn’t care when it’s being shot at. Indeed, it would like you to shoot at it,” says Sergeant Platt. “That way we can identify you as a valid target and engage you properly.”

The arming of SWORDS has opened up a host of new roles for robotic systems on the battlefield beyond just bomb disposal. Missions so far for what Fox News called the “G.I. of the 21st century” include street patrols, reconnaissance, sniping, checkpoint security, as well as guarding observation posts. It is especially well suited to urban warfare jobs, such as going first into buildings and alleyways where insurgents might hide. SWORDS’s inhuman capabilities could well result in even more intrepid missions. For example, the robot can drive through snow and sand and even drive underwater down to depths of one hundred feet, meaning it could pop up in quite unexpected places. Likewise, its battery allows it to be hidden somewhere in “sleep” mode for at least seven days and then wake up to shoot away at any foes. Described one report of the robotic gunner, “They have been a hit with the soldiers.”


The PackBot, Talon, and SWORDS are only a few of the many new unmanned systems that are operating in war today. When U.S. forces went into Iraq, the original invasion had zero robotic systems on the ground. By the end of 2004, the number was up to 150. By the end of 2005, it was up to 2,400. By the end of 2006, it had passed the 5,000 mark and was still growing. It was projected to reach as high as 12,000 by the end of 2008.

The unmanned systems roaming about Iraq come in all sorts of shapes and sizes. One of the smallest, but most commonly used, is the MARCBOT (MultiFunction Agile Remote-Controlled Robot). MARCBOT looks like a toy truck with a video camera mounted on a tiny antennalike mast. Costing only $5,000, the tiny bot is used to scout out where the enemy might be and also to drive under cars and search for hidden explosives. Many soldiers grew up driving remote-controlled cars, so it typically takes less than an hour to learn how to use the system. MARCBOT isn’t just notable for its small size. The little truck actually drew first blood on the battlefield, even before SWORDS. One unit of soldiers jury-rigged their MARCBOTs to carry a Claymore antipersonnel mine. Whenever they thought an insurgent was hiding in an alley, they would send a MARCBOT down first, not just to scout out the ambush, but to take the insurgent out with the Claymore. Of course, each insurgent found meant $5,000 worth of blown-up robot parts, but so far the army hasn’t billed the soldiers.

All told as of 2008, some twenty-two different robot systems were operating on the ground in Iraq. As one retired army officer put it, “The Army of the Grand Robotic is taking place.”

The world of unmanned systems at war doesn’t end at ground level. They have also taken to the air. One of the most notable is the Predator. The Predator is a UAV (unmanned aerial vehicle), or drone, that “looks like a baby plane.” At twenty-seven feet in length, it is just a bit smaller than a Cessna, and is powered by a “pusher” propeller in the back. Unlike most planes, the Predator lacks a cockpit and its tail wings are canted downward, instead of the normal sideways; one observer even said it looked like “a flying meat fork.” Since it is made of composite materials instead of metals, the Predator weighs just 1,130 pounds. Perhaps its best quality is that it can spend some twenty-four hours in the air, flying at heights of up to twenty-six thousand feet.

Each Predator costs just under $4.5 million, which sounds like a lot until you compare it to the cost of other military planes. Indeed, for the price of one F-22, the air force’s latest jet, you can buy eighty-five Predators. More important, the low price and lack of a human pilot means that the Predator can be used for missions where it might be shot down, such as traveling low and slow over enemy territory. About a quarter of the cost of the Predator actually goes into the “Ball,” a round mounting under the nose of the drone. The rotating Ball carries two variable-aperture TV cameras, one for seeing during the day and an infrared one for night, as well as a synthetic-aperture radar that allows the Predator to peer through clouds, smoke, or dust. The exact capabilities of the system are classified, but soldiers say they can read a license plate from two miles up. It also carries a laser designator to lock on to any targets that the cameras and radar pick up.

Predators are flown by what are called “reachback” or “remote-split” operations. While the drone flies out of bases in the war zone, the pilot and sensor operator for the plane are physically located seventy-five hundred miles away, connected with the drone plane only via satellite communications. Their control panels look a bit like one of the 1980s’ two-player video games you used to see at arcades, each sitting behind three TV screens (one screen has a video feed of what the drone is seeing, one displays technical data, and the third is the navigation map, akin to the GPS display in a car).

The Predator has thus introduced not only tactical but also organizational changes in the units that use them. The mechanics and ground crew go with the plane to the battle zone, usually an “undisclosed location,” shorthand for a base in an allied state in the Persian Gulf. The pilots flying the planes remain in the United States, working out of a set of converted single-wide trailers. Most of these trailer parks for robots are located at Nellis and Creech air force bases, just outside of Las Vegas and Indian Springs, Nevada. But as trailer parks tend to do, these drone bases are multiplying. There are plans to start up Predator operations at bases in Arizona, California, New York, North Dakota, and Texas.

Predators originally were designed for reconnaissance and surveillance, flying over enemy territory to scout for targets and monitor the situation. The prototypes were first used in the Balkan wars, but the drones truly came into their own after 9/11. Indeed, in the first two months of operations in Afghanistan, some 525 targets were laser-designated by Predators. The generals, who had once had no time for such systems, couldn’t get enough of them. Tommy Franks, the commander of all U.S. forces in the region at the time, declared, “The Predator is my most capable sensor in hunting down and killing Al Qaeda and Taliban leadership and is proving critical to our fight.”

“Our major role is to sanitize the battlefield,” says Senior Airman Medric Jones. “We ... make sure our own guys aren’t walking into danger.” A typical reconnaissance operation in Iraq involves the Predator circling over a city like Baghdad from five miles up. The pilots communicate with commanders, flight coordinators, intelligence teams (who might be located in the region or back in the States), and even troops on the ground via e-mail or radio. Sometimes, they send out the Predator’s live feed via “Rover,” a remote video system that transmits what the Ball is seeing to Panasonic notebook computers carried by the troops on the ground.

If the enemy is spotted, the Predator can also orchestrate the attack, pointing its laser at targets and even warning the troops if there are any “squirters,” bad guys running away. “I can watch the rear of a building for a bad guy escaping when troops go in the front, and flash an infrared beam on the guy that our troops can see with their night-vision goggles,” said U.S. Air Force Major John Erickson. Erickson’s experience is illustrative of the changes. He had been an F-16 pilot, but when he tells his grandkids about what he did in the Iraq war, it will be about the eighteen months he spent flying a Predator, never leaving the ground.

Predators don’t just watch from afar, but have also begun to kill on their own. The backstory of how this happened is one of the sad “what ifs?” of what could have been done to prevent the 9/11 attacks. Over the course of 2000 and 2001, Predators operated by the CIA sighted Osama bin Laden in Afghanistan many times, usually when he was driving in a convoy between his training camps. But the unmanned spy planes were toothless and could only watch as bin Laden soon disappeared.

The idea then arose to arm the drone by mounting laser-guided Hellfire missiles on the wings. Since the Predator could already direct missiles at targets with its lasers, the only difference was that the drone would carry its own, instead of having to rely on the kindness of strangers to blow up those below. The Predator would truly become a predator.

The plan made sense but quickly got mired in bureaucratic politics, as the CIA and air force argued over who would have control over the now-armed drones and, most important, whose budget would be stuck with the $2 million in costs. It seems like a small amount in retrospect, a “shoestring operation,” according to the air force general in charge of the effort. But, as he now laments, “it was a big problem, I hate to say.” With the two agencies at loggerheads, a senior White House official was needed to cut through the dispute. But terrorism was not at the top of the priority list of the new Bush administration. The issue of how to deal with bin Laden and the growing warnings of an attack inside the United States was tabled until everyone got back from their summer vacations. “It saddens me to know we could have done a heck of a lot more,” says the officer.

After 9/11 and the more than three thousand people killed, the issue of the $2 million became moot. The CIA armed its Predators and the air force decided that it couldn’t be left behind. In the first year, armed Predators took out some 115 targets in Afghanistan on their own. Many commented on the oddity of a war in which many of the forces still rode to battle on horses while robotic drones flew above. In the words of one U.S. officer, it was “the Flintstones meet the Jetsons.”

Predators continue to operate over Afghanistan today. Frequently, the drones carry an American flag onboard, which is then given to the family of a soldier killed in the fighting. In one case, a Predator carrying a flag for a family actually took out the same group of Taliban that had killed their son.

With the precedent set in Afghanistan, the Predator also joined the fight in Iraq. Among its first missions was to help take down the Iraqi government’s television transmissions, which broadcast the infamous “Baghdad Bob” propaganda. In the days and weeks that followed, the Predator struck at everything from suspected insurgent safe houses to cars being prepped for suicide attacks.

The ugly little drone has quickly become perhaps the busiest U.S. asset in the air. From June 2005 to June 2006, Predators carried out 2,073 missions, flew 33,833 hours, surveyed 18,490 targets, and participated in 242 separate raids. Even with this massive effort, there is demand for more. Officers estimate that they get requests for some 300 hours of Predator imagery a day, but that there are only enough Predators in the air to supply a little over 100 hours a day. The result is that the Predator fleet has grown from less than 10 in 2001 to some 180 in 2007, with plans to add another 150 over the next few years.

Besides the Predator, there are many other drones that fill the air over Iraq and Afghanistan. At some forty feet long, Global Hawk could be described as the Predator’s big brother. Others uncharitably say it looks like “a flying albino whale.” The Global Hawk was originally conceived as the replacement for the U-2 spy plane, which dates back to the 1950s. Besides not putting a human pilot in harm’s way (the U-2 is perhaps most famous for the crisis over the downing of pilot Francis Gary Powers at the height of the cold war), “physiological factors” limited the amount of time that the U-2 pilots could fly missions (that is, they would pass out from fatigue, boredom, or a buildup in their kidneys). In contrast, Global Hawk can stay in the air up to thirty-five hours. Powered by a turbofan engine that takes it to sixty-five thousand feet, the stealthy Global Hawk carries synthetic-aperture radar, infrared sensors, and electro-optical cameras. Working in combination, these sensors can do a wide-area search to look at an entire region, or focus in on a single target using the “high-resolution spot mode.” The link of the sensors with the long flight time means that the drone can fly some three thousand miles, spend twenty-four hours mapping out a target area of some three thousand square miles, and then fly three thousand miles back home. In other words, Global Hawk can fly from San Francisco, spend a day hunting for any terrorists in the entire state of Maine, and then fly back to the West Coast.

Like the Predator, the Global Hawk is linked back to humans on the ground, but it mainly operates autonomously rather than being remotely piloted. Using a computer mouse, the operator just clicks to tell it to taxi and take off, and the drone flies off on its own. The plane then carries out its mission, getting directions on where to fly from GPS (Global Positioning System) coordinates downloaded off a satellite. Upon return, “you basically hit the land button,” as one retired air force officer describes it.

With such capability, the Global Hawk is not cheap. The plane itself costs some $35 million, but the overall support system runs over $123 million each. Even so, the U.S. Air Force plans to spend another $6 billion to build up the fleet to fifty-one drones by 2012.

At the smaller end of the scale in Iraq and Afghanistan are unmanned planes flown not out of air force bases back in the United States, but rather launched by troops on the ground. The big army units fly Shadow, which looks like the sort of radio-controlled planes flown by model plane hobbyists. Just over twelve feet long, it takes off and lands like a regular plane. Compared to a Predator or Global Hawk, however, it is underpowered, only able to stay up five hours and fly seventy miles. Driven by a propeller, it has a distinctive noise that sounds like a weed-whacker flying overhead. Most of the Shadow’s UAV pilots are enlisted soldiers, such as Private First Class Ryan Evans, who explains why he volunteered to fly robotic planes in lieu of performing his normal army duties. “It is more of a rush that you are in control of something in the sky.”

The most popular drone, though, is one of the smallest. The Raven is just thirty-eight inches long and weighs four pounds. In a sort of irony, soldiers launch the tiny plane using the same over-the-shoulder motion that the Roman legionnaires used in war two thousand years ago, just tossing a robot instead of a javelin. The Raven then buzzes off, able to fly for ninety minutes at about four hundred feet. Raven carries three cameras in its nose, including an infrared one. Soldiers love it because they can now peer over the next hill or city block, as well as get their own spy planes to control, rather than having to beg for support from the higher-ups. “You throw the bird up when you want to throw it. You land it when you want to land,” says Captain Matt Gill, a UAV company commander with the 82nd Airborne Division. The other part of the appeal is that the pilots of the Raven are just regular soldiers; a cook from the 1st Cavalry is actually considered among the best. In just the first two years of the Iraq war, the number of Ravens in service jumped from twenty-five to eight hundred.

A veritable menagerie of unmanned drones now circles above the soldier in Iraq, reporting back to all sorts of units. The small UAVs like Raven or the even smaller Wasp (which carries a camera the size of a peanut) fly just above the rooftops, sending back video images of what’s on the other side of the street. The medium-sized ones like Shadow circle over entire neighborhoods, at heights above fifteen hundred feet, and are tasked out by commanders at division headquarters to monitor for anything suspicious. Reporting back to pilots thousands of miles away, the larger Predators roam above entire cities at five thousand to fifteen thousand feet, combining “reconnaissance with firepower.” Finally, sight unseen, the Global Hawks every so often zoom across the entire country at some sixty thousand feet, monitoring anything electronic and capturing reams of detailed imagery for intelligence teams to sift through. Because they rarely see the Global Hawks, officers in the field joke that these pictures are mainly used to fill the PowerPoint briefings for the generals back in D.C. Added together, by 2008, there were 5,331 drones in the U.S. military’s inventory, almost double the amount of manned planes. That same year, an air force lieutenant general forecast that “given the growth trends, it is not unreasonable to postulate future conflicts involving tens of thousands.”

The reach of unmanned systems also extends to the sea. REMUS, the Remote Environmental Monitoring Unit, is helping to clear Iraqi waterways of mines and explosives. Shaped like a torpedo, REMUS is about six feet long, weighs eighty-eight pounds, and costs $400,000. It was originally built by the Woods Hole Oceanographic Institute to carry out automated surveys of coasts, reefs, and shipwrecks, but the navy soon modified it for military uses, purchasing more than 140 of the undersea robots by 2008.

It is another modification of an unmanned system originally designed for the sea that may be the most novel to hit the battlefield in Iraq and Afghanistan. Second only to the threat of IEDs is the threat from mortars and rockets. Insurgents will often set up a mortar or rocket launcher in a residential neighborhood, quickly pop off a few rounds at an unsuspecting U.S. base, and then get out of the area before any response can be made. Although most miss their targets, plenty of damage and many casualties have been caused by lucky shots.

Enter the Counter Rocket Artillery Mortar technology, or CRAM for short. The navy has long equipped many of its ships with the Phalanx 20mm Gatling gun, capable of firing up to forty-five hundred rounds per minute. The radar-guided gun is mounted in a cylindrical shell that tilts and moves in circles, such that the sailors affectionately call it “R2-D2,” after the little robot in Star Wars. The gun was designed as a “last chance” defense against antiship missiles that skim just above the waves. R2-D2 automatically tracks and shoots down any missiles that have gotten past all other defenses and are too quick for humans to react to.

CRAM is basically R2-D2 taken off the ship and crammed (mounted) onto a flatbed truck. Its software was modified to target mortar shells and rockets instead of missiles, with the idea that it would essentially put up a wall of bullets to protect bases. Tests showed that CRAM had a 70 percent shootdown capability. By December 2007, at least twenty-two of them were deployed to Afghanistan and Iraq.

Not all has gone perfectly with the CRAM. The original, naval version of R2-D2 used bullets made of depleted uranium. As they were intended to fall into the middle of the sea, no one worried much about what happened to the shells after they fired. In an urban environment, thousands of bullets filled with radioactive dust falling from the sky is more of a concern. So the shells had to be altered to incendiary rounds that blow up in midair, but are less effective. Also, R2-D2 apparently once mistook an American helicopter flying over Baghdad for the Emperor’s Death Star. It locked in on the chopper to shoot it down, as if it were a rocket with some funny rotors spinning on the top. So CRAM had to be reconfigured to avoid any “blue on blue” friendly fire incidents. Finally, R2-D2 does not come cheap. Once you factor in all the radar and control elements, the CRAM required a congressional earmark of $75 million in funding.


The “war on terrorism” hasn’t just taken place on battlegrounds far, far away. The result has been the creation of immense bureaucracies and massive spending dedicated to this war at home, or what we now call “homeland security.”

A few numbers illustrate the vast industry that has been built around homeland security. In 1999, there were nine companies with federal contracts in homeland security. By 2003, there were 3,512. In 2006, there were 33,890. The business of protecting buildings, borders, and airports and preparing to respond to disaster generates $30 billion a year and is projected to reach $35 billion by 2011. As one report on the homeland security industry put it, “Thank you, Osama bin Laden!”

This money has not just been spent on the Einsteins who seize your shampoo at airports, but also on new technology research for homeland security. Popular Science reported that it “reached heights not seen since the Sputnik era.” In 2003, $4 billion of the newly formed Department of Homeland Security’s budget went to technology research programs.

The outcome is that unmanned systems have also started to serve on the front lines of the war at home. One of the early scares in the war on terrorism was the rash of letters carrying deadly anthrax powder sent to prominent officials and media. Some of the powder also leaked out inside post offices. Since those attacks, some one thousand robots have been installed to sort parcels, with the U.S. Postal Service planning to add as many as eighty thousand more.

With the war on terror involving the need to protect everything from airports to office buildings, industry analysts also foresee a booming market for “sentrybots.” These systems can guard entrances, automatically patrol perimeters, check IDs, and even use facial recognition software to know who should or shouldn’t be allowed into the area. Examples of such systems range from the Guard Robo made by Sohgo Security Services, which looks like Rosie, the maid from The Jetsons, to the Robot Guard Dog made by Sanyo. It looks, well, like a robot guard dog, just with a video camera for eyes and a mobile phone mounted inside to call for help whenever it finds intruders. An executive I met at one robotics conference predicted that “we will sell tens of thousands of them to everything from military bases to power plants.”

With America under threat, robots haven’t just hidden out in post offices or passively stood guard. They also have taken flight to guard the nation’s borders. While the Predator was originally designed for the military to find enemy missiles and tanks, the federal government quickly became interested in its potential for another role. Through most of 2005 and 2006, the Department of Homeland Security flew a Predator drone over the U.S.-Mexico border. The robot border-cop helped arrest 2,309 people and seize seven tons of marijuana.

In 2008, DHS presented plans to Congress to buy eighteen drone planes to patrol the U.S. border. Of course, all realize that the drones are actually focused on stopping a different type of border crosser than al-Qaeda agents—illegal immigrants. “But the acceptability of using these systems for border surveillance has increased dramatically since terrorism became such a real, in-our-backyard threat,” says Cyndi Wegerbauer of General Atomics, which sold the Predator drone to the Border Patrol.

Indeed, in the war to defend against would-be immigrants, robots have also gone to work not only for the government, but also for the private border patrols, or “militias,” as some have called themselves. One example is the “Border Hawk” drones serving with the American Border Patrol, a private organization operating in Cochise County, Arizona.

Some have accused the American Border Patrol of racism. Its founder, Glenn Spencer, is certainly a controversial figure. He describes illegal immigration as “The Second Mexican-American War” and Latin America as “a cesspool of a culture” that threatens the “death of this country.”

Spencer may sound like a sad throwback to the 1950s or even the 1350s, but his group's technology is twenty-first century. They operate three drones that carry video and infrared cameras. The drones are launched by radio control and then automatically fly a patrol pattern using GPS, staying at four hundred feet, just below what the government requires for certification. While in the air, they search out any illegal immigrants crossing the border and record the images to TiVo for playback and review. The group doesn't arrest the illegal aliens themselves, but passes on the information to the United States Border Patrol as well as loads its robots' footage onto the Internet using a satellite connection, or, as the group describes it, "broadcasting the invasion live on the internet."

Besides battling terrorists and would-be immigrants, the war at home also involves responding to disaster. In the aftermath of 9/11, brave little PackBots and Talons joined the search for survivors. In the aftermath of Hurricane Katrina, Silver Fox UAVs searched for survivors in flooded areas of New Orleans, while two tiny robotic helicopters from the Center for Robot-Assisted Search and Rescue at the University of South Florida worked on the Mississippi coast. Many think robotic systems will have an even wider role in future disasters. For example, after Katrina, cell phone towers went out because of storm damage and a lack of power, which hampered both residents on the ground and rescue efforts. During the next disaster, the plan is to use a UAV as an "aerial cell tower."


PackBot, Talon, SWORDS, Predator, Global Hawk, and all their digital friends are the first signs that something big is going on. Man’s monopoly of warfare is being broken. We are entering the era of robots at war.

It sure sounds like science fiction to claim such a wild thing. But we have to remember that pretty much everything we now take for granted sounded like fiction at some point, from the fantastic dreams of mechanical flying beasts to the absurd concept of talking to someone on the other side of the world.

What follows is an effort to understand this change, to travel through this new world of unmanned war and unwrap just what it might mean. Part 1 attempts to capture this moment of great change, to understand the changes that we are creating. In order to assess what is going on in technology, robotics, and war today, it will explore such key issues as the history of robots, how these new technologies work, what is coming in the next wave, who is working on them, and what inspires them. Then, part 2 of this book will explore what all this change is creating for us. It will cover everything from the resulting shifts in how wars are fought and who is fighting them to important questions that our new machine creations are starting to raise in politics, law, and ethics. War just won’t be the same.



The further backward you look, the further forward you can see.





“Perhaps the most wonderful piece of mechanism ever made” is how the famous Scottish engineer Sir David Brewster would describe it some one hundred years after it was invented. By contrast, the great poet Johann Wolfgang von Goethe called it “most deplorable ... like a skeleton [with] digestive problems.” The two men were talking about Vaucanson’s duck, the mechanical wonder of its age, or, as present-day scientists call it, “the Defecating Duck.”

Jacques de Vaucanson was born in Grenoble, France, in 1709. At the age of twenty-six, he moved to Paris, then the center of culture and science during the Age of Enlightenment. Inspired by Isaac Newton’s idea of the universe as a great clock that had been set in motion by the Creator, the Deist philosophers of the time saw the world as guided by mechanical forces. They believed that everything, from gravity to love, could be understood if you could just scientifically reason it out.

Arriving in this cauldron of rationality gone wild, Vaucanson became fascinated with the concept of using reason and mechanics to reproduce life itself. More important, needing funds, the young engineer hit upon the idea of “getting assistance by producing some machines that could excite public curiosity.” So he did what any other enterprising young man would do: he built a duck.

Vaucanson’s duck was no ordinary duck; it was actually an intricate mechanical creation modeled after a sculpture in the gardens of the Palais des Tuileries, a cultural center at the time, now more famous as one of the sites of The Da Vinci Code. While the duck looked lifelike from the outside, the true amazement was that it could stand up, sit down, preen, waddle, quack, eat pellets of corn, drink water, and then, wonder of wonders, defecate. Claiming that he had made the duck with methods “copied from Nature,” Vaucanson presented the mechanical fowl at the court of King Louis XV. The duck then became the talk of all the Paris salons, as the nation’s leaders debated how it worked and just what it signified for politics, philosophy, and life itself.

Once the duck was placed on public display, people came from far and wide, paying an admission fee equivalent to a week’s wages. Also accompanying the bird were mechanical mandolin, flute, and piano players, who tapped their feet, moved their heads, and seemed to breathe as they played music. But it was the duck and, most important, the inexplicable fact that it could do number two that was the star attraction. The duck seemed to show that the incomprehensible processes of life could be re-created.

Vaucanson became a rich man and soon thereafter was given the highest possible honor for a scientist, election to the esteemed Académie des sciences, joining such luminaries as Descartes, Colbert, and Pascal. The duck was then sent out on tour (where the German poet Goethe would meet it some years later, showing its age like all great stars do when they’ve been on the road for too long), and Vaucanson would become the director of the French government’s silk mills. In 1745, he would invent the world’s first automated loom, which used a system of cards with holes punched in them to repeatedly create patterns in silk. Centuries later, these punch cards would inspire the early developers of computers.

It wasn’t until four decades after its invention that the duck’s secret was discovered. It was, in fact, unable to digest food. The corn that was seemingly eaten and then digested was instead stored in a pod hidden in the back of the duck’s throat, initiating a timer that would then, after a suitable pause, release another hidden container of “artificial excrement.” The duck, and its waste, were both frauds. But if Vaucanson’s duck was a hoax at re-creating life, it was a remarkably intricate one. The blueprints for the mechanical bird show it to involve hundreds of moving, interlocking parts and scores of inventions, all for the sole purpose of simulating the most routine part of life’s daily business.


Vaucanson’s duck is relevant today because it illustrates how humankind’s attempts to use technology to mimic and replace life go further back than we often think. The robots searching for IEDs in Iraq didn’t just spring out from nowhere. They have a past that shapes their present and future.

The idea of creating mechanical beings to replace the work of humans is at least as old as ancient Greek and Roman mythology. For example, the god of metalwork (Hephaestus to the Greeks, Vulcan to the Romans) had a host of mechanical servants that he made out of gold. Occasionally, he also gave out his creations to the mortals, one example being Talos, a huge statue that protected the island of Crete by throwing huge boulders at any ships that came nearby. If any stranger made it ashore, Talos would heat up his metallic arms to a red-hot glow and then give the intruder a deadly welcome hug. Talos was later the name for an Apple computer operating system, as well as the first computer-guided missiles on U.S. Navy ships.

These myths were not just stories, but became inspirations for both real-world philosophers and inventors. Indeed, it was in this period that Aristotle (384-322 B.C.), one of the founding philosophers of Western thought, would describe his vision of the ultimate free world: “If every tool, when ordered, or even of its own accord, could do the work that befits it ... then there would be no need either of apprentices for the master workers or of slaves for the lords.”

Likewise, the engineers of ancient times made advances that were often well beyond what we might think possible. Around 350 B.C., the Greek mathematician Archytas of Tarentum built “the Pigeon,” a mechanical bird that was propelled by steam. Besides building what was likely the world’s first model airplane, Archytas used it to carry out one of the first studies of flight. Perhaps most remarkable was the “Antikythera computer.” In A.D. 1900, a Greek sponge diver found a wreck of an ancient Greek sailing ship that had sunk off the island of Antikythera near Crete around 100 B.C. In the wreck was a small box about the size of a laptop computer. It contained thirty-seven gears that, when a date was entered, worked to calculate the position of the sun, moon, and other planets. Many credit it as the first known mechanical analog computer.

This fascination with mechanical creations subsided during the Dark Ages, but would rise again in the Renaissance, perhaps most famously with Leonardo da Vinci. Among his many sketches is a mechanical knight. Like most of his flashes of brilliance, such as his plans for helicopters and planes, the design was ahead of its time. If built, this sixteenth-century version of the SWORDS, armed with a sword, would have been able to sit up and move its arms and legs. The fascination with such systems, though, was not limited to Europe. In feudal Japan in the 1600s, several craftsmen were noted for having made automated dolls that served tea.

For all the wonder of these early mechanical creations, though, it is important to note that they were not actually what we now think of as robots. The devices typically did the same thing every time they were activated, rather than moving about or responding to any changes in the environment. That is, they were automated, but not robotic. Moreover, many turned out to be hoaxes, either elaborate ones like Vaucanson’s duck that actually pushed the frontiers of technology in the pursuit of fakery, or more traditional ones. The most famous of this latter type may have been “The Turk,” later referenced in the Terminator series. This was a “chess automaton” in the shape of a Turkish-looking figure on top of a cabinet, made by Wolfgang von Kempelen in what is now Slovakia. Preceding IBM’s chess-playing supercomputer Deep Blue by almost two hundred years, “The Turk” consistently beat humans at chess, including even Napoleon. It turned out, though, that von Kempelen had hidden a dwarf chessmaster inside.

Ducks and Turks aside, most of the research to develop technologies that replicated human powers was frequently intertwined with war. Archimedes, for example, may have been the most influential scientist in ancient history, shaping the future development of the fields of mathematics, physics, engineering, and astronomy. In his era, though, he was best known for his various inventions used in the defenses of the city of Syracuse. These ranged from a “death ray” (supposedly using mirrors to light ships afire) to a huge “claw” (a large crane that grabbed ships). Similarly, the field of modern chemistry was largely founded by Antoine-Laurent Lavoisier, who served Louis XVI as one of the directors of the French national commission on gunpowder.

In no area was this link greater than in the first calculating machines, the forerunners of computers. Charles Xavier Thomas de Colmar is credited with inventing the first mechanical calculator, which he called the Arithmometer, in 1820. The machine was as big as a desk. His first customers were the French and British militaries, which used it for navigation and plotting the trajectory of cannonballs. Similarly, the Royal Navy hired Charles Babbage, the man generally credited with designing the first programmable computer. Babbage's 1822 machine, called a "difference engine," was designed to be built from some twenty-five thousand parts. In a foretaste of the innovators of today, Babbage was also a bit of an oddball. He once baked himself in an oven for four minutes, just "to see what would happen."


Ultimately, technology caught up with ambition around the turn of the twentieth century. Science finally had advanced to create machines that could be controlled from afar and move about on their own. The robotic age was getting closer, and robots’ link with war would become even more closely intertwined.

The first real efforts started with Thomas Edison and Nikola Tesla, two rival scientists and the first of what we now would call electrical engineers. While working on various ways to transmit electricity, Edison and Tesla both experimented with radio-control devices. Because of his eccentric personality and his lack of a good public relations team like Edison's, Tesla would not gain the same place in history as his rival, the "Wizard of Menlo Park," and died penniless.

Tesla, though, did perhaps the most remarkable work at the time with remote-control devices. He first mastered wireless communication in 1893. Five years later, he demonstrated that he could use radio signals to remotely control the movements of a motorboat, holding a demonstration at Madison Square Garden. Tesla tried to sell this first remotely operated vehicle, along with the idea of remote-controlled torpedoes, to the U.S. military, but was rejected. As Tesla recounted, “I called an official in Washington with a view of offering the information to the government and he burst out laughing upon telling him what I had accomplished.” Tesla would not be the last inventor to find out that what was technically possible mattered less than whether it was bureaucratically imaginable. Two brothers from Dayton, Ohio, had the same experience a few years later when they first tried to sell their invention of manned flight.

The foundations then were laid for remote-controlled vehicles and weapons just as the First World War began. World War I proved to be an odd, tragic mix of outmoded generalship combined with deadly new technologies. From the machine gun and radio to the airplane and tank, transformational weapons were introduced in the war, but the generals could not figure out just how to use them. Instead, they clung to nineteenth-century strategies and tactics and the conflict was characterized by brave but senseless charges back and forth across a no-man’s-land of machine guns and trenches.

With war becoming less heroic and more deadly, unmanned weapons began to gain some appeal. On land, there was the “electric dog,” a three-wheeled cart (really just a converted tricycle) designed to carry supplies up to the trenches. A precursor to laser control, it followed the lights of a lantern. More deadly was the “land torpedo,” a remotely controlled armored tractor, loaded up with one thousand pounds of explosives, designed to drive up to enemy trenches and explode. It was patented in 1917 (appearing in Popular Science magazine) and a prototype was built by Caterpillar Tractors just before the war ended. In the air, the first of what we would now call cruise missiles was the Kettering “Bug” or “aerial torpedo.” This was a tiny unmanned plane that used a preset gyroscope and barometer to automatically fly on course and then crash into a target fifty miles away. Few of these remote-controlled weapons were bought in any numbers and most remained prototypes without any effect on the fighting.

The only system to be deployed in substantial numbers was at sea. Here, the Germans protected their coast with FL-7s, electronically controlled motorboats. The unmanned boats carried three hundred pounds of explosives and were designed to be rammed into any British ships that came near the German coast. Originally, they were controlled by a driver who sat atop a fifty-foot-high tower on shore, steering through a fifty-mile-long cable that spooled out of the back of the boat. Soon after, the Germans shifted the operator from a tower onto a sea-plane that would fly overhead, dragging the wire. Both proved unwieldy, and in 1916 Tesla’s invention of wireless radio control, now almost two decades old, was finally deployed in warfare.

Perhaps reflecting the fact that they were outnumbered in both these wars, the Germans again proved to be more inclined to develop and use unmanned systems when fighting began again in World War II. The best known of their weapons, akin to the land torpedo, was called the Goliath. About the size of a small go-cart and having a small tank track on each side, the Goliath of 1940 was shaped almost exactly like the Talon that Foster-Miller makes over six decades later. It carried 132 pounds of explosives. Nazi soldiers could drive the Goliath by remote control into enemy tanks and bunkers. Some eight thousand Goliaths were built; most saw service as a stopgap on the Eastern Front, where German troops were outnumbered almost three to one.

In the air, the Germans were equally revolutionary, deploying the first cruise missile (the V-1), ballistic missile (V-2), and jet fighter (Me-262). The Germans were also the first to operationally use remotely piloted drones. The FX-1400, known as the “Fritz,” was a 2,300-pound bomb with four small wings, tail controls, and a rocket motor. The Fritz would drop from a German plane flying at high altitude. A controller in the plane would then guide it into the target using a joystick that steered by radio. The Fritz made a strong debut in 1943, when the Italian battleship Roma was trying to defect to the Allies. Not knowing of the Fritz, the Italian sailors saw a German bomber plane, but didn’t worry too much as it was at a distance, height, and angle from which it couldn’t drop a bomb on top of them. A Fritz launched from the bomber and then flew into the Roma, sinking it with more than a thousand sailors lost.

Publisher's description

In Wired for War, P. W. Singer explores the great­est revolution in military affairs since the atom bomb: the dawn of robotic warfare. We are on the cusp of a massive shift in military technology that threatens to make real the stuff of I, Robot and The Terminator. Blending historical evidence with interviews of an amaz­ing cast of characters, Singer shows how technology is changing not just how wars are fought, but also the politics, economics, laws, and the ethics that surround war itself. Travelling from the battlefields of Iraq and Afghanistan to modern-day "skunk works" in the midst of suburbia, Wired for War will tantalise a wide readership, from military buffs to policy wonks to gearheads.


Customer reviews

5.0 out of 5 stars

Top customer reviews
Format: Paperback | Verified Purchase
A very interesting book on the use of robotics in the conflicts of the future. A future that is not so far off, since drones are already widely used in today's conflicts. And it doesn't cover only drones: exoskeletons and more...

As a side note, the developers of the game Call of Duty: Black Ops 2 drew on this book for their game's storyline.

Most helpful customer reviews on Amazon.com

Amazon.com: 132 customer reviews
124 of 135 people found this review helpful
JohnHawley, El Paso, Texas. March 10, 2009
By John K. Hawley - Published on Amazon.com
Format: Hardcover | Verified Purchase
I work as an engineering psychologist in a U.S. Army organization that is at the forefront of R&D on military robotics and automated command and control systems. Hence, I read P. W. Singer's Wired for War with considerable interest. I can relate to much of his discussion on an experiential basis. We routinely encounter and try to provide solutions for many of the problems Singer discusses. As a point of interest, I was the technical lead on an Army effort looking at human performance contributors to the fratricides by the Patriot air defense missile system during the recent Gulf War mentioned on page 125. As is usually the case in a casual summary of complex events, Singer's description of these events is superficially accurate, but there is a lot more to the story. Also, I've been told that his remark on page 197 about the radar on the DIVAD gun locking onto the exhaust fan of a port-a-potty is an urban legend. I've heard about this alleged incident, but I've never been able to find anyone in the Army air defense community who ever witnessed it personally. We run tests on that class of systems all the time, so we know the players.
Overall, I thought Singer did a good job of describing the state of the art in robotic military systems and addressing the potential sociological and psychological impact of using these systems in current and future military operations. From my perspective, the central operational issue in using armed robotic systems in combat is balancing autonomy with effective human control (the focus of Singer's chapter 6). In my view, he correctly refers to this topic as the "Issue-That-Must-Not-Be-Discussed." I was particularly struck by the difference between the attitude of those having the most on-the-ground experience with these systems (e.g., Robert Quinn's remark on page 124 that he can't even imagine how unmanned systems would "ever be able to autonomously fire their weapons") and the almost casual attitude on this subject expressed by many of the decision makers we deal with daily. Their attitude is best summarized by the remark attributed to an unnamed former secretary of the army who responded "No" when asked if he could identify any challenges that the greater use of unmanned systems would bring to the military.
The reality associated with greater autonomy on the part of armed robotic systems is that there will likely be many more "oops moments" (Singer's page 196) than are politically and operationally tolerable. Based on our assessment of the Patriot fratricides during the recent Gulf War, these incidents were an example of an oops moment on the part of an armed robotic system. If the past is any indicator of the future, such incidents will result in initial "surprise" and "shock" on the part of the leadership that these advanced systems behaved thusly, followed by the imposition of restrictive rules of engagement that effectively take the offending system out of the fight. Singer is correct that we need a more realistic assessment by those in policy-making jobs of the potential problems associated with the use of armed, autonomous robotic systems in actual combat--but I'm not holding my breath waiting for this to happen.
Armed robotic systems will be fielded. They will be allowed to operate autonomously. Oops moments will occur. And unpleasant fallout and scapegoating will take place in the aftermath of such incidents. The issue of control in accord with human intent versus the illusion of control is complex and will not easily be solved. Software glitches aside, oops moments will mostly result from what Dave Woods of Ohio State University terms the "brittleness problem of automata:" An inability to satisfactorily handle unusual or ambiguous situations. I fear that the "Strong AI" necessary to satisfactorily address the brittleness problem will remain tantalizingly just over the technical horizon for some time to come.
79 of 88 people found this review helpful
A little too sensationalist, not enough real. February 3, 2011
By jwman - Published on Amazon.com
Format: Kindle Edition
Singer paints a picture of vastly capable robots and software that are fielded right now. As someone working in the robotics field and trying to provide autonomous behaviors for government applications, I see how far this is from reality. As is often the case, however, reality doesn't create much buzz or sell many books.

This book feeds the popular misconception that robots are smart and getting smarter. I have a brother-in-law that was asking me about my work and how I'd done some simple AI design for computer board games for fun a long time ago. He made the comment, "I bet all that is coming in handy in your current job". I had to tell him that no, creating strategy-based behaviors for Risk has almost zero relevance to modern robotics -- we're nowhere close to a strategic level of thinking. As an industry, we're still at the level of getting a robot to move from point A to point B consistently and without running into anything. The videos on YouTube posted by researchers show some incredible things, but research is almost always 10-15 years ahead of a solid, marketable solution (toy problems in the lab are comparatively easy, real-world complexity is HARD).

The reality is this: Most mobile robots in theater right now are glorified remote control cars, operated by soldiers less than a few hundred meters away via cameras mounted on the robots. Singer talks a great deal about the Foster-Miller Talon and iRobot Packbot, because they are far and away the most common and prominent platforms in theater. However, the examples of autonomy he gives never deal with those platforms. Why? Because they have almost no autonomy for the units in the field.

Autonomy for mobile robotics is HARD. Very hard. Singer glosses over this fact by talking about the "inevitable" Singularity that is supposed to happen somewhere around 2030. Basically, the premise is that robots are not smart because they can't think fast enough to process all of the data. This is wildly inaccurate. Mobile robots are not smart because humans have not managed to impart intelligent decision-making to them. It's not like the robotics field is bemoaning the slowness of today's processors as a reason for autonomy failings. "Oh it WOULD have worked if only I had 10x the computational power..."

Also the chapter on the "Singularity" bothered me because it shows a distinct lack of understanding of the current state of the art. Computers are not getting faster at the moment. Processor speeds have flat-lined at about 3.0 GHz for the past several years because any faster and heat dissipation becomes an intractable problem. Even the speed of super-computers is capping out for the same reasons (a little more complex -- heat dissipation is easier if processors are spread out, but higher speed requires physical closeness because of the time needed for current to travel. So high-speed computers need to be compact for speed, but spacious for heat dissipation...). New computers today are coming out with more cores, so computers technically have more raw computing power (though even this has near-term limitations that prevent us from the Singularity) but computers are awful at taking advantage of parallelism. Singer points out that our brains are massively parallel, and this is why we have an edge on computers. This is true, but even if we had a computer with the same amount of parallelism, we couldn't take advantage of it. Someone has to program the thing, and no algorithms exist that mimic the behavior of the human brain (contrary to the picture that tech bloggers paint). The fact of the matter is, we don't know how the human brain works. Singer's main argument here is pointing to the trend: compare what computers can do today with what they could do 30 years ago. The flaw is in assuming that the same trend will continue. Extrapolation is notoriously bad, even though it seems to have such predictive power. The reason for such growth can be explained by us tackling the "low-hanging fruit" as far as computers go, and we're fast approaching an era of more incremental improvement as the "easy" problems are solved. Futuristic technologies such as alternative processor architectures, Quantum computing, optical computing, etc. are nowhere near workable.
But even if we had a full-fledged Quantum computer right now, we really wouldn't know what to do with it. The software algorithms don't exist for it, and many of those that do don't provide dramatic improvement over what we have today.

The most valuable part of the book (and the only reason it didn't get 1 star) was the first few chapters describing a historical view of robotics as an industry.
42 of 45 people found this review helpful
A truly eye-opening book, superbly researched and written. February 2, 2009
By James Beswick - Published on Amazon.com
Format: Hardcover | Verified Purchase
I first heard the author talking on NPR about this topic, and both that interview and the first chapter of this book show his excitement and deep interest and understanding of this subject. For such a weighty hardback, it's remarkably hard to put down, and each section evolves intelligently from the last. I particularly enjoyed the references to modern culture, given that robotics has largely been a subject of science fiction in the last few decades rather than yielding anything practical in reality.

Well, at least so I thought - it turns out that over 12,000 robots are at war in Iraq and Afghanistan as we speak. The companies producing these machines were spurred by the very real necessities of dealing with guerrilla warfare, and avoiding the human toll associated with such difficult environments. Through a combination of human-controlled and artificially-intelligent hardware, these robots back up our soldiers and provide a super-human level of robustness and accuracy.

The author raises the complex moral questions associated with having machines killing people on the frontline, and the issues that arise when mistakes occur. There's also a fascinating discussion of stress disorders that remote pilots are suffering from - these men and women sit in offices in the US, controlling machines on the battleground far away, and return home for dinner every day after "a day's fighting".

It's also interesting to look at the design of some of the machines and their control interfaces, many of which look like Wall-E with a machine gun. Weapons companies have copied controllers from the Playstation and Xbox, taking advantage of a generation that is comfortable using these devices without extensive retraining. The distance between shooting people on Halo and making real life-or-death decisions in operating a military robot is almost absurdly non-existent.

I don't want to steal the book's thunder at all since this is one of the most gripping reads I've found in a while, and I would highly recommend it to everyone. Neither strictly a robotics book nor a war book, it falls somewhere in between, and the topic is enthusiastically presented. The most chilling part is clearly that the science fiction of movies such as The Terminator is really not too far away, and we're on the cusp of a robotics revolution that will be as profound as the rise of the PC.
24 of 31 people found this review helpful
Making war impersonal, 2 February 2009
By Julie Neal, published on Amazon.com
Format: Hardcover
This frightening and funny book helped me understand the future of war in all its technological splendor. What was once the stuff of science fiction, such as machines thinking for themselves, is now our military's reality.

Unfortunately, as Isaac Asimov is quoted as saying in Wired for War: "The saddest aspect of life right now is that science gathers knowledge faster than society gathers wisdom."

The military began using robots primarily to fill the "Three D" roles people were poor at: jobs that were Dull, Dirty or Dangerous. Unmanned systems "don't need to sleep, don't need to eat, and find monitoring empty desert sands as exciting as partying at the Playboy mansion." The use of unmanned systems has exploded, especially since the attacks of September 11. As one U.S. Navy researcher puts it: "The robot is our answer to the suicide bomber."

I heard an NPR interview with the author, and what struck me most was his description of how impersonal war has become. Almost like playing video games, people here in the states can launch missiles and cause all kinds of mayhem on battlefields overseas, untouched by all the messiness of being on site. Singer reveals the disdain combat troops sometimes have for these faraway operators, even though they are on the same side.

All sorts of pop culture references are woven through the book, including The Iron Giant, The Matrix, Night of the Living Dead, Predator, Star Wars, The Terminator, Total Recall, Wall-E and the Nintendo Wii. There is also a glossy-page insert of 32 black and white photographs.

The book poses provocative ethical questions about the new trend of one-step-removed killing. I'll be thinking about this one for a long time.

Here's the chapter list:

Author's Note: Why a Book on Robots and War?

Part One: The Change We Are Creating
1. Introduction: Scenes from a Robot War
2. Smart Bombs, Norma Jeane, and Defecating Ducks: A Short History of Robotics
3. Robotics for Dummies
4. To Infinity and Beyond: The Power of Exponential Trends
5. Coming Soon to a Battlefield Near You: The Next Wave of Warbots
6. Always in the Loop? The Arming and Autonomy of Robots
7. Robotic Gods: Our Machine Creators
8. What Inspires Them: Science Fiction's Impact on Science Reality
9. The Refuseniks: The Roboticists Who Just Say No

Part Two: What Change is Creating For Us
10. The Big Cebrowski and the Real RMA: Thinking About Revolutionary Techniques
11. "Advanced" Warfare: How We Might Fight With Robots
12. Robots That Don't Like Apple Pi: How the U.S. Could Lose the Unmanned Revolution
13. Open-Source Warfare: College Kids, Terrorists, and Other New Users of Robots at War
14. Losers and Luddites: The Changing Battlefields Robots Will Fight On and the New Electronic Sparks of War
15. The Psychology of Warbots
16. YouTube War: The Public and Its Unmanned Wars
17. Changing the Experience of War and the Warrior
18. Command and Control... Alt-Delete: New Technologies and Their Effect on Leadership
19. Who Let You in the War? Technology and the New Demographics of Conflict
20. Digitizing the Laws of War and Other Issues of (Un)Human Rights
21. A Robot Revolt? Talking About Robot Ethics
22. Conclusion: The Duality of Robots and Humans
7 of 8 people found this review helpful
Broad, but not deep, 16 June 2009
By J. Weill, published on Amazon.com
Format: Kindle Edition | Verified Purchase
Wired for War is an interesting albeit time-consuming read about how technology is forever changing the way we and our enemies fight.

I first heard about this book on a video game podcast, and I was so impressed with the free sample that I sprang for the full book -- my first Kindle book purchase that cost me over $10. P.W. Singer introduces himself as a lifelong student of military history. He has a keen interest in the way that wars are fought, not just from a tactical standpoint but in terms of the impact that war has on society.

Singer's style is quotable and fairly accessible, with a few pop-culture references sprinkled in throughout. If you were to read the last sentence of every paragraph, you would have an extremely punchy, exciting, and sensational summary. Some of the technologies described seem more powerful and scary because of their accessibility: with tiny budgets, Iraqi insurgents can make simple weapons capable of disrupting American military operations. It's also easy to understand, for example, why fighter jet pilots feel threatened by drone pilots who effectively need only to use a PlayStation controller to carry out their missions. There were also interesting discussions on how war fought by robots can actually galvanize popular support against the side using robots by painting the tech-savvy nation as too cowardly to risk their own soldiers' lives.

I wish that Wired for War were more compact. Two technologies that are referenced frequently are completely autonomous robots (ones that need no guidance to move and even no guidance to fire at a target) and remotely piloted robots (e.g. drones in Iraq controlled by pilots working in Nevada). Singer makes many references to a possible transition from manned to unmanned combat using robots, but the point is belabored without any evidence that it has been tried. In fact, when Singer describes how the military has trusted unmanned systems to decide where to drop a bomb, the results are too often tragic for the risk to be tolerable. Too much time is spent on hypotheses that might not be proven true for some time yet. The discussions include many interviews with military men past and present, but technical details are limited -- often because the most ambitious ideas are still fantasy in 2009.