Peter W. Singer/Wired For War at The Complex Terrain Lab
Hi All. I'd like to call your attention to another symposium coming up next week at The Complex Terrain Laboratory, from 30 March to 2 April. This one is on Peter Singer's new book Wired For War: The Robotics Revolution and Conflict in the 21st Century (Penguin, 2009). Singer's opening remarks will be posted early Monday morning.
Not all the stuff we do at CTlab is SWJ material, but I think this one will be of interest to readers. Plenty of ink has been spilled on the subject of robotics/unmanned systems, and symposium participants will be adding a fair bit of output to that over the next week.
Confirmed participants include:
Kenneth Anderson (Law; American University)
Matt Armstrong (Public Diplomacy; Armstrong Strategic Insights Group)
John Matthew Barlow (History; John Abbott College)
Rex Brynen (Political Science; McGill University)
Antoine Bousquet (International Relations; Birkbeck College, London)
Charli Carpenter (International Relations; UMass-Amherst)
Andrew Conway (Political Science; NYU)
Jan Federowicz (History; Carleton University)
John T. Fishel (National Security Policy; University of Oklahoma)
Michael A. Innes (Political Science; University College London)
Martin Senn (Political Science; University of Innsbruck)
Marc Tyrrell (Anthropology; Carleton University)
Quite a few of them are active elsewhere on the web, and several are active SWC participants. Their blogs include Arms Control, Duck of Minerva, In Harmonium, MountainRunner, Opinio Juris, PaxSims, Spatialities, Third World Wired, and Zero Intelligence Agents.
I'd be especially interested to read what the COIN/CT experts here at SWC think of what Singer's written, on the issue of unmanned systems in general, and on whatever we manage to come up with during our event. For quick post tracking, proceedings will be compiled and indexed here. All comments welcome, whether at CTlab or here at SWC.
Mike
Frakin’ Cool and Winning Wars (Book Review)
Frakin’ Cool and Winning Wars (SWJ Book Review)
by Robert L. Goldich
Frakin’ Cool and Winning Wars (Full PDF Article)
Quote:
After Operation Desert Storm in 1991, there was a fusillade of remarks about how American technological superiority was the decisive factor in how we won the war. General H. Norman Schwarzkopf would have none of this. He stated that although our weapons and equipment were indeed technologically superior to those of the Iraqis, we would have won the war if we had had their equipment, and they had had ours. P.W. Singer would have done well to ponder this remark at some point in the researching and writing of Wired for War.
Not read the book, but...
As the Middle East Editor of Unmanned Vehicles, the premier journal in the field, for which Pete Singer provided an article, I will follow this with some interest.
Personally, I just don't see the complexity in applying robotics to war. The problems tend to be rooted more in ignorance and aspiration. The operational problems are generally solvable (as I have said in print) given simple and robust conceptual guidance, of which we are not currently short.
Ground crawling robots proved decisive in very obvious ways during Cast Lead
Symposium Opening Remarks
Peter Singer's opening remarks are now up, and several participant posts should be up later in the day. From Singer:
Quote:
I wanted to thank Michael, Matt, and Matthew, and all the others, for putting together this symposium. It is a great honor to have one's work looked at in this way so soon after coming out.
Wired for War is a book about something big that is going on today in the overall history of war, and maybe even of humanity itself. The US military went into Iraq with just a handful of robotic drones in the air and zero unmanned systems on the ground, none of them armed. Today, there are over 7,000 drones in the US inventory and another roughly 12,000 on the ground. But these are just the first generation, the Model T Fords and Wright Flyers compared to what is already in the prototype stage. And, yes, the tech industry term "killer application" doesn't just describe what iPods did to the music industry, but also applies to the arming of our creations with everything from Hellfire missiles to .50 caliber machine guns.
There's more. Looking forward to reading what develops in this SWC thread.
Having Our Cake and Eating It, Too...
I always thought that expression was a bit light - I mean, you need to have your cake, to eat it, right? Can't eat it if you don't have it, right?
Anyway.
Ken Anderson, who teaches law at American University and blogs on the laws of armed conflict at Opinio Juris, has posted a pretty provocative piece that deals in large part with the non-law response to precision targeting. Having our cake and eating it, too, is his point: increased demand for technological precision in war, matched by an increasingly negative response to what that means in practice. Essentially, as Ken puts it, it means removing the anonymity of war and replacing it with a much narrower focus on well-identified, high-value targets. The result, I'd have to agree with Ken, is more than a little bewildering, if not downright hypocritical.
Thoughts?
is autonomy really the issue?
While I disagree that no one is talking about autonomy, this is not the real issue. We've attempted to dismiss technology-related problems through painful parsing, excuses, and legal mumbo-jumbo before. While that may ultimately succeed in the American, or even Western, court of public opinion, it doesn't work in the places where opinions really matter.
More Bourbon, guaranteed cure
for bad sinuses. Sinii??? :D
Ref: The will of the people (aka perceptions)
When we discuss technology in all its forms, and the perceptions of those this tech will be aimed at helping or hurting, would it not be important that we not put the cart before the horse?
What I mean is, take for example tech systems that use behavioral analysis to explore possible points of contention or probabilities of a given thing happening (search bots, anti-spam, anti-virus, social network mapping, etc.). These should be met with a high level of skepticism when accepted as singular solutions. Each requires, somewhere in the process, a "DUH" check: only a human being can actually put the final approval or denial on its results.
This is not to say that they aren't effective on their own; they can and do effectively identify, and at least minimize, issues related to the realm they monitor.
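To make the "DUH check" concrete, here is a minimal sketch of that human-in-the-loop pattern: the automated system proposes a verdict, but a human reviewer gets the final approval or denial before anything is acted on. All names here (`auto_classify`, `human_check`, etc.) are hypothetical illustrations, not any real system's API.

```python
def auto_classify(item: str) -> str:
    """Toy stand-in for any behavioral-analysis bot (anti-spam, etc.)."""
    return "flag" if "suspicious" in item else "pass"

def human_check(item: str, proposed: str, approve) -> str:
    """The final say rests with a human: confirm the verdict, or flip it."""
    if approve(item, proposed):
        return proposed
    return "pass" if proposed == "flag" else "flag"

def process(items, approve):
    results = []
    for item in items:
        proposed = auto_classify(item)                 # the system's guess
        final = human_check(item, proposed, approve)   # the human's last word
        results.append((item, final))
    return results

# Example: a reviewer who accepts every verdict except one known false positive.
reviewer = lambda item, verdict: not (item == "suspicious but fine" and verdict == "flag")
print(process(["hello", "suspicious link", "suspicious but fine"], reviewer))
```

The point of the design is that the classifier never has the authority to dispose of anything on its own; it can only recommend.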
Downside is, I don't know about the rest of you, but I'll be darned if I can get most of my family or friends to actually leave them turned on, because they stand in the way of their ease of interacting with the world through the internet.
The same generally goes for more advanced robotic applications performing missions of great variety in place of human beings. Sure, those robots at the factory can build a whole lot more cars each day than humans could, and may be more accurate. But on the flip side, if one of them gets set wrong, everything has to stop in order to get it straight before starting the whole line again. Or, even worse, nobody notices and 6,000 cars hit the road with a defect which may cost lives.
Take that to the next step and apply it to UAVs, ground systems, etc. A disposable bot to check for a bomb? Great. But a bot that's supposed to determine whether a house is dangerous or not, versus a human? What are you going to do when all the right elements for a concoction exist in a small enough area that the bot presumes it's there to make what could be made from it, and simply disposes of it without asking?
Long and short, it's the same argument you've probably heard a million times, but it seems worth restating: they may help you do what you do better, or even do what you do faster and more effectively, but that means they can also screw it up faster and more effectively than you ever could.
Until we actually figure out our own brains, it's probably a really bad idea to work too hard at trying to replicate in digital autonomy that which we have yet to sufficiently explain about ourselves.