Hi All. I'd like to call your attention to another symposium coming up next week at The Complex Terrain Laboratory, from 30 March to 2 April. This one is on Peter Singer's new book Wired For War: The Robotics Revolution and Conflict in the 21st Century (Penguin, 2009). Singer's opening remarks will be posted early Monday morning.
Not all the stuff we do at CTlab is SWJ material, but I think this one will be of interest to readers. Plenty of ink has been spilled on the subject of robotics/unmanned systems, and symposium participants will be adding a fair bit to that over the next week.
Confirmed participants include:
Kenneth Anderson (Law; American University)
Matt Armstrong (Public Diplomacy; Armstrong Strategic Insights Group)
John Matthew Barlow (History; John Abbott College)
Rex Brynen (Political Science; McGill University)
Antoine Bousquet (International Relations; Birkbeck College, London)
Charli Carpenter (International Relations; UMass-Amherst)
Andrew Conway (Political Science; NYU)
Jan Federowicz (History; Carleton University)
John T. Fishel (National Security Policy; University of Oklahoma)
Michael A. Innes (Political Science; University College London)
Martin Senn (Political Science; University of Innsbruck)
Marc Tyrrell (Anthropology; Carleton University)
Quite a few of them are active elsewhere on the web, and several are active SWC participants. Their blogs include Arms Control, Duck of Minerva, In Harmonium, MountainRunner, Opinio Juris, PaxSims, Spatialities, Third World Wired, and Zero Intelligence Agents.
I'd be especially interested to read what the COIN/CT experts here at SWC think of what Singer's written, on the issue of unmanned systems in general, and on whatever we manage to come up with during our event. For quick post tracking, proceedings will be compiled and indexed here. All comments welcome, whether at CTlab or here at SWC.
Mike
If I finish the book this weekend I'm planning on carpet bombing the CTLAB. If not I'll drink more Tequila. Options are good.
Sam Liles
Selil Blog
Don't forget to duck Secret Squirrel
The scholarship of teaching and learning results in equal hatred from latte leftists and cappuccino conservatives.
All opinions are mine and may or may not reflect those of my employer, depending on the chance it might affect funding, politics, or the setting of the sun. As such, these are my opinions; you can get your own.
Frakin’ Cool and Winning Wars (SWJ Book Review)
by Robert L. Goldich
Frakin’ Cool and Winning Wars (Full PDF Article)
After Operation Desert Storm in 1991, there was a fusillade of remarks about how American technological superiority was the decisive factor in how we won the war. General H. Norman Schwarzkopf would have none of this. He stated that although our weapons and equipment were indeed technologically superior to those of the Iraqis, we would have won the war if we had had their equipment, and they had had ours. P.W. Singer would have done well to ponder this remark at some point in the researching and writing of Wired for War.
As the Middle East Editor of Unmanned Vehicles, the premier journal in the field, for which Pete Singer provided an article, I will follow this with some interest.
Personally, I just don't see the complexity in applying robotics to war. The problem tends to be rooted more in ignorance and aspiration. The operational problems are generally solvable (as I have said in print) given simple and robust conceptual guidance, which we are not currently short of.
Ground-crawling robots proved decisive in very obvious ways during Cast Lead.
Infinity Journal "I don't care if this works in practice. I want to see it work in theory!"
- The job of the British Army out here is to kill or capture Communist Terrorists in Malaya.
- If we can double the ratio of kills per contact, we will soon put an end to the shooting in Malaya.
Sir Gerald Templer, foreword to the "Conduct of Anti-Terrorist Operations in Malaya," 1958 Edition
Bill, do you have a direct URL for Singer's article in Unmanned Vehicles? I can add it to the list of reference material we're compiling at CTlab.
Infinity Journal "I don't care if this works in practice. I want to see it work in theory!"
- The job of the British Army out here is to kill or capture Communist Terrorists in Malaya.
- If we can double the ratio of kills per contact, we will soon put an end to the shooting in Malaya.
Sir Gerald Templer, foreword to the "Conduct of Anti-Terrorist Operations in Malaya," 1958 Edition
Mike,
Are you going to discuss the impact of robotics when our enemies start using this technology against us? Bill
Hi Bill. An important point. If participants don't bring it up, I'll be sure to moderate it into the discussion. Thanks for suggesting it.
Mike
That point is certainly explicit in Singer's writing. If we have the technology, others do too, or soon will; it will migrate to our enemies as well as our friends. Such is the inevitable course of technological innovation. As for its implications, we'll see what we all say in the symposium.
Hat tip to you, Bill.
Cheers
JohnT
Peter Singer's opening remarks are now up, and several participant posts should be up later in the day. From Singer:
I wanted to thank Michael, Matt, and Matthew, and all the others for putting together this symposium. It is a great honor to have one’s work looked at in this way so soon after coming out.
Wired for War is a book about how something big is going on today in the overall history of war, and maybe even of humanity itself. The US military went into Iraq with just a handful of robotic drones in the air and zero unmanned systems on the ground, none of them armed. Today, there are over 7,000 drones in the US inventory and another roughly 12,000 on the ground. But these are just the first generation, the Model T Fords and Wright Flyers compared to what is already in the prototype stage. And, yes, the tech industry term “killer application” doesn’t just describe what iPods did to the music industry, but also applies to the arming of our creations with everything from Hellfire missiles to .50 caliber machine guns.
There's more. Looking forward to reading what develops in this SWC thread.
I always thought that expression was a bit light - I mean, you need to have your cake to eat it, right? Can't eat it if you don't have it, right?
Anyway.
Ken Anderson, who teaches law at American University and blogs on the laws of armed conflict at Opinio Juris, has posted a pretty provocative piece that deals in large part with the non-law response to precision targeting. Having our cake and eating it, too, is his point: increased demand for technological precision in war, matched by an increasingly negative response to what that means in practice; essentially, as Ken puts it, removing the anonymity of war and replacing it with a much narrower focus on well-identified, high-value targets. The result, I'd have to agree with Ken, is more than a little bewildering, if not downright hypocritical.
Thoughts?
Just looked through some of the posts on CT Lab, and a couple of points spring to mind.
Closely observing the one military to make extensive use of "unmanned vehicles" in a recent conflict, I can safely say that NO ONE is talking about autonomy in the terms currently being discussed, especially when it comes to lethality.
The primary uses of "unmanned systems" are pretty well codified and pretty well understood, based on recent experience. None of the items raised so far are particularly relevant to how the actual user communities see the capabilities developing.
The current areas of discussion have very little to do with law (other than ROE) and everything to do with application within the battle space, wherever that may be.
Infinity Journal "I don't care if this works in practice. I want to see it work in theory!"
- The job of the British Army out here is to kill or capture Communist Terrorists in Malaya.
- If we can double the ratio of kills per contact, we will soon put an end to the shooting in Malaya.
Sir Gerald Templer, foreword to the "Conduct of Anti-Terrorist Operations in Malaya," 1958 Edition
such a curmudgeon. Actually, I agree with you in the main, and my next post to the symposium - later this AM - will expand on what you are saying, along with Robert Goldich's review of the book posted on the SWJ the other day and some sci-fi for the fun of it.
Cheers
JohnT
While I disagree that no one is talking about autonomy, this is not the real issue. We've attempted to dismiss technology-related problems through painful parsing, excuses, and legal mumbo-jumbo before. While that may ultimately succeed in the American, or even Western, court of public opinion, it doesn't work in the places where opinions really matter.
I was wondering if the sea of fog I'm swimming in from a sinus infection had rendered me combat ineffective...
As for the answer... Where do opinions really matter?? High School of course
Hacksaw
Say hello to my 2 x 4
for bad sinuses. Sinii???
just as soon as I can escape my four walled cell
Hacksaw
Say hello to my 2 x 4
Ken,
The public opinion I'm referring to is the war of perceptions: the struggle for minds and wills over both the indigenous population and a global audience that may be antagonistic to, or conversely supportive of, the mission.
When we discuss technology in all its forms, and the perceptions of those whom this tech will be aimed at helping or hurting, would it not be important that we not put the cart before the horse?
What I mean is, if we take for example tech systems that use behavioral analysis to explore possible points of contention or the probability of a given thing happening (bot searches, anti-spam, anti-virus, social network mapping, etc.), these should be met with a high level of skepticism when accepted as singular solutions. Each requires, somewhere in the process, that
"DUH" check: only a human being can actually give the final approval or denial of its results.
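(As a minimal sketch of that kind of human-in-the-loop gate, assuming a generic flagging tool - the names and the scoring heuristic below are hypothetical, not any real product's API - the automated screen can flag whatever it likes, but nothing is acted on until a person says yes.)
```python
# Hypothetical sketch: automated analysis flags items, a human makes the final call.
from dataclasses import dataclass

@dataclass
class Finding:
    item: str      # the thing the automated tool flagged
    score: float   # the tool's confidence that action is warranted
    reason: str    # why the tool flagged it

def automated_screen(items):
    """Stand-in for any behavioral-analysis tool (spam filter, bot search,
    network mapper). Returns findings above a flag threshold."""
    findings = []
    for item in items:
        score = 0.9 if "suspicious" in item else 0.1   # toy heuristic, not a real model
        if score > 0.5:
            findings.append(Finding(item, score, "matched toy heuristic"))
    return findings

def human_review(finding: Finding) -> bool:
    """The 'DUH' check: final approval or denial rests with a person."""
    answer = input(f"Act on '{finding.item}' (score {finding.score:.2f}, "
                   f"{finding.reason})? [y/N] ")
    return answer.strip().lower() == "y"

if __name__ == "__main__":
    queue = ["routine traffic", "suspicious packet burst"]
    for finding in automated_screen(queue):
        if human_review(finding):
            print(f"APPROVED by human: acting on {finding.item}")
        else:
            print(f"DENIED by human: no action on {finding.item}")
```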
This is not to say that they aren't effective on their own; they can and do identify, and at least minimize, issues related to the realm they monitor.
The downside is, I don't know about the rest of you, but I'll be darned if I can get most of my family or friends to actually leave them turned on, because they get in the way of how easily they can interact with the world through the internet.
The same generally goes for more advanced robotic applications used to replace human beings across a great variety of missions. Sure, those robots at the factory can build a whole lot more cars each day than humans could, and may be more accurate, but on the flip side, if one of them gets set wrong, everything has to stop to get it straight before the whole line starts again; or, even worse, nobody notices and 6,000 cars hit the road with a defect that may cost lives.
Take that to the next step and apply it to UAVs, ground systems, etc. A disposable bot to check for a bomb? Great. But a bot that's supposed to determine whether a house is dangerous or not, or a human? What are you going to do when all the right elements for a concoction exist in a small enough area that the bot presumes they're there to make what could be made from them, and simply disposes of it without asking?
Long and short, it's the same argument you've probably heard a million times, but it seems worth restating: they may help you do what you do better, or even do it faster and more effectively, but that means they can also screw it up faster and more effectively than you ever could.
Until we actually figure out our own brains, it's probably a really bad idea to work too hard at trying to replicate, in digital autonomy, that which we have yet to explain sufficiently about ourselves.
Any man can destroy that which is around him, The rare man is he who can find beauty even in the darkest hours
Cogitationis poenam nemo patitur ("no one suffers punishment for his thoughts")