
Speakers 10/9: Nathan Shedroff

His talk was mostly lighthearted, which was nice, but as far as I’m concerned, his work raises some serious questions. Drone and robot warfare, the stuff of countless science fiction stories, is a life-threatening issue for our time. Yet weapons went largely unmentioned in his talk until the Q&A period. Hoping to steer the conversation toward something a little less fun and easy, I asked him, “Did you see any good doomsday device interfaces?” His answer referred to two different doomsday devices, both rockets with conversational interfaces. Here’s the first part of his answer to my question:

We did, actually. The funkiest, craziest, silliest one is from a film called Dark Star, it was made during the seventies, where the doomsday device, literally they were these rockets. This ship traveled around space destroying planets, and I can’t remember exactly why, maybe to make room for thoroughfares–like you would need those in space. I can’t remember why. But they had these doomsday, planet-destroying missiles, and they talked to the missiles. The missiles have not just voices, but personalities. And they have to talk this missile out of prematurely exploding and killing everyone on the ship. Because some glitch released the missile early, and the missile is all set and raring, it’s sort of like a basketball player who can’t wait to get into the game, and yet the game’s not happening, and they have to convince the missile not to fire.

Let’s consider a couple of important concepts that come to light in his response. First, the doomsday device he cited was programmed to carry out its mission, and the great fear was that the device would execute that mission in a way contrary to the operator’s intent. Second, the doomsday device had a form of failsafe: the operator could attempt to reason with it, to convince it to deviate from the execution of its mission. I dug around in The Economist just now to find the article that was the source of a quote about robot warfare I remembered reading:

One way of dealing with these difficult questions is to avoid them altogether, by banning autonomous battlefield robots… Campaign groups such as the International Committee for Robot Arms Control have been formed in opposition to the growing use of drones. But autonomous robots could do much more good than harm. Robot soldiers would not commit rape, burn down a village in anger or become erratic decision-makers amid the stress of combat.

When I read that article a few months ago, the statement struck me as a fundamentally narrow-minded evaluation of the role of humans in warfare. I wonder how many times a village was spared precisely because a human’s “erratic decision-making” led a soldier to take pity on its people. I wonder how many times an ordered execution of a prisoner did not happen because a person felt a mercy that a robot cannot feel. The true danger of these devices, which are already emerging from science fiction into reality, is that they will carry out their orders well. The true fear should be that we will not be able to reason with them. How do you beg for mercy from an unmanned drone hovering a mile above you, firing Hellfire missiles at you for a reason you will probably never know?

[This is an excerpt from a longer essay about his talk that appears on my blog at ]

3 comments to Speakers 10/9: Nathan Shedroff

  • Maria Paula

    I am also afraid of a war full of robots, but I am extremely optimistic about mankind’s awareness of non-violence in the coming decades (otherwise I would be very afraid of living). Robots should not fight our wars, and neither should we fight our wars. There should be no wars… Anyway, I want to add to your post that even Asimov’s laws may be tricky. The movie I, Robot (which is completely different from the book) is an example of what could happen if these laws are enforced too strictly and somehow misunderstood by the robots. I mean, it is quite clear that humans harm other humans all the time, and what would you do if you were a robot programmed not to allow a human being to come to harm? Go crazy, maybe? hehe

  • ktp242

    Well, I am not sure if this belongs in the forum, since it is quite funny, but it is related to Nathan Shedroff’s speech…

    Anyway, a German friend gave me this link to a sci-fi movie, Raumschiff Surprise (English title: Spaceship Surprise), which was a megahit in Germany in 2004. And yes, it is about a coming doomsday in 2304. However, the movie is a parody of Star Trek and the Starship Enterprise. Two points here, I think, may pertain to the speech. One is about sci-fi user interfaces. I remember Nathan showed us a scene from Minority Report about the drawbacks of an interface that requires people to operate it with their arms: it is cool yet tiring, and sometimes just not that usable. Raumschiff Surprise has a quite hilarious scene teasing this. The link is below (with English subtitles).

    The other point is about the plot of Raumschiff Surprise. The doomsday in the movie comes from an invasion by Martian humans, whose ancestors emigrated from Earth to Mars 300 years ago, and who in 2304 head back to attack Earth. Our heroes therefore plan to travel back in time to stop humans from emigrating to Mars. They want to prevent the key technology from being discovered, so that humans will not be able to make space travel possible. I think it is quite interesting that the movie suggests that remaining at a lower technological level is a way to stop wars. I do not entirely agree. But if we look back at our history, we will probably find that weapons are always among the first applications of new technologies. And in reverse, some new technologies, such as the internet, were invented precisely because of the needs of warfare.