Cindy Grimm and Bill Smart: Robotics

In his 1942 short story, “Runaround,” Isaac Asimov introduced his three laws of robotics:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey the orders given it by human beings, except where such orders would conflict with the first law.
  3. A robot must protect its own existence as long as such protection does not conflict with the first or second law.

In this episode of the Plutopia podcast, Cindy Grimm and Bill Smart explain why Asimov’s laws miss the mark as rules for real-world robots.

Bill Smart:

Asimov’s three laws, everyone brings them up. They were a plot device by a guy – who had never seen a robot – to drive a story. If you think about it for five seconds, they seem like a reasonable three things to come up with. If you think about it for ten seconds, they fall apart, of course. We do need structures, and we do need a way of thinking about the ethics, morality, and social impact of robots. But I think it’s way more nuanced – way, way, way more nuanced – than you could compile into a paragraph of text.

Cindy Grimm:

I mean, even something simple like “a robot should always do what you tell it to do, it should always get out of your way” might not actually make sense. If the robots are all trying to get out of your way, but there’s a needed delivery or they actually need to get somewhere, then maybe the robot does actually need to take priority over the humans. You can’t really have those conversations when you start from the place of, you know, all robots should kowtow to humans.

Cindy Grimm is an American computer scientist, roboticist, and mechanical engineer. Bill Smart researches robotics and machine learning. Both are professors in the College of Engineering at Oregon State University.
