Isaac Asimov inspired roboticists with his science fiction and especially his robot laws. The first one says:
A robot may not injure a human being or, through inaction, allow a human being to come to harm.
Artist and roboticist Alexander Reben has designed a robot that purposefully defies that law.
"It hurts a person and it injures them," Reben says. His robot pricks fingers, hurting "in the most minimal way possible," he says.
And the robot's actions are unpredictable — but not random. "It makes a decision in a way that [I] as the creator cannot predict," Reben says. "When you put yourself near this robot, it will decide whether or not to hurt and injure you."
Though it may seem like a slightly silly experiment, Reben is making a serious point: He's trying to provoke discussion about a future where robots have the power to make choices about human life.
Reben's robot is not very elaborate. It's just a robotic arm on a platform, smaller than a human limb and shaped a bit like the arm on one of those excavators they use in construction — but instead of a shovel, the end holds a needle. (And in case you were wondering, each needle is sterilized.)
"You put your hand near the robot and it senses you," Reben explains. "Then it goes through an algorithm to decide whether or not it's going to put the needle through your finger."
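Reben has not published his robot's decision algorithm, so any code can only be a guess. Still, the idea of a choice that is deterministic yet unpredictable even to its creator can be sketched: in this illustrative toy version (all names and the seeding scheme are assumptions, not Reben's design), microsecond-level sensor timing — which nobody can foresee in advance — is fed through a chaotic logistic map before a simple threshold is applied.

```python
# Illustrative sketch only: this is NOT Reben's actual algorithm, which is
# unpublished. It shows one way a decision can be fully deterministic yet
# unpredictable to the designer.

def decide_to_prick(detection_time_us: int, iterations: int = 50) -> bool:
    """Return True if the (hypothetical) robot decides to prick the finger."""
    # Seed the map with the sub-second microseconds of the detection event;
    # fall back to a fixed value if the seed would be exactly zero.
    x = (detection_time_us % 1_000_000) / 1_000_000 or 0.123
    # Iterate the logistic map in its chaotic regime (r = 3.99): tiny
    # differences in the seed produce wildly different trajectories.
    for _ in range(iterations):
        x = 3.99 * x * (1.0 - x)
    return x > 0.5

# The same timestamp always yields the same answer, but the extreme
# sensitivity to the seed makes the outcome impossible to predict by eye.
print(decide_to_prick(437_291))
```

The design point this illustrates is Reben's distinction: the robot is "unpredictable — but not random." Nothing here rolls dice; the unpredictability comes entirely from sensitivity to inputs the creator cannot control.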
I put my finger beneath the arm. The waiting is the hardest part, as it swings past me several times. Then I feel a tiny sting when it finally decides to prick me.
Reben created this robot because the world is getting closer to a time when robots will make choices about when to harm a human being. Take self-driving cars. Ford recently said it planned to mass-produce autonomous cars within five years. This could mean that a self-driving vehicle may soon need to decide whether to swerve into a tree, risking its driver, or hit a group of pedestrians.
"The answer might be that 'Well, these machines are going to make decisions so much better than us and it's not going to be a problem,' " Reben says. "They're going to be so much more ethical than a human could ever be."
But, he wonders, what about the people who get into those cars? "If you get into a car do you have the choice to not be ethical?"
And people want to have that choice. A recent poll by the MIT Media Lab found that half of the participants said they would be likely to buy a driverless car that put the highest priority on protecting its passengers. But only 19 percent said they'd buy a car programmed to save the most lives.
Asimov's fiction itself ponders a lot of the gray areas of his laws. There are four in all; the one added later comes first, as the zeroth law:
0. A robot may not harm humanity, or, by inaction, allow humanity to come to harm.
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
In Asimov's stories, the laws are often challenged by the emotional complexities of human behavior. In a screenplay derived from his famous I, Robot, the protagonist is a detective who distrusts robots because one saved him in a car crash but let the girl beside him die, based on a statistical determination that she was less likely to survive.
Still, Asimov's laws are often cited by scientists in the field as a kind of inspiration and talking point as we move toward a world of increasingly sophisticated machines.
"The ability to even program these laws into a fictional robot is very difficult," Reben says, "and what they actually mean when you really try to analyze them is quite gray. It's a quite fuzzy area."
Reben says the point of making his robot was to create urgency — to put something in the world now, before self-driving cars and other machines actually wield that power.
"If you see a video of a robot making someone bleed," he says, "all of a sudden it taps into this viral nature of things and now you really have to confront it."