I know we just discussed robots, but you know I'm a robo-lover. And I just watched a program on the Science Channel about robots and sentience. (Plus I am dissatisfied with how both Transcendence and Her handled the issue. Both movies were just dull. Dull. Dull. Dull. And the issue could have been handled very thoughtfully.)
Anyway, the question that seems to come up in these movies and this show is, if these robots don't know what it is to be human, how can they empathize with humans? What reason would they have not to destroy us? In real life, over the last five years or so, some companies have experimented with programming Asimov's Laws of Robotics into robot minds. It hasn't really worked. The problem, people say, is that robots don't have the same values we do.
They're like bald barbers--they don't really care about your hair.
I'm just kidding. I'm sure bald barbers care about hair. It's the male gynecologists you have to watch out for.
But all this assumes the robots would act in a human way. That is, people fear that if robots develop sentience, they will realize they are better than us and just do away with us. Which may be what a human would do, but we don't know how a robot would react. The argument is that humans are inefficient and not as durable and not as smart as robots, so the robots would say, "Hey, let's not clutter up our planet with all these inefficient, stupid, breakable things. That's just logical."
And maybe it is logical. But lots of machines are not logical. They may be mostly logical, but they can be really random sometimes, too. You know what I'm talking about if you've ever had to deal with office machinery, manufacturing machinery, a car, a DVR, or an iPhone.
But also, it seems like faulty if/then logic on our part. Look, if the robot can't be sentient because it is acting out of pure logic, with no emotion, then even if it sees humans as inefficient, stupid, and not durable, there's no reason for robots to destroy humans. Why would they? That's not logical. We don't eat the same food. We don't breathe the same air. We're not taking all the robot women. Why would they want to kill us off?
Wouldn't they have to have some sort of emotion, some jealousy or anger or disgust toward humans, in order to decide to kill us off?
And if there is emotion, then why can't robots empathize with humans? (Unless we are saying robots only have negative emotions. That seems a little racist, doesn't it?) Look, when you go to a bald barber, he doesn't shave off all your hair because he doesn't have any, right? And your cat doesn't kick you out of the house even though it's much more efficient for him to get his own food off the shelf, tear open the bag, and feed himself without having to wait for you to come home from work and do it. And children don't kill their parents because they are no longer of any use to them.
The truth is, no one really knows if robots can be or are sentient right now. There are some pretty strong arguments in the scientific community as to whether animals are. No one knows if robots can develop emotions. Or if they should. I just find it very interesting to think about all of this.
What are your thoughts?