Thursday, March 17, 2011

The Singularity -or- Why I'm not afraid of the coming robot holocaust

[Image: A little creepy]
Artificial intelligence! It's an exciting subject for me, since I like to write a bit of quirky sci-fi on the side. But more than artificial intelligence itself, I want to talk about our (some say inevitable) future robot holocaust. Given how heavily we rest on technology, if that technology became sentient, isn't it reasonable to think it would take the very short step from surrounding us to ruling us, destroying us, or turning us into human batteries?

In my mind, the fear of doomsday-through-artificial-intelligence is fed by a misconception about the nature of intelligence. Human beings are different from the rest of the animals on the planet in two major ways. First, we are, in some very special ways, the most intelligent animals around. Second, we rule the world. We lord over this place like a king ape, with our big scepter and crown, making the plants and animals bend to our needs. So it's only natural that we would get nervous when something potentially more intelligent than us comes onto the scene. It's not hard to imagine that, if we start creating slaves that are stronger and smarter than we are, we could end up being the next endangered species.

But keep in mind that intelligence is not the same as a wish to rule the world, or even a wish to be free from bondage.

Let's think about Data, the humanoid robot from Star Trek. Data was intelligent, but without emotions. At least, he was supposed to be. Watching that show as an adult, though (which I did one time, really), I realize that Data did have emotions. Because if someone, or something, is truly without emotion, then they will never move from one spot. If I lost all of my emotion right now, I wouldn't be driven by my desire to spread my ideas, so I would stop typing this blog. I wouldn't have any reason to hold my bladder, because I wouldn't fear the consequences of peeing my pants while sitting here. I wouldn't get up and eat, because I wouldn't feel discomfort at the sensation of hunger, nor displeasure at the feeling of wasting away. Every move we make is, at its root, driven by an emotion. We feel the emotion, and then use our intelligence to decide how to accommodate it. This is always running in the background. If Data didn't have any emotions, he would never have gotten out of the crate he was shipped in.

[Image: Google's fancy self-driving car]
So, how intelligent could you make a machine before it hit you in the face and took your wallet? As intelligent as we wanted. In fact, according to Steven Levy, author of a Wired magazine article that I enjoyed, we've already got artificial intelligence: there are computers that, in very specialized ways, think faster and better than humans. Jeopardy! champion Watson comes to mind.

Well, where does this leave my philosophy on emotion? Why aren't these emotionless machines sitting and rotting, as opposed to vacuuming our floors and driving our future cars? Well, it seems to me that these machines do have emotions. Their emotions are very few, and very simple, but they are there. A Roomba is driven to vacuum all the time, and it uses its intelligence to figure out how to do it. Watson is driven to answer Jeopardy! questions, and it uses its intelligence to figure out how to do it. We are driven to avoid spoiled food, and to have sex with sexy people, and to eat pizza, pizza, pizza all day long, and we use our own intelligence in these pursuits. Simple emotions for simple machines, and uber-complex emotions for uber-complex machines like ourselves.
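The division of labor I'm describing — drives supplying the "why," intelligence supplying the "how" — can be sketched as a toy agent loop. Everything here (the drive list, the plan lookup) is my own invention for illustration, not how any real Roomba or Watson actually works:

```python
# Toy sketch of the "emotion drives, intelligence decides" model.
# A drive is just a name plus an urgency; the agent picks its most
# urgent drive and uses its "intelligence" (a plan lookup) to act.

def most_urgent(drives):
    """Return the most urgent drive, or None if there are no drives."""
    if not drives:
        return None
    return max(drives, key=lambda d: d["urgency"])

def step(drives, plans):
    """One tick of the agent loop: feel, then think, then act."""
    drive = most_urgent(drives)
    if drive is None:
        # No emotions at all: no reason to do anything.
        return "sit motionless in the crate"
    return plans.get(drive["name"], "improvise")

# A Roomba-like agent: one simple drive, one simple plan.
roomba_drives = [{"name": "clean", "urgency": 1.0}]
roomba_plans = {"clean": "vacuum the floor"}

print(step([], roomba_plans))             # no drives: Data in his crate
print(step(roomba_drives, roomba_plans))  # driven to clean: it vacuums
```

Note that no amount of cleverness added to `step` changes what the agent wants; only editing the drive list does.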

My point is that, if computers wanted to rule the world, someone would have to program that desire into them. If they wanted to enslave humanity, some geek would have to spend many sleepless nights figuring out the easiest, most bug-proof way to enslave himself and his species-peers. It's not something that would happen automatically. It's far from a foregone conclusion.

Just as with the idea of alien life, we humans tend to think of intelligent machines in human terms, as if humanity is something you'll reach if you just keep adding virtual neurons. But we're not the product of virtual neurons. We're the product of millions of years of selective pressure under particular environmental and social conditions. Nobody thinks a virtual brain will automatically generate the personality of a crow, or a lemur, yet loads of people assume that a human's drives will spontaneously arise in a complex-enough computer.

No.
