Artificial Consciousness

Philosopher David Chalmers on artificial intelligence in movies, consciousness of computers and moral rights of artificial intelligence systems

September 8, 2016

Can we create an artificial person? Can a computer feel and think like a real person? When will artificial intelligence systems acquire ethical rights? Professor of Philosophy David Chalmers describes the different views on the problem of artificial consciousness.

One of the deepest questions in philosophy and science is whether you can create an artificial person, an artificial intelligence that has an artificial mind. In the current discussion this is really formulated as a question about computers. Could you program a computer so that it literally has a mind, so that it thinks, feels, perceives, experiences emotion and, especially, is conscious? This is sometimes depicted in the movies: everything from Star Wars, with the droids C-3PO and R2-D2, and Star Trek, with the android Commander Data, to more recent films like Her, where Scarlett Johansson plays the operating system Samantha, who is depicted as a conscious and feeling person.


One could actually imagine replacing my neurons with silicon chips that perform the same function, and then asking what happens to my consciousness. If the silicon chips function well enough, I think you can argue that my consciousness will stay the same. We could even replace these neurons with little people who perform the same function, and again, I think you can argue that the consciousness will stay the same. For these reasons I believe that artificial consciousness is possible. If you get the elements of a computer system hooked up in the same complex ways that we have neurons hooked up inside the brain, then it will be conscious.

Right now we have some simple artificial intelligence systems, but no one feels that they have moral rights so far. When do they acquire them? The first thing I think is required for them to have moral status as subjects is for them to be conscious. Right now most people think they are not conscious, in which case they don’t have rights. But once they eventually have some kind of consciousness, maybe they acquire some rights. Maybe not very much consciousness, maybe it’s like that of a fish or a chicken, but some consciousness. Then maybe we think they have some moral status, but not as much as humans. If they eventually have the same level of consciousness that humans have, though, I would say they have the same moral status that humans have.

Professor of Philosophy, New York University; Director of the Centre for Consciousness, Australian National University