Will we ever be able to explain consciousness, and can we duplicate it artificially?

atreas said:
I much prefer kingjoshi's position: he speaks clearly from a technical point of view. From that standpoint both perception and learning can be achieved by material means. The problem is that you still don't have consciousness that way, though surely you can emulate some sort of consciousness - at least until we reach the domain of "feelings", "will", or "fantasy".
When I simulate a tornado in a computer, have I created a tornado? Most would say obviously not. When a computer does mathematics, is it thinking? What constitutes 'thinking' in someone else? How do we determine it?

When we create a robot that knows about its 'body', can learn, and has the ability to 'think' and overcome obstacles, do we have a self-aware creation? There are humans who aren't good with feelings. Some are even on this board. Studies have shown which parts of the brain are involved in processing them. Do people with severe forms of autism and other mental differences lack consciousness?

Are people who are more creative more conscious or sentient? We not only have software that can do mathematical and logical proofs, we have software that can create music, visual art, etc., with no human interaction required except to feed the system knowledge of other music.

There have been robots built to simulate feelings. Researchers have worked on robotic facial expressions that mimic humans (lip movement, eyebrows, etc.). Observers feel 'spooked' because they don't expect robots with such capabilities.

So when we eventually create a robot with all of these in one (when our hardware and software get there), what will it have to do to actually prove sentience or consciousness?
 
Will we ever be able to explain consciousness, and can we duplicate it artificially?

My God, I hope not.
 
People make fun of George Bush and his (lack of) clarity of speech. But language is one of the most difficult parts of intelligence to handle. Just reaching his ability would be a major breakthrough.

Language capabilities in humans are quite fascinating. There are people who have no grasp of what they're talking about, and the words they say make no sense, yet their speech is completely grammatical. Others can have complete understanding but lack the ability to actually form the sentence. This goes beyond speech to the foundations of grammar and our ability to learn language itself.

Communicating with a conscious being is essential in determining their consciousness. Any being we create artificially will need complex capabilities of language, beyond just acknowledging yes or no.
 
kingjoshi said:
Communicating with a conscious being is essential in determining their consciousness. Any being we create artificially will need complex capabilities of language, beyond just acknowledging yes or no.
Why? If consciousness is material and can be created out of electrochemical processes and plastic, what are the characteristics that need to manifest for it to be consciousness? Are plants conscious? Or only humans?
 
Birdjaguar said:
Why? If consciousness is material and can be created out of electrochemical processes and plastic, what are the characteristics that need to manifest for it to be consciousness? Are plants conscious? Or only humans?
How do we know we don't create consciousness every time software is run? What behaviors are sufficient? Is the level of consciousness of a human baby different from that of an adolescent or adult? Any human gets the benefit of the doubt because we're human and we assume others to be similar to us. But we treat other creatures, and especially artificial creations, with greater skepticism. The best way to determine it is through communication.
 
kingjoshi said:
How do we know we don't create consciousness every time software is run? What behaviors are sufficient? Is the level of consciousness of a human baby different from that of an adolescent or adult? Any human gets the benefit of the doubt because we're human and we assume others to be similar to us. But we treat other creatures, and especially artificial creations, with greater skepticism. The best way to determine it is through communication.
You didn't answer my questions. ;) As I have said at least twice in this thread, without a definition of some sort, we are all talking past one another. Since no one seems willing to go out on a limb I will state mine:

An entity is conscious if under "ordinary" circumstances it can/does respond to changes or events that take place outside of the entity itself.
 
For an experiment or observational study, a term has to be objectively defined. For an argument, a specific definition is needed. For a discussion, a definition is not necessarily needed; it can be built as the discussion wears on. The complexity of the issue makes general concepts of the term sufficient for discussion. We don't have specific definitions of intelligence, consciousness, human, or race, but we can still have fruitful discussions.

Birdjaguar said:
An entity is conscious if under "ordinary" circumstances it can/does respond to changes or events that take place outside of the entity itself.
Every system in the physical world responds to stimuli. Are you saying that any entity that initiates behavior in response to events in the environment, beyond direct physical stimuli, is conscious?

For example, there is something called 'swarm intelligence'. One of the ways members of a swarm communicate is not through a direct link, but by leaving messages for others by modifying the environment, whether by depositing chemicals/odors or something else. Would the next creature that comes by and reacts to them be considered conscious?
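This indirect style of communication is usually called stigmergy. A minimal sketch in Python, under my own toy assumptions (the grid size, and the hypothetical helpers `deposit` and `step` are mine, not anything from the thread): the first agent never contacts the second; it only marks the environment, and a later agent reacts to the marks it finds.

```python
import random

# Toy stigmergy sketch: agents communicate only by modifying the
# environment (a "pheromone" field), never by direct contact.
GRID = 10
pheromone = [[0.0] * GRID for _ in range(GRID)]

def deposit(x, y, amount=1.0):
    """An agent modifies the environment at (x, y)."""
    pheromone[x][y] += amount

def step(x, y):
    """A later agent reacts only to local pheromone, not to other agents."""
    neighbors = [(x + dx, y + dy)
                 for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]
                 if 0 <= x + dx < GRID and 0 <= y + dy < GRID]
    # Follow the strongest trail; wander randomly if the ground is blank.
    best = max(neighbors, key=lambda p: pheromone[p[0]][p[1]])
    if pheromone[best[0]][best[1]] > 0:
        return best
    return random.choice(neighbors)

# One agent lays a trail along the column x == 0; a second agent,
# starting nearby, is drawn toward the marked cells.
for y in range(5):
    deposit(0, y)
pos = step(1, 2)
print(pos)  # -> (0, 2): the agent moves onto the marked trail
```

The point of the sketch is kingjoshi's question in miniature: the second agent "reacts to the environment beyond direct physical stimuli," yet nothing here looks like consciousness.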

I mean, every robot does that. Is Roomba, the robot vacuum cleaner, conscious?
 
kingjoshi said:
So when we eventually create a robot with all of these in one (when our hardware and software get there), what will it have to do to actually prove sentience or consciousness?

Be isomorphic to paradigm sentient beings, on a very fine-grained level. Its "neural networks" will have to work just like our neural networks. Gross behavioral similarity is never enough.

If you want to know how well a car can drive, watch it zoom down the road. If you want to know whether it's got internal combustion going on, you have to look under the hood nowadays: these fuel cell thingies are getting pretty sophisticated.

"Intelligence" is like driving, strictly a performance issue. "Pain," "pleasure," "anger," "love," "sweet sensation", "sour sensation" - these are under the hood. Like internal combustion powering a car, each of these aspects of consciousness is but one of many possible ways to achieve a similar outward result.
 
kingjoshi said:
Every system in the physical world responds to stimuli. Are you saying that any entity that initiates behavior in response to events in the environment, beyond direct physical stimuli, is conscious?
Yes. An entity that is self-aware enough to respond has some degree of consciousness. The choice of responses may be limited, though.
kingjoshi said:
For example, there is something called 'swarm intelligence'. One of the ways members of a swarm communicate is not through a direct link, but by leaving messages for others by modifying the environment, whether by depositing chemicals/odors or something else. Would the next creature that comes by and reacts to them be considered conscious?
Yes.
kingjoshi said:
I mean, every robot does that. Is Roomba, the robot vacuum cleaner, conscious?
I'm thinking about this and will post something later. It is an interesting question, because if Roomba is conscious then so are other man-made items, and I want to think about whether there are any "edges" to consciousness - and if there are, how one can define them.

Again, in your mind: Are plants conscious? Or only humans?
 
kingjoshi said:
Communicating with a conscious being is essential in determining their consciousness. Any being we create artificially will need complex capabilities of language, beyond just acknowledging yes or no.
Very true - communication (and language) is a necessary step. But is it adequate?

I think that you are referring to the famous "Turing test" of observed results, rather than internal processes. Still, to be fair, we should also include in the discussion another famous argument on this subject: the "Chinese Room" thought experiment, which tries to show that communication doesn't necessarily mean understanding, and without understanding we have a big problem claiming something is conscious (or intelligent, depending on the angle you want to look at it).

Chinese room
 