Consciousness: Is It Possible?

Have Homo sapiens only developed intelligence to the extent that it was biologically economical given our circumstances? How is that measured?

Yes we have, because although we compete with other species we also compete with each other. The result is a cycle of escalation in intelligence which both gives rise to, and is enhanced by, artificial factors such as culture, technology, and the ability to tap into new energy resources (e.g. agriculture, fossil fuels).

I don't know exactly how it can be measured. Perhaps by comparing how many calories are required to sustain higher intellectual functioning in an organism with the additional calories and reproductive success that such functioning can deliver?
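Reading that as a rough cost/benefit ratio, here is a minimal sketch of how one might write it down. The function name and every number are invented for illustration, and reproductive success is left out for simplicity.

```python
# Back-of-the-envelope sketch of the trade-off suggested above.
# All names and numbers are hypothetical; nothing comes from real data.

def intelligence_payoff(brain_upkeep_kcal: float, extra_kcal_gained: float) -> float:
    """Ratio of extra calories obtained through smarter behaviour to the
    calories spent maintaining the extra neural tissue. A ratio above 1.0
    suggests the investment pays for itself (reproductive success would be
    a second term one would also want to fold in somehow)."""
    return extra_kcal_gained / brain_upkeep_kcal

# e.g. a brain costing ~400 kcal/day that improves foraging by ~600 kcal/day
print(intelligence_payoff(400.0, 600.0))  # -> 1.5
```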
 
How do you measure intelligence itself? I do not see humans as any more intelligent; they are more efficient and have more knowledge. In fact they have more than likely abandoned some intelligence, because methodology only allows for objective reasoning.
 
It depends what aspect(s) of intelligence you are talking about. Puzzles and problem-solving tests can measure important aspects of intelligence such as pattern recognition capability, analytical skills, and abstraction ability. IMO intelligence is the ability to acquire and apply knowledge towards useful ends, including the creation of new useful knowledge.

I do not agree that humans have abandoned some intelligence simply because objective reasoning happens to be a particularly successful intellectual strategy. Some modern cultures might devalue intelligence (I'm looking at you, United States of Kardashians), but it takes a lot longer than a couple of generations for humans to actually "abandon" intelligence which has evolved over millions of years.
 
Devil's advocate. How do we know we're actually conscious? We might have seeing-eye points of reference for our brains but that doesn't mean our flashes of emotion and insight are any more meaningful than inanimate objects. You take away the right parts and the person ceases to exist.
 
It depends what aspect(s) of intelligence you are talking about. [...] I do not agree that humans have abandoned some intelligence simply because objective reasoning happens to be a particularly successful intellectual strategy. [...]

Is there a difference between being able to think at all and being able to think more efficiently? At what point in the evolutionary process would one place the threshold below which there is no thought at all? At what level of complexity is thought irrelevant?

I don't think we abandon any intelligence via evolving; it seems to be via choice. Nor do I see intelligence itself evolving; rather, human choices have "evolved" and sought out more efficient ways of doing things. Evolution is the product of genetic error. It is how we deal with that error that determines where we go next.

Devil's advocate. How do we know we're actually conscious? [...]

It seems to me that the ability to rationalize would be a key factor.
 
Yes we have, because although we compete with other species we also compete with each other. [...]
Intra-species competition is pretty widespread outside of organisms which are almost entirely solitary.

So the question is why did human beings especially develop "intelligence" (whatever that may mean) to the extent of culture, technology (however you might define these too) etc.; and other organisms didn't?
 
Devil's advocate. How do we know we're actually conscious? [...]
I don't think being conscious makes us "meaningful". Rather, I think that it's a threshold for the application of equal standing to others. For instance, since we value our own lives, we should recognize that any conscious being has the same claim to a right to life as we do.
It seems to me that the ability to rationalize would be a key factor.
Computers can be made to make decisions and draw conclusions using the same kind of logic that people do. It's easy enough to do because logic is very well understood. The only tricky part is to teach computers enough information to apply that logic to.
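As a concrete illustration of the kind of rule-based reasoning described in the post above, here is a minimal sketch in Python; the facts and rules are toy examples of my own, not anything proposed in the thread.

```python
# Toy forward-chaining: keep applying if-then rules to known facts until
# nothing new can be derived. Facts and rules here are invented examples.

facts = {"it_is_raining", "i_am_outside"}
rules = [
    ({"it_is_raining", "i_am_outside"}, "i_will_get_wet"),
    ({"i_will_get_wet"}, "i_should_seek_shelter"),
]

def forward_chain(facts, rules):
    """Fire every rule whose premises are all known, until a fixed point."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(forward_chain(facts, rules))
# As the post says, the hard part is supplying enough facts and rules.
```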
 
Computers can be made to make decisions and draw conclusions using the same kind of logic that people do. [...]
But a computer would not drive drunk or develop "beer goggles" when looking for love in all the wrong places.
 
Not irrational enough, and they will lack passion. Now, a badly programmed AI might do stupid things, but not out of passion or silliness.
 
Why couldn't they be programmed to be irrational or passionate?

Can you program passion? Or love? I suppose you can only program rational irrationality, as can be seen in the behaviour of the AI in the CIV games...
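For what it's worth, one way that "rational irrationality" could be sketched is an agent that normally maximizes payoff but sometimes lets a mood parameter override the calculation. The option names, payoffs, and the anger parameter below are all assumptions of mine, purely for illustration.

```python
# Sketch of "rational irrationality": usually pick the best-scoring option,
# but with probability `anger` ignore the scores and act on impulse.
# All names and numbers below are illustrative only.

import random

def choose(options: dict, anger: float = 0.0) -> str:
    """options maps action name -> expected payoff; anger in [0, 1] is the
    chance of discarding the payoffs and picking at random instead."""
    if random.random() < anger:
        return random.choice(list(options))   # impulsive, "irrational" pick
    return max(options, key=options.get)      # cold cost/benefit pick

print(choose({"make_peace": 0.8, "declare_war": 0.3}, anger=0.6))
```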
 
A lot of human irrationality (barring that due to evolutionary instincts) comes from the poor interface we have with our emotional senses. A LOT of our emotional senses come from the hormonal interface between our body and our brain, and we do have sensory neurons in the region of our brain that interfaces with our hormonal responses. But, unlike our other senses (vision and hearing in particular), we're not very well attuned to our 'emotional' senses. We have a hard time interpreting and predicting them, and so a lot of our responses seem irrational.

"Why did you duck?"
"I saw a ball flying towards me"
[easy]

"Why were you so chipper today, but pissy now?"
"Not sure, probably because I was thinking about my -ex"
[tougher]
 
Can you program passion? Or love? [...]
Sure, why not?

Anything that can be explained and predicted can be programmed. You just need a good model of when people get passionate and what they do. Jealousy, rage, embarrassment, disgust: we have a model of when people feel these things and how they act when they do. If we didn't, then we wouldn't be able to recognize them in other people, and the task of writing convincing AI would actually be easier, because it would be less important to get those things right.

The Sims is a good commercial example of exactly this in action. The Sims are simpler than real people, in part because having too many knobs would affect gameplay. An attempt whose sole goal was to make AI would be more thorough. But for emotions, the model would likely be the same: different parameters that change through the things a person does or that happen to them.

There are also various pet robots that can be convincing enough for people to feel some empathy for. The technology isn't great yet, but there are no fundamental obstacles to improvement.
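To make the "parameters that change through events" idea concrete, here is a minimal sketch; the emotion names, event names, and numeric nudges are assumptions of mine, not how The Sims or any pet robot actually implements it.

```python
# Toy parameter-based emotion model: a few numeric knobs that events push
# up or down, clamped to [0, 1]. Event names and deltas are invented.

from dataclasses import dataclass

@dataclass
class EmotionalState:
    joy: float = 0.5
    anger: float = 0.0
    embarrassment: float = 0.0

    def apply(self, event: str) -> None:
        """Nudge the parameters according to what just happened."""
        effects = {
            "complimented": {"joy": +0.2, "embarrassment": +0.1},
            "insulted":     {"joy": -0.2, "anger": +0.3},
        }
        for emotion, delta in effects.get(event, {}).items():
            value = getattr(self, emotion) + delta
            setattr(self, emotion, min(1.0, max(0.0, value)))

state = EmotionalState()
state.apply("insulted")
print(state)  # anger rises, joy drops
```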
 
A lot of human irrationality comes from the poor interface we have with our emotional senses. [...]
That's true, but in the end we can analyze emotional responses and break them down into more fundamental causes. The task of writing AI does not require being able to predict the behavior of a particular person; it only needs to produce behavior within the plausible range of human responses. So you don't need accuracy.
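A small sketch of that "plausible range" point: instead of predicting one person's exact reaction, sample a reaction from a rough distribution. The noise level and the behaviour thresholds below are invented for illustration.

```python
# Sample a reaction from a plausible range rather than predicting it exactly.
# The noise level and behaviour thresholds are made-up illustrative values.

import random

def plausible_reaction(provocation: float) -> str:
    """Draw an anger level around the provocation, then map it to behaviour."""
    anger = random.gauss(provocation, 0.2)  # noisy, not an exact prediction
    if anger > 0.7:
        return "snaps back"
    if anger > 0.3:
        return "goes quiet"
    return "laughs it off"

print(plausible_reaction(0.5))  # any of the three, with plausible weighting
```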
 
Emotions define human experience. Why would one want to subject an artificial being to such emotions? I realize it would be so that it could relate to us, but there is a lot of other baggage that would just put such an artificial being in a constant "depressed" state.
 
Probably someone will do it just to show they can.
 
Why would one want to subject an artificial being to such emotions? [...]

Because emotions are fantastic things. Yes, they can be negative, but overall, the positives from them outweigh those for me. Joy, love etc. These are things that I would want AI to experience simply because they are so wonderful, and I think that everyone, everything, deserves to enjoy them. I don't want AI to be uncaring zombie slaves. I want them to be thinking, feeling individuals like us.
 
Why would one want to subject an artificial being to such emotions? [...]

Well, we create laboratory mice that suffer from, e.g., schizophrenia symptoms or depression symptoms in order to learn more about the disease. I'd certainly not say that this is moral, but we do it with the end goal of improving the human condition.
 
So the question is why did human beings especially develop "intelligence" (whatever that may mean) to the extent of culture, technology (however you might define these too) etc.; and other organisms didn't?
I don't know what current thinking is among professionals on this, but it seems reasonable that a slight increase in intelligence conferred a selective advantage on some mutants over others.

Perhaps it was sexual selection, where female choice and male preference converge to result in a strikingly fast increase in the mental sophistication of offspring.

Or perhaps the increase in intelligence was simply an emergent result of larger cranial capacity that wasn't selected against.

Or perhaps none of the above. But I don't think our particular evolutionary path is the only way to consciousness or intelligence.
 