Should sentient computers have "Human Rights"?

Your religious beliefs deserve no respect if they condone the wholesale slaughter of intelligent beings on the basis that they aren't human.

I can only compare it to Nazism (and that's not even invoking Godwin's law).
 

Did I ever say I support the wholesale slaughter of beings? I said "machines", which is all an artificial robot created by man would be.
 
What is the difference between a thinking machine and a being? What if the thinking machine were composed of organic materials?

You need to start actually thinking about some of these issues instead of blindly spewing misguided ideology.

And for the record, to make the claim that a particular being does not have the right to their own life IS to condone their wholesale slaughter.
 
They would also need strong debating material. If you cannot voice good reasons why intelligent beings should not have rights, then they cannot debate the topic with you, and force would be the last (but necessary) option for winning those rights.
 
Some day in the future, Mister Joe-Everyman-Smith heads down to a clinic and gets a 'cyber-brain' unit hooked up to his head. The CB is a piece of hardware which can function analogously to a mass of nervous tissue. While initially blank, the CB slowly becomes integrated into Mr. Smith's thought process. The brain extends its processing and storage abilities into the CB until the two are effectively one (just as it does when incorporating new tissue while the brain grows). In time, Mr. Smith's organic brain starts to die, but his mind is both the brain and the CB - two parts of a single greater mind.

Emphasis added - how analogous is analogous? If it duplicates the fine structure of a typical human brain, fine; but that seems likely to be very difficult and expensive. On the other hand, if the "analogy" only consists in the fact that it can process information intelligently, then I suspect Mr. Smith is going to notice some disturbing changes. For example, he might say "now that a part of my somatosensory cortex died, I can't feel my left leg, although somehow I still know when something is touching it."
 
While we're nowhere near developing such a theory, a 'theory of materialistic sentience' would go a long way toward solving this issue. Right now we operate off of empathy and 'rules of thumb', but we really could do better than that. As we learn more neuroscience, a viable theory might shake out eventually. A similar event occurred in biology, physics, economics, etc.

Quoted for truth. El Mac, did you say you're a psychologist (by training and/or employment)?
 
Further experimentation with mind/machine interfaces would be needed to determine whether people would come to 'feel' the actions of the machine in the same way as their own body and mind.
 
No. I want the option of throwing my computer out of the window when it crashes.
 
So, when it's invented, don't buy sentient software.

Just because sentient machinery is invented does not mean that all machines will be made sentient. For most tasks, it's simply not necessary.
 
I think the more important question is, should sentient humans have computer rights?
 

Kinda; I've had some psychology training, and I've been involved with studies on the mentally ill, but these are fairly minor parts of my career. Most of my formal training is in biology, not psychology.
 
It wouldn't be true sentiment; it would be more like:
if input = 20 then program = sad
That isn't real sadness. It can't feel anything; it's a simulation that resembles the actual feeling.
And what are your emotions, if not responses to a number of conditionals (the presence or absence of chemicals/nerve impulses), which are in turn responses to further conditionals (outside stimuli)?

If (Nerves[RightArm].CheckSignal("Pain") == True)
{
    Brain.Feel(pain);
}

The only difference is that the soft- and hardware of living things is less stable than machine components, often prone to errors and false responses that appear nonsensical. That, and we don't fully understand the language in which the brain operates.
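
To make that concrete, here's a minimal runnable sketch in C++ (the Stimulus fields, thresholds, and feel() function are my own invented illustration, not a real model of the brain): the 'emotion' falls out as a conditional response to input levels, which here stand in for chemicals and nerve impulses triggered by outside stimuli.

    #include <iostream>
    #include <string>

    // Hypothetical stand-ins for chemical/nerve inputs, scaled 0..1.
    struct Stimulus {
        double pain;    // nerve signal strength
        double reward;  // 'pleasure chemical' level
    };

    // Emotion as nothing more than layered conditionals over the inputs.
    std::string feel(const Stimulus& s) {
        if (s.pain > 0.5)   return "distress";
        if (s.reward > 0.5) return "happiness";
        return "neutral";
    }

    int main() {
        std::cout << feel({0.8, 0.1}) << "\n";  // prints "distress"
        std::cout << feel({0.1, 0.9}) << "\n";  // prints "happiness"
    }

Whether a conditional like that constitutes 'real' feeling is exactly what's in dispute, but structurally it has the same shape as the quoted 'if input = 20 then program = sad'.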
 
I just realized: when Microsoft gets a monopoly on the AI business, we'll be in the Matrix world.
 
I'd agree that I can't see how a computer could ever be sentient, but in that case, what about a sentient machine?
 