Should sentient computers have "Human Rights"?

Should sentient computers have "Human Rights"?
The thread title answers the question. Anything sentient should be acknowledged as having 'human' rights.
 
What about sentient animals?


EDIT: I think this thread needs a refresher on the definition of sentient. It is not exclusively human.
 
I think that computers deserve computer rights, like the right not to be shut off for no apparent reason, and not to be scrapped or otherwise destroyed.
 
A machine made by a human is just that, a machine. It is ours to do with as we see fit. A walking, talking, sentient robot, made by man, is still just one of our machines and I'd put it in a wood chipper without hesitation.
So how would a sentient machine that was made by us differ from a sentient human that's made by a man and a woman?
 
So how would a sentient machine that was made by us differ from a sentient human that's made by a man and a woman?

Only in that the two workers on the assembly line don't cuddle and smoke cigarettes after building a sentient robot.
 
Probably because the forum likes it when you explain why you said no ;)

I think it's a very bad idea to give human rights to something mechanically constructed by humans. If someone makes a sentient doorknob we'd have to require all institutions be accessible to it like someone in a wheelchair. How do you make a library accessible to a doorknob? There are countless other examples I could give, but, essentially, I think this one has an obvious answer.

Just had another thought: are you guys going to debate whether computer abortion should be legal, too? If it's sentient & has human rights, you can't start making it & then stop before it's finished if abortion is illegal. This whole subject is silly.
 
Without emotion they won't have a desire for life, liberty, etc., because they can't have a fear of losing these 'rights'.

So why should we give them to them? What good would it do?
 
Just had another thought: are you guys going to debate whether computer abortion should be legal, too? If it's sentient & has human rights, you can't start making it & then stop before it's finished if abortion is illegal.
You could stop assembly up until the point you put in the power supply. After that it can't be 'aborted'. But that wouldn't be our problem. The sentient computers would handle the building of sentient computers; it would be how they reproduce.
 
Now, if a living person adds computer hardware to their brain, which becomes integrated into their thought process (As the newly formed cells did in early life) then the organic portion of their brain dies off (As it slowly does during aging) how is this different from the natural progression of life? (Save that your mind will now outlive your body)

Sure, if you replaced one single neuron at a time with its functional equivalent, you'd preserve the relevant causal powers of each little bit of your brain. But that would be insanely difficult and expensive - so, I wager, your scenario is not how it will actually work. Instead, a powerful AI will be developed which works by radically different principles, and ways to make excellent simulations of individual personalities and memories will be developed on that platform. And people who "take advantage" of this "immortality" will actually, IMHO, be committing a bizarre sort of suicide.
 
Sure, if you replaced one single neuron at a time with its functional equivalent, you'd preserve the relevant causal powers of each little bit of your brain. But that would be insanely difficult and expensive - so, I wager, your scenario is not how it will actually work. Instead, a powerful AI will be developed which works by radically different principles, and ways to make excellent simulations of individual personalities and memories will be developed on that platform. And people who "take advantage" of this "immortality" will actually, IMHO, be committing a bizarre sort of suicide.
I'm not saying to build a machine which replaces individual neurons, I'm saying to make a machine that is able to interface with nerve cells as a whole. (And it's been proven in labs that neurons can learn to interact with electronic devices to which they are exposed)

Some day in the future, Mister Joe-Everyman-Smith heads down to a clinic and gets a 'cyber-brain' unit hooked up to his head. The CB is a piece of hardware which can function analogously to a mass of nervous tissue. While initially blank, the CB slowly becomes integrated into Mr. Smith's thought process. The brain extends its processing and storage abilities into the CB until the two are effectively one. (Just as it does when incorporating new tissue while the brain grows) In time, Mr. Smith's organic brain starts to die, but his mind is both the brain and the CB - two parts of a single greater mind.

When the organic brain finally shuts down, would you claim that Mr. Smith no longer exists, despite the fact that the CB was as much a part of his consciousness as the original organ, and continues to function?
 
So how would a sentient machine that was made by us differ from a sentient human that's made by a man and a woman?

Heh! A man and a woman cannot claim to be making a human when they procreate. If that were the comparison you're trying to make, it would be akin to a man throwing some silicon, copper, and some other ingredients into a box and expecting a walking, talking, sentient robot to emerge all on its own some time later.
 
I disagree; computers may well get many orders of magnitude more powerful, but it could be impossible to build a sentient AI (assuming this is what 'independent machine intelligence' means). We just don't know.

Are you suggesting there is something special about biologic entities that restricts sentience only to them?
 
Just had another thought: are you guys going to debate whether computer abortion should be legal, too? If it's sentient & has human rights, you can't start making it & then stop before it's finished if abortion is illegal. This whole subject is silly.
It would probably be similar to what the debate focuses on now. It's okay to abort the machine while you're building it, but it's not okay to abort the machine once it's sentient. We'd draw the line at functional sentience, instead of declaring that a clump of transistors deserves rights because it has the potential to be sentient.
Some day in the future, Mister Joe-Everyman-Smith heads down to a clinic and gets a 'cyber-brain' unit hooked up to his head. The CB is a piece of hardware which can function analogously to a mass of nervous tissue. While initially blank, the CB slowly becomes integrated into Mr. Smith's thought process. The brain extends its processing and storage abilities into the CB until the two are effectively one. (Just as it does when incorporating new tissue while the brain grows) In time, Mr. Smith's organic brain starts to die, but his mind is both the brain and the CB - two parts of a single greater mind.

This is not an unreasonable analogy. We've learned that introducing neurons into diseased brains causes those neurons to take on function and become part of the brain. As the older brain cells continue to die, the animal remains alive and retains some of its personality. This all happens without intentionally transferring neuronal function to the new cells, but by letting it occur 'naturally'.
 
Heh! A man and a woman cannot claim to be making a human when they procreate. If that were the comparison you're trying to make, it would be akin to a man throwing some silicon, copper, and some other ingredients into a box and expecting a walking, talking, sentient robot to emerge all on its own some time later.
So why is this distinction important when discussing whether it's okay to say, torture a sentient mind?

Consider this - a sufficiently advanced civilization could, I believe, make a human, not in the natural way, but by directly constructing it. Would this creation, despite being identical in result to a natural born human, and having the same experience of feelings and pain, not be deserving of any rights?
 
I disagree; computers may well get many orders of magnitude more powerful, but it could be impossible to build a sentient AI (assuming this is what 'independent machine intelligence' means). We just don't know.

Are you suggesting there is something special about biologic entities that restricts sentience only to them?
It's important to note the distinction between "sentient computer" and "sentient machine".

The latter is anything we could construct, and since I don't believe there is anything supernatural about sentience, I believe we could build sentient intelligent machines. We might loosely refer to these as "computers".

However, "computer" has a strict definition, namely, being a Turing machine, and doesn't simply mean any machine we can build. Dr Alimentado may be referring to this - if we consider a computer as being a machine which is limited only to performing a set of instructions on information, then it's not clear to me that any computer would be sentient merely by running these instructions fast enough, or by running the right program.

So it's possible to believe that computers will never be sentient, but that we could build sentient machines.
 
If they're sentient, their rights should be at least as good as human rights. If they're sentient, they'll have their own ideas about their rights.
 
If they're sentient, their rights should be at least as good as human rights. If they're sentient, they'll have their own ideas about their rights.

But they've been programmed to act sentient, it doesn't mean they are sentient. How is it even possible for a computer to be sentient? If I throw a rotten tomato at a supposedly sentient computer, and it curses at me and says what a horrible human being I am, wouldn't that just be what the robot/computer has been programmed to say? In which case it wouldn't really matter morally what I did to it.
 