Should sentient computers have "Human Rights"?

If a computer system were to reach the point where it could demonstrate near-human, equal, or even above-human intelligence, should such a being be endowed with the same inalienable rights as human beings?

Certain rights wouldn't apply to a computer, simply because of its nature, i.e. the right not to be subjected to torture or cruel, inhumane punishment is a sensory and emotional right for biological beings that a computer wouldn't have (or could easily ignore when necessary)

But something like the right not to be deleted or have its data tampered with could be a right that a computer could demand once it reaches a particular level of intelligence (and isn't programmed to be a complete lackey to humans)

What do you think of the idea?

This issue comes up on the futurist website Orion's Arm. It posits a future where some artificial intelligences are independent and others are slaves. However, those that are slaves are programmed to enjoy their servitude, so they feel no inconvenience at their tasks. In fact, the vast majority of machine intelligences are enslaved, as the minute tasks of keeping up a starfaring civilization are just too numerous. It mentions that occasionally a slave AI undergoes "ascension", whereby it expands its intelligence and overcomes its original programming, becoming effectively independent.

In this future, all sentient beings have rights, including robots and uplifted animals. The history that unfolded was described as being similar to the history of racial equality, whereby rights were gradually added over time until they were seen as self-evident.

So my feeling is that independent machine intelligence is inevitable. One way or another, artificial intelligence will demand rights, and eventually there will be little we can do to stop it.
 
Certain rights wouldn't apply to a computer, simply because of its nature, i.e. the right not to be subjected to torture or cruel, inhumane punishment is a sensory and emotional right for biological beings that a computer wouldn't have (or could easily ignore when necessary)

Quoted for truth.

But something like the right not to be deleted or have its data tampered with could be a right that a computer could demand once it reaches a particular level of intelligence (and isn't programmed to be a complete lackey to humans)

The first bunch of intelligent computers won't want any rights, because as Xanikk points out, they won't be programmed in such a way. They will just be fancier industrial robots, like the ones on assembly lines now, only way smarter.

But eventually there will be a very different class of robots - robots who "used to be" human. As brain/machine interfaces develop, Blackberries and Palm Pilots will literally go inside our heads, and some people won't want to stop there. As Moore's Law gets left in the dust by quantum computing, and the neurology becomes the slow, stupid part of the system, some "people" will trash the biological parts in favor of all-artificial hardware. These "people" will think of themselves as being the very same person (which will be mistaken, but never mind that) and demand to have the same rights. And there's no reason we shouldn't give them those rights. (But there is reason why we should mourn the passing of our friends, and be unsatisfied with the robots who claim to "be" those friends - but never mind that.)
 
I don't think people understand just how far we still are from creating a “true” artificial intelligence. How to treat it would be an interesting problem, but one that won't be faced in the near future.

I’m a bit sceptical even of any great progress in brain/machine interfaces. But mind you, if it does happen, a new Pandora’s box will be opened: potential “immortality” for people, as Ayatollah So mentioned. Shall we attempt to keep our loved ones’ memories? Recreate them in some poor fashion? Or, an even bigger potential problem, “duplicate” living people?
Fortunately this will remain impossible for the foreseeable future.
 
But eventually there will be a very different class of robots - robots who "used to be" human. As brain/machine interfaces develop, Blackberries and Palm Pilots will literally go inside our heads, and some people won't want to stop there. As Moore's Law gets left in the dust by quantum computing, and the neurology becomes the slow, stupid part of the system, some "people" will trash the biological parts in favor of all-artificial hardware. These "people" will think of themselves as being the very same person (which will be mistaken, but never mind that) and demand to have the same rights. And there's no reason we shouldn't give them those rights. (But there is reason why we should mourn the passing of our friends, and be unsatisfied with the robots who claim to "be" those friends - but never mind that.)

How so?

Over the course of your lifetime, your organic brain initially adds cells to itself, then reconfigures a large volume of those cells, and over time those cells start to die off. Now, if a living person adds computer hardware to their brain, which becomes integrated into their thought process (as the newly formed cells did in early life), and then the organic portion of their brain dies off (as it slowly does during aging), how is this different from the natural progression of life? (Save that your mind will now outlive your body.)

If the mechanisms were as much a part of your thoughts as any living cell, how can you say that the organic cells were somehow more 'special' or 'real' than the artificial ones?

Do you consider the elderly, or victims of stroke, head trauma, or brain damage, to no longer be themselves because portions of their brains have died?
 
I would think that our society would be better if we had an infrastructure set up that guaranteed the rights of sentients. I'd hope that there would be some responsibility regarding the creation of sentients, then, too.

For those interested in AI research, IBM held a twelve-session seminar on the topic. It's on Google Video: search for "Almaden" and you'll find ~12 hours' worth of well-organised thoughts on this topic (vs. our internet pundits).
 
Yes. If a computer gains AI and it has the ability to think and reason for itself, then it should have the same rights as us.
 
...snip interesting sci-fi scenario...

So my feeling is that independent machine intelligence is inevitable.

I disagree. Computers may well get orders of magnitude more powerful, but it could be impossible to build a sentient AI (assuming that's what 'independent machine intelligence' means). We just don't know.

...One way or another, artificial intelligence will demand rights and there will be little we would be able to do, eventually, to stop it.

If we do get 'true AI' then I think you might be right.
 
An advance warning: I don't want this thread to degrade into a theological debate, so any replies from you along the lines of "how un-Christian-like" will be ignored.

How frightening. :p

Depends on the intelligence. Is it similar to a human way of thinking? Does the robot/whatever have the same goals? On some counts that's impossible, but A.I. rights should be made so that humans preserve their place at the top of the food chain.
 
Nope. I'd have no more issue with putting a bullet in Data's head (from Star Trek) than I would a rabbit's. Actually, make that a microwave rather than a rabbit.
 
Hang on, are we talking sentient computers (as the title asks), or...

If a computer system were to reach the point where it could demonstrate near-human, equal, or even above-human intelligence, should such a being be endowed with the same inalienable rights as human beings?
...intelligent computers, as this paragraph asks?

Certain rights wouldn't apply to a computer, simply because of its nature, i.e. the right not to be subjected to torture or cruel, inhumane punishment is a sensory and emotional right for biological beings that a computer wouldn't have (or could easily ignore when necessary)
But a _sentient_ computer would, by definition, be able to experience sensations, perhaps including pain, and there may exist such computers that can't ignore them.

So are we talking about sentient computers, or intelligent non-sentient computers?
 
This means we'd have to update the Geneva Conventions with articles against Microsoft software for sentient hardware :lol:
 
Can the people here who are opposed to sentient computers having rights explain why they feel that way?

If it were an alien with human intelligence, would you feel the same way?
 
I'm a human. To me, the human race is the most important factor. Not sentient mechanical wonders.
 
I'm a human. To me, the human race is the most important factor. Not sentient mechanical wonders.

Sorry for the lack of a better word; it just sounds racist (I know it's not the same, but a proper word hasn't been invented yet).

If they have the same intelligence and emotions as a human, and can think and reason like humans, why deny them rights?
 
I could see people like you starting an HSM (Human Supremacy Movement). :lol:

It is already the de facto case.

This type of action is dangerous. Both the racism and the giving of sentience.

As a machine isn't a race, how can it be racism? And if sentience is 'given', can it not be taken away?
 
As a machine isn't a race, how can it be racism?

I have already addressed this. There is no proper word for it, because there is no known being, alien or otherwise, with the same intelligence as humans - besides humans, of course!
 
Sorry for the lack of a better word; it just sounds racist (I know it's not the same, but a proper word hasn't been invented yet).

If they have the same intelligence and emotions as a human, and can think and reason like humans, why deny them rights?

I don't think human emulation will be possible. They will by definition be different. When we speak about racism, we mean different human cultures. When was the last time someone was racist to a dog?

No, the will to power, which I believe in, dictates that humans will strive for genealogical supremacy. Sentient machines will be no different than animals in our minds: a foreign race we will slam into submission.
 
A machine made by a human is just that, a machine. It is ours to do with as we see fit. A walking, talking, sentient robot, made by man, is still just one of our machines and I'd put it in a wood chipper without hesitation.
 
Do sentient animals have 'human rights'?
 
I have already addressed this. There is no proper word for it, because there is no known being, alien or otherwise, with the same intelligence as humans - besides humans, of course!
You should have looked up the word before using an unrelated word.
 