Is John human?

What do you make of John (with organic left foot)?


Well, what's a book? If I say Bob and I are reading the same book, does that imply we're reading the same physical object? In most contexts that implication is not made. Most of the time when someone says they're reading the same book, they're reading separate copies of the same book. There are all sorts of differences between the two books: Bob's book has a slightly different paper composition from mine, and mine has a crease on page 109, but that information is not relevant. What is relevant is the words on the page being the same (that's a simplification, of course, but the principle stands). The other stuff doesn't matter. My copy could even be the author's typewritten final draft. So long as the book conveys the same meaning, it's the same book.

Even if we can't copy a human brain exactly, down to the smallest detail, we probably can copy the meaning sufficiently well. The irrelevant parts only matter if we treat humans as objects rather than as carriers of meaning.
A human being is not a book. Two mass-produced books can be called the same; a clone or identical twins cannot.

Is it?

Transport versus copy reeks of dualism, as if there's a sort of inviolable soul tag we carry around with us.
It's called being alive. If I murder you but make a copy of you at the same time, I might be able to get away with it (if no one finds out), and the duplicate might think it's you, but I've still committed murder.

I have a sense of "me" that differentiates "myself" from "others". That sense is more like "the entity that produced this statement". It's an egoistic mistake to impose that "me"-ness as some fact of the universe.

When I talk about "myself" in the future, I am acknowledging that the thoughts associated with "me" will, by and large, be associated with an entity in the future. If multiple entities fit that bill, then I see no reason to deny any of them "me" status.
But they won't be you. You'll still be you. Unless you're trying to sell me that your consciousness will magically transfer to your 5 identical twins & you will be aware of all of them (and can go out with all of them to the movie theater & watch five movies simultaneously through 5 pairs of eyes).
 
It's not dualism. It's the recognition that a specific consciousness can only be in one place at a time, under specific conditions.
 
It's called being alive. If I murder you but make a copy of you at the same time, I might be able to get away with it (if no one finds out), and the duplicate might think it's you, but I've still committed murder.

If you stab someone in the heart, but (due to medical progress or whatever it may be) the person you stabbed is saved in extremis through a heart implant, have you committed murder or attempted murder?
 
If you stab someone in the heart, but (due to medical progress or whatever it may be) the person you stabbed is saved in extremis through a heart implant, have you committed murder or attempted murder?
That would be attempted murder; not sure how that's relevant to the hypothetical of the murder of Perfection & trying to cover it up with a robot that acts like him.
 
So let's rewind back in time to when John was completely organic. Let's say the first accident requires him to get a new cyber-brain. Is a cyber-brain with otherwise human components a robot?


I believe I originally got the idea from a similar type of story, though I think it might have been something with an axe.

edit: that and the hermes robot in that one Futurama episode

The last time I encountered this question was in Season 3, Episode 13 of Star Trek: Deep Space Nine.

They replaced half the guy's brain, and he was... off.
 
That would be attempted murder; not sure how that's relevant to the hypothetical of the murder of Perfection & trying to cover it up with a robot that acts like him.

No, but it's relevant to the alternate question: what if your attack left Perfection brain-dead but with an otherwise working body, and he then received a life-saving brain transplant that functioned as a perfect copy of his brain an instant before death?

Have you murdered Perfection, then, or have you attempted to murder him?
 
I see the confusion. But the heart or an arm is merely a part of a human being. You cannot have a brain transplant; the brain is you. You cannot "get" another brain. You are your brain. When your brain dies, you die.

Maybe scientists someday will be able to build a copy of a human brain that gets pretty damn close (though I highly doubt it), but it's not "you".
 
I see the confusion. But the heart or an arm is merely a part of a human being. You cannot have a brain transplant; the brain is you. You cannot "get" another brain. You are your brain. When your brain dies, you die.

Maybe scientists someday will be able to build a copy of a human brain that gets pretty damn close (though I highly doubt it), but it's not "you".

I'm not so sure I agree with the notion that I am my brain. To me there's an important nuance lost here: you're the sum total of what goes on within your brain (and between your brain and other body parts), not the organ itself.

In a hypothetical where a second brain could be set up to perfectly replicate every one of those interactions, and all the information within... to me that is, in fact, me. Another me, if the current me still exists; but me all the same. And if that me should be transplanted into my current body, replacing the old me after some tragic accident, then I would certainly say that the result is still me.

Of course, that hypothetical is massive. Soft sci-fi territory exclusively. But it appears to be the hypothetical on which this thread is based.
 
I see the confusion. But the heart or an arm is merely a part of a human being. You cannot have a brain transplant; the brain is you. You cannot "get" another brain. You are your brain. When your brain dies, you die.

Maybe scientists someday will be able to build a copy of a human brain that gets pretty damn close (though I highly doubt it), but it's not "you".

I'm not sure this is true, either. But from another direction:

What defines the limits of the brain? Is it the bone case it resides in? Or do we include the spinal cord as well? It's not clear to me why we shouldn't.

But in that case, why not include all the other nerve fibres throughout the body as well?

And since any computing system is nothing without input or output, we must surely include the eyes, the nose, the ears, the hands (and every nerve on the surface of the body), the gut, the tongue, and the larynx too.

And why not, even, extend it into the wider world? After all, what is the individual without the social relationships that surround them?
 
In a hypothetical where a second brain could be set up to perfectly replicate every one of those interactions, and all the information within... to me that is, in fact, me. Another me, if the current me still exists; but me all the same. And if that me should be transplanted into my current body, replacing the old me after some tragic accident, then I would certainly say that the result is still me.

I think that it might be a perfect replica of you, but it would still be a replica - it wouldn't be a continuation of your thoughts/consciousness/whatever. You'd just have two versions of you, now, with the original still stuck in the first brain.
 
I'm surprised nobody has given this answer yet:

Spoiler :
“No, John. You are the demons.” And then John was a zombie.

I suppose one question you might ask is who decides whether John is human? I think we have to assume that John is able to think of himself as human and do so in a human way (as opposed to, for example, being 'programmed' to see himself as human); otherwise, there's no point in continuing to inquire. But is that enough? What if people who interact with John perceive his mind to be identical to a human mind? Would that be enough? I think so, and I suppose that makes me a Functionalist.
 
...every one of those interactions, and all the information within... to me that is, in fact, me. Another me, if the current me still exists; but me all the same. And if that me should be transplanted into my current body, replacing the old me after some tragic accident, then I would certainly say that the result is still me.

It's a copy of you that's indistinguishable in every way. But still a copy. You are still in the original body.
 
It's not dualism. It's the recognition that a specific consciousness can only be in one place at a time, under specific conditions.
The problem is we have an archetype of a person's life: a person is born then lives for a certain amount of time then dies. We are very used to thinking that way, so used to that idea that we think it must be that way. Our idea about "me" is very strongly tied to that archetype. Really though that's not the way it must be. It's a matter of circumstance that it happens to be that way. As we change those circumstances our concept of "me" must also change.

A human being is not a book. Two mass-produced books can be called the same; a clone or identical twins cannot.
A clone or identical twin of course is different because the content of their minds is different.

When I talk about "myself" in the future, I am acknowledging that the thoughts associated with "me" will, by and large, be associated with an entity in the future. If multiple entities fit that bill, then I see no reason to deny any of them "me" status.
But they won't be you. You'll still be you. Unless you're trying to sell me that your consciousness will magically transfer to your 5 identical twins & you will be aware of all of them (and can go out with all of them to the movie theater & watch five movies simultaneously through 5 pairs of eyes).
I'll get back to you on this, but your example here is illustrative. I'll post later to disentangle what is going on.
 
I'm surprised nobody has given this answer yet:

Spoiler :
“No, John. You are the demons.” And then John was a zombie.

I suppose one question you might ask is who decides whether John is human? I think we have to assume that John is able to think of himself as human and do so in a human way (as opposed to, for example, being 'programmed' to see himself as human); otherwise, there's no point in continuing to inquire. But is that enough? What if people who interact with John perceive his mind to be identical to a human mind? Would that be enough? I think so, and I suppose that makes me a Functionalist.

So anything that passes the Turing Test would pass for human?

Or "if it quacks like a duck, moves like a duck and looks like a duck, it must be a duck"?

That seems a little premature, I think.
 
It's a copy of you that's indistinguishable in every way. But still a copy. You are still in the original body.

In this theory, I am NOT still in the original body. The original body now contains the replacement brain, because my original brain has ceased to function.

Obviously, if there are two Mes running around (and thus undergoing different experiences, which cause their identities to diverge), then one is me and the other is a copy; they are two distinct individuals.

If the first brain is no longer functional and replaced in the body by a second brain, though, there are no two copies running around.

Ultimately, I suspect and believe it would come down to identity, and thus be a matter of individual choice. Not a matter with a hard and fast answer.
 
So anything that passes the Turing Test would pass for human?

Depends on what you mean by the Turing Test. The actual historical Turing Test is simply an imitation game. I'm talking about something that is truly indistinguishable.

Borachio said:
Or "if it quacks like a duck, moves like a duck and looks like a duck, it must be a duck"?

That seems a little premature, I think.

We don't have any other means of determining whether it's a duck. I'd give it the benefit of the doubt, especially since John thinks he's human.
 
In this theory, I am NOT still in the original body. The original body now contains the replacement brain, because my original brain has ceased to function.

If your brain has ceased to function, then you are dead. Isn't that a fairly logical conclusion?

I mean, it doesn't matter how many clones of yourself you make - if your brain dies, you die. How could it be not so?
 
The big problem here is the expectation that "me-ness" is some objective thing, that there must be some singular coherent self from birth to death.

If I get duplicated, both of my duplicates will be "me" because they'll carry forward my mental properties. However, neither duplicate will describe the other as "me", because each has experiences the other does not; but both duplicates can aptly describe Perf before the duplication as "me".

In other words I become two "me"s each of which does not have a "me"-ness relationship with each other.
 
Then John is human.

Not necessarily, no.

I suppose one question you might ask is who decides whether John is human? I think we have to assume that John is able to think of himself as human and do so in a human way (as opposed to, for example, being 'programmed' to see himself as human); otherwise, there's no point in continuing to inquire. But is that enough? What if people who interact with John perceive his mind to be identical to a human mind? Would that be enough? I think so, and I suppose that makes me a Functionalist.

A robot can function as a human functions. It's a matter of definitions in this case, and that's true *regardless* of whether a theoretical consciousness is transferred or "merely" copied. I suspect in practice there would simply be new terminology for entities with consciousness and legal rights, especially given that people already have pushed similar laws for animals. John would be a thinking entity with full rights, and likely defined as a robot, cyborg, or something else entirely simply to conveniently distinguish his body type from that of completely organic people. The transferal of a unique consciousness in this instance is irrelevant.
 