
The AI Thread

So you are the creator of all the computer games you start to play?

Most people can create next to nothing of note. Making someone pregnant is a technical issue, available since before even language (let alone civilization) was a thing; for most animals one has to suspect it isn't even a conscious decision. Then again, if a baboon were actually able to create baboons, instead of triggering a ready-made routine which creates them, she'd be more of a creator than any human scientist with regard to the complexity of the creation.

Indeed. The creator of humans is either some deity (if you're religiously inclined) or nature through a very long process of evolution. It's not the parents.

We agree on special rights for all humans because they are human, the product of this creation that makes us all, and over which no human can claim dominion. No one manufactures humans! Producing an embryo in a lab from human material, which is possible, is just reproducing humans, carrying on that which already exists. And selecting or trying to manipulate genes is still tweaking that which already exists. As for the status of the outcome of this reproduction, even the patriarchal cultures of old that granted parents authority over their children provided for an end to that authority. Even ancient slavery didn't regard slaves as sub-human or non-human; it was just a juridical circumstance. Only the "scientific" era brought the twisted idea of associating slavery with physical traits and then seeking to justify it by promoting the pseudo-scientific idea of race, attempting to find a justification for intrinsically excluding part of humanity from universal rights.

The very idea of comparing human reproduction with the production of a tool, not seeing the distinction between the two... @El_Machinae, I'll be direct: you seem dangerous to me. And I hope you'll take the time to think through the idea and implications of whether or not humans should be regarded as unique.
The way I see it, your criticism of @Commodore that his view might justify slavery is exactly wrong: it is your view that could justify slavery. If humans are not deemed to have an intrinsic common quality (humanity) that is unique, the way is open to believe that some humans could indeed be "produced like tools". And tools are produced to be used; there is no way around that. Otherwise they wouldn't be tools. They might be works of art, but not tools.
Science fiction has toyed with blurring the lines between humans and "intelligent robots", something that does not exist and we don't know if ever will. And the Hollywood movies can have feel good happy endings with some "AI" becoming "human". It's fiction! It only works when people think about it just for a few hours. And it is not accidental that those writers of SF who thought the most about a future with "Artificial Intelligence" refused to play with considering AI human.

I've noticed that science* work tends to be a "safe space" for people somewhat lacking in empathy. They have a moral sense nevertheless - they usually navigate through work and life without causing major damage so long as the institutional setting provides the rules and the morals, and prevents them from going "mad scientist". The problem I've noticed with several researchers I've known is that they keep a blind spot when trying to reason about ethical issues: you can't reason empathy into happening. And ethics are built on that; they can't be derived from nothing. Which is why ethics committees are necessary, even though (by design!) they limit research.
* finance now recruits many of those from maths... which explains some things?
Edit: I'd better make it clear that what scares me is a trend I've noticed among the life sciences crowd to wish for a very permissive work environment. The subjects are dangerous enough that science there should be slow and careful.
 
The way I see it, your criticism of @Commodore that his view might justify slavery is exactly wrong: it is your view that could justify slavery. If humans are not deemed to have an intrinsic common quality (humanity) that is unique, the way is open to believe that some humans could indeed be "produced like tools". And tools are produced to be used; there is no way around that. Otherwise they wouldn't be tools. They might be works of art, but not tools.
Speaking about intrinsic quality of humanity.
  • Historically speaking, this quality never prevented people from enslaving others or from considering other people sub-human.
  • Nobody suggests granting AI this quality. We are talking about whether it would be moral to enslave sapient non-human beings. Is it dangerous to think sapient AI should not be enslaved?
  • Finally, a sapient non-human being doesn't have to be manufactured. If it's an intelligent extraterrestrial life form, does its non-humanity justify inhuman treatment?
 
It's a bit reversed, humans have an intrinsic quality that makes it immoral to make them into tools. Even if you can engineer a person, you still don't have the right to make them into a tool.

Just because you can do something, doesn't mean you may do something.

But that intrinsic quality is a function of sentience. The reason I'm allowed to toss the placenta, even though it is alive and human, but not the twin, is because the twin is sentient.

But yes, I don't insist that human sentience is specifically unique. Except in so far that it seems to be in the natural world. But we are imagining a future where that is no longer true.

We are at a serious barrier to communication, because I am describing a spectrum of sentience, along which we get increasing moral obligations. Humans are obviously within the upper tier of those obligations, and cross a threshold that disallows slavery.

Your view is that only humans are worthy of certain types of moral consideration. Does this mean that you struggle while watching TV? When Star Trek creates a federation where Romulans are prevented from enslaving Vulcans, do you lose the plot and say "why does it matter? Only humans are uniquely entitled to not be enslaved!"

No, you don't. It's because you don't actually think that humans are metaphysically morally unique, they just are practically so, so far.

The initial premise, that it's morally permissible to enslave sapient AI, is recognized as currently being science fiction. And we don't know what the future will bring on that front. The same goes for increasing animal intelligence through genetic tweaking, which, as far as I can tell, Commodore would also permit enslavement of.

People can insist in bold caps that no one manufactures humans, and that it's impossible to do so, and then it just seems like they're arguing about the difference between growing and making. It's a distinction that doesn't really matter with regards to whether slavery exists. The difference between growing and making is just a heuristic. It's human activity that would set the process in motion, with an eye towards an intended outcome.

But you're right that science should be cautious. We could create sapient AI, and then people like you and Commodore would insist that it's okay to enslave them, preferred even. Commodore would even use a gun to defend that "right". And then we could create hundreds of billions of slaves before we create legal tools to prevent the abuse.
 
We are at a serious barrier to communication, because I am describing a spectrum of sentience, along which we get increasing moral obligations. Humans are obviously within the upper tier of those obligations, and cross a threshold that disallows slavery.

Your view is that only humans are worthy of certain types of moral consideration. Does this mean that you struggle while watching TV? When Star Trek creates a federation where Romulans are prevented from enslaving Vulcans, do you lose the plot and say "why does it matter? Only humans are uniquely entitled to not be enslaved!"

No, you don't. It's because you don't actually think that humans are metaphysically morally unique, they just are practically so, so far.

I do struggle with most science fiction; I ended up accepting it as space opera, because the alien species there are usually anthropomorphized. And that creative option in fact betrays the deeply held, perhaps innate, human tendency to see our kind as unique, and every other as it relates to us. For viewers to relate to other species in SF, those species must be presented with traits similar to humans! The Vulcans and Romulans are examples of such. When aliens are presented as, say, intelligent insectoids or something else inhuman, they're cannon fodder. Intelligent, hah! They're assumed not to be, which clears away the difficulty of dealing with a hard subject in light SF.

Humans can have moral beliefs (or culturally ingrained opinions, if you will) about other species. But we have never dealt with those species the same way we dealt with humans (or, in fiction, human-like species). The reaction is instead one of paternalism (and implicit domination), such as over protected wildlife; or submission, such as ideas of divinity; or assimilation (they're humans dressed up as something else for narrative purposes). Even the old mythologies and tales of fantastic beings, which have inspired modern fantasy, settle their creatures into one of these three categories: human-like, gods, or evil to be ruled over.

How we would deal with a different, clearly non-human, intelligent species is a very interesting question, but one I think we'll only find out about when and if it happens! I'm not optimistic as to peaceful coexistence. Reason will probably not be the thing deciding the outcome. Consider why Kafka's Metamorphosis drew so much attention: it's a story of the human and the alien. The human is only human when it comes with the whole package; otherwise it's something alien, to be dealt with at arm's length...

But you're right that science should be cautious. We could create sapient AI, and then people like you and Commodore would insist that it's okay to enslave them, preferred even. Commodore would even use a gun to defend that "right". And then we could create hundreds of billions of slaves before we create legal tools to prevent the abuse.

I think we'll be spared the problem with AI, for the simple reason that AI in the sense we're talking about now remains out of reach, science fiction, and I see it remaining there for the foreseeable future. Again, most science fiction takes the space opera path of assimilating sapient AI to humans (complete with humanoid robots sometimes), which makes no sense. We think as we think because of our biology: it's not just the mind, it's the body and its systems and needs. A "sapient AI" in some kind of vat would be alien and would "think" very differently. Unless the body was also faithfully copied, in which case you would indeed have "manufactured humans"... but we are already equipped with a natural process to do that!

Given the sheer alienness of a non-human sapient AI, the human reaction to it is as impossible to predict as the characteristics of this hypothetical AI. As is the reaction of such an AI to humans. I see stories of cute robots in love with humans, fighting for their rights to be human, as absurd romantic drivel. It's not even worth the time to discuss philosophically, as those stories are not setting up a scenario of "human and AI" but rather "human and discriminated human who happens to magically be a robot".
 
I try not to pull from science fiction when it comes to the morality around sentience, but it helps discern where our biases lie. We have enough of a spectrum through both diversity on Earth as well as during human development. And all we have is a guide-post that the majority of people intrinsically believe - that humans are worthy of moral consideration at a level unique to humans.

I'm happy to take that as given. I think there are a bunch of post-hoc reasonings about why people think it should be true, but it's impossible to remove the self-serving bias of every philosopher crafting arguments about why they (themselves) "deserve" rights. Sapience springs out as a front-runner, though, because sapience means that the entity values its rights.

Morality is the story of interaction between two sentient organisms: each organism has a stake in the outcome of the interaction, each organism has a preference for the desired outcome. It could be between me and a mouse or, heck, it could be between me and future-me. We tend not to classify self-interaction as a type of morality, and it's easy not to, and there's usually no point.

So, if there's a carving of a mouse and I break off a leg, the morality of the action only matters insofar as the carving has an owner (or someone the destruction will affect). If there's a real mouse, then the breaking of the leg affects everyone involved in the story - which, by default, includes the mouse.

Now, it's obvious that we don't truly care about animal suffering. Oh, we have biases. If I were to take your dog and skin it alive in front of you, you'd be angry at me in ways that are just not equivalent to me keying your car. But only a rare person refrains from buying meat produced industrially. "The cow wouldn't exist, except for the fact that we grew it on purpose for our benefit" the argument will boil down to. And we can even quantify the suffering we implicitly endorse, where the person who buys 30 chickens a year is causing twice as much individual mistreatment as the person who buys 15.

But we do know something about the morality of sentience: somewhere along the way from being an embryo to being a child, there is a sufficient change in sentience that creates intrinsic rights. And those rights are 100% about sentience levels. And eventually, about sapience. And, with sapience, there's a part of our story where self-interaction matters. We don't allow present-you to institutionally enslave future-you for present-you's pleasure. It doesn't matter what contract I sign or what promises I make; there are certain thresholds of enforcement that will just not be legally upheld. I could promise to sell a kidney, but we wouldn't force its removal if I changed my mind. I could promise to do a sex scene in a porn film, but it wouldn't be forced on me. I could promise to till a field, but we'd not allow whippings to force me to finish. We'll cattle-prod a cow into a slaughterhouse, though, because we 'raised it for that purpose' (its wishes don't matter).

There are thresholds for "enslavement" we tolerate, based on location and society. For the temporary pleasure of riding in an unowned Ferrari, for example, the United States will happily lock you in a small room unless you perform slave labour in a for-profit prison. So, present-you can contractually lock in future-you into enslavement in the Land-of-Arpaio. Even that is getting pushback these days. It's just that the culture that once fought for slavery is slow when it comes to understanding slavery being 'wrong'.
 
^The cow isn't OK to kill because "we raised it for a purpose", but because food is actually needed for humans. If one didn't have other means of getting food, they'd hunt an animal to eat regardless of whether the animal was wild or bred to become food one day.

If one day aliens came, and presented blueprints of how they created humans as machines able to breed, it would certainly present humans as something insignificant next to their hypothetical creators. Then again this won't help your argument much, El Machinae, given humans are also bio-material, showing that those alien masterminds had already decided not to bother with pure AI :p
 
I'm super-suspicious that you only eat as much meat as is required to survive. BUT, I didn't say we kill a cow because we raised it for a purpose (cows aren't sapient and don't value their own existence, merely their experience); I showed how we mistreat a cow because it was created for our purpose. It's the purpose that matters. If I were to treat a dog the way we pay people to treat cows, but merely for the pleasure of the yelping sounds, people would object. It's not the sentience that changes how you value them, but the benefit they provide you. Means to an end. Tools or enslavement, depending on how you view such things. But permissible because of your wants and not because of underlying concern. The point of that example was that there are areas of the sentience spectrum that we don't morally value as much as we value humans, which have crossed into different thresholds.

It's weird how many dualists are here, where humans have some vis essentialis.

Forging a hammer uses the same thermodynamic principles as a seed growing. Mixing flour and yeast and then popping it in the oven uses the same thermodynamics as fusing a sperm and egg and popping it into a womb. "Oh, you didn't invent the recipe and you didn't grow the flour" is meaningless nit-picking of the statement "I'm making a cake". You're pretending that 'intentionally making' and 'intentionally growing' are metaphysically different (because dualism), and it leads to the conclusion that enslavement of sapient entities is morally permissible.
 
I'm super-suspicious that you only eat as much meat as is required to survive. BUT, I didn't say we kill a cow because we raised it for a purpose (cows aren't sapient and don't value their own existence, merely their experience); I showed how we mistreat a cow because it was created for our purpose. It's the purpose that matters. If I were to treat a dog the way we pay people to treat cows, but merely for the pleasure of the yelping sounds, people would object. It's not the sentience that changes how you value them, but the benefit they provide you. Means to an end. Tools or enslavement, depending on how you view such things. But permissible because of your wants and not because of underlying concern. The point of that example was that there are areas of the sentience spectrum that we don't morally value as much as we value humans, which have crossed into different thresholds.

It's weird how many dualists are here, where humans have some vis essentialis.

Forging a hammer uses the same thermodynamic principles as a seed growing. Mixing flour and yeast and then popping it in the oven uses the same thermodynamics as fusing a sperm and egg and popping it into a womb. "Oh, you didn't invent the recipe and you didn't grow the flour" is meaningless nit-picking of the statement "I'm making a cake". You're pretending that 'intentionally making' and 'intentionally growing' are metaphysically different (because dualism), and it leads to the conclusion that enslavement of sapient entities is morally permissible.

Much like the cake you bake, pure AI isn't sapient.
 
Much like the cake you bake, pure AI isn't sapient.

Hey, would you stuff it on that? The initial premise under discussion was explicitly regarding sapient AI. You can present your Vitalism views when the thread migrates to that, but please don't quote me when you do.

Honestly, you cannot coherently engage in discussion regarding the morality of enslavement while rejecting the premise of the discussion.

"fossilized previous thought" is right, since I've already said this to you.
 
Hey, would you stuff it on that? The initial premise under discussion was explicitly regarding sapient AI. You can present your Vitalism views when the thread migrates to that, but please don't quote me when you do.

Honestly, you cannot coherently engage in discussion regarding the morality of enslavement while rejecting the premise of the discussion.

"fossilized previous thought" is right, since I've already said this to you.

It is true that there is diminished ethical concern when the being abused isn't human. That is only natural, given that if we go the route of not accepting full rights for humans, it won't end well for us. Furthermore, there is no other species on this planet which can realistically be compared to our own. Who knows, maybe whales do mental stuff of importance, but they are a closed system and ultimately don't matter. Maybe even ant colonies do mental stuff of importance when viewed as a unit, but that too is currently not observable. In either case, those hypothetical abilities would be due to the DNA basis of those beings, not some pure AI.

Maybe in a future reincarnation - if one is lucky enough for such a thing to even exist - I will be breeding half-sentient hands, which compute formal logic-pure math hybrid problems. I will still identify them as part of an experiment, with no rights granted.
 
The rights granted to humans are also judged along the sentience scale and the additional sapience scale. But yeah, it's natural for us to downplay the rights owed to other organisms. We're evolved tribal predators hardwired to seek carnal pleasures. We're continually tempted to forgo robust morality standards. It just means that we have to pray that we're never treated based on how we treated others.

Again, please don't quote me when you want to discuss your vitalism views of sapience. You might be confused as to what I mean, but it's your "it's the DNA" tangent. Obviously, you're free to discuss it, but please don't quote or tag me.
 
Again, please don't quote me when you want to discuss your vitalism views of sapience. You might be confused as to what I mean, but it's your "it's the DNA" tangent. Obviously, you're free to discuss it, but please don't quote or tag me.

Don't be such an evolved tribal predator about it.
 
But why? You believe living organisms possess some spiritual essence which is a source of intelligence?
Like, machine cannot be sapient because it doesn't have soul or something like that?

In materialistic worldview, if we believe human intelligence is a result of evolution and based on physical properties of our brain, there are no natural laws which prevent replicating it in non-biological form.
 
But why? You believe living organisms possess some spiritual essence which is a source of intelligence?
Like, machine cannot be sapient because it doesn't have soul or something like that?
No, DNA is not magical, nor something theoretical like a "soul".

In materialistic worldview, if we believe human intelligence is a result of evolution and based on physical properties of our brain, there are no natural laws which prevent replicating it in non-biological form.

Certainly if one can replicate how DNA works, it can work. Not sure how this will happen through coding. I mean, you don't try to use coding to replicate the computer's electric supply either, do you?
 
Certainly if one can replicate how DNA works, it can work. Not sure how this will happen through coding.
DNA is a code.

I mean, you don't try to use coding to replicate the computer's electric supply either, do you?
It doesn't have to be coding. Neurons can be replicated in transistor-based physical devices, for example.
Besides, the behaviour of modern neural-network-based algorithms isn't hand-coded either.
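To make the "not hand-coded" point concrete, here is a minimal sketch in plain Python (no particular library assumed): a single artificial neuron whose AND-gate behaviour is learned from examples via the classic perceptron update rule, rather than written out as an explicit rule.

```python
# A single artificial neuron trained with the perceptron rule.
# Nowhere below is "x1 and x2" written as a rule; the behaviour
# emerges from repeated weight adjustments against examples.
import random

random.seed(0)

def step(x):
    # threshold activation: fire (1) if the weighted sum is non-negative
    return 1 if x >= 0 else 0

# training examples: (inputs, desired output) for logical AND
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

# weights and bias start out random, i.e. the neuron initially "knows" nothing
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = random.uniform(-1, 1)
lr = 0.1  # learning rate

for _ in range(100):  # repeatedly nudge weights toward correct answers
    for (x1, x2), target in data:
        out = step(w[0] * x1 + w[1] * x2 + b)
        err = target - out
        w[0] += lr * err * x1
        w[1] += lr * err * x2
        b += lr * err

for (x1, x2), target in data:
    print((x1, x2), "->", step(w[0] * x1 + w[1] * x2 + b))
```

Real networks stack millions of such units and use smoother update rules, but the principle is the same: the programmer writes the learning procedure, not the learned behaviour.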
 
DNA is a code.


It doesn't have to be coding. Neurons can be replicated in transistor-based physical devices, for example.
Besides, the behaviour of modern neural-network-based algorithms isn't hand-coded either.

Yes, good luck trying to replicate DNA - even if it becomes possible at some future time, doesn't it defeat the purpose of not using DNA?

It reminds me a bit of the old joke about god and his sand. Also, let's imagine that a time traveler went to prehistory and dropped a battery there. The prehistoric people could try to replicate the battery at some point (after they developed some way of examining what it is), but I feel your position is that they should instead wish to create something which acts like a battery but isn't one.
 
Yes, good luck trying to replicate DNA - even if it becomes possible at some future time, doesn't it defeat the purpose of not using DNA?
You probably mean something else, because there is no problem with replicating DNA now, with current technology. In vitro, in vivo, or in artificial form.
I mean, it doesn't have much to do with intelligence. Even bacteria have it.
 
You probably mean something else, because there is no problem with replicating DNA now, with current technology. In vitro, in vivo, or in artificial form.
I mean, it doesn't have much to do with intelligence. Even bacteria have it.

Well yes, I clearly mean understanding how it works all the way down, not just being able to copy the whole thing. Much like anyone can cut pieces of plastic without having to know chemistry.
 