Does morality work without a deity?

Secular laws are not moral systems.

But secular nations posit that they stand on, and derive legitimacy partially from, a moral framework of some sort. Some of this ends up codified in constitutions: "This is the moral framework which our country is built upon", etc. I am contrasting that with a theocratic country such as, say, Saudi Arabia, where the moral framework that sits underneath everything is religious in nature instead.
 
That's true, I suppose, eventually (laws would have to be changed, etc.), but if aliens landed and seemed intelligent, we wouldn't steal from them either, although it would take a while for new laws to be written to give Gods and/or aliens personhood. It would likely end up in the courts for many years.

So I guess I do disagree - initially Gods wouldn't have legal standing, because they are not legally persons. It would need to be written into the law - you'd need a precedent, etc.

Secular laws are not moral systems.

I think the moral system that informs most western laws would treat any unquestionably sentient being as a person, though - is that not the case?

EDIT: Warpus essentially made the point above, and in more detail.
 
I'm of the view that morality is self-evident, based on personal experience and the laws of nature. For example, taking a hammer and a chisel to my own skull can have "bad" consequences. By extension, it's not difficult to see that what would be a misfortune to me would also be a misfortune to others. If I were to, say, kill someone else and loot their valuables, that would be denying them the same right to existence which I grant to myself.
 
The latter. It has to do with the nature of utility summation. Let's say we invent a duplication machine that can produce exact replicas of persons. Compare the following two scenarios.

Scenario 1:
Bob comes to the lab and sleeps overnight.
We give Bob a nice steak.
Bob goes to sleep, and we duplicate him in his sleep.
Bob and his duplicate go on their merry way.

Scenario 2:
Bob comes to the lab and sleeps overnight; we duplicate him in his sleep, but we don't tell him.
We give Bob and his duplicate identical steaks - they have the exact same experience, and the same experience as in scenario 1.
Bob and his duplicate go to sleep.
Bob and his duplicate go on their merry way.

A naive utilitarian will claim that, because two steak-eating experiences exist, scenario 2 has twice the utility of scenario 1. My claim is that because these experiences are identical they only count once, and that scenarios 1 and 2 have identical goodness.

We can apply this principle to two similar but not identical experiences where the common components of the experiences only count once and the divergent components count twice.
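To make the counting rule concrete, here is a rough sketch of the two summation rules in Python - the utility numbers and experience descriptions are made up purely for illustration, not anything claimed in the post:

def naive_utility(experiences):
    # Naive utilitarian: every experience token adds its utility.
    return sum(value for _description, value in experiences)

def deduplicated_utility(experiences):
    # Proposed rule: identical experiences only count once.
    distinct = {}
    for description, value in experiences:
        distinct[description] = value  # identical descriptions collapse
    return sum(distinct.values())

# Scenario 1: Bob eats one steak.
scenario_1 = [("Bob eats a nice steak", 10)]

# Scenario 2: Bob and his duplicate have the exact same steak experience.
scenario_2 = [("Bob eats a nice steak", 10),
              ("Bob eats a nice steak", 10)]

print(naive_utility(scenario_1), naive_utility(scenario_2))                # 10 20
print(deduplicated_utility(scenario_1), deduplicated_utility(scenario_2))  # 10 10

The naive rule scores scenario 2 at twice scenario 1; the deduplicating rule scores them the same, which is the claim above. Extending this to similar-but-not-identical experiences would mean breaking each experience into components and deduplicating at the component level, per the principle in the previous paragraph.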

What if Bob is a vegetarian? Another thing is that you are creating another mouth to feed - while the second Bob may experience a limited goodness, there may be a greater sadness in the form of overpopulation :D I would say it would only be a good situation if all Bobs could reach their maximum potential.
 
I'm of the view that morality is self-evident, based on personal experience and the laws of nature. For example, taking a hammer and a chisel to my own skull can have "bad" consequences. By extension, it's not difficult to see that what would be a misfortune to me would also be a misfortune to others. If I were to, say, kill someone else and loot their valuables, that would be denying them the same right to existence which I grant to myself.

Are you saying that "bad" people never prosper?

Because that would take a big leap of faith, imo.
 
I think the moral system that informs most western laws would treat any unquestionably sentient being as a person, though - is that not the case?

I don't think they would. The scenario in which a robot gains sentience and fights for his/her rights as a person has played out in several science fiction stories, for instance. If such a thing were to happen in real life, that robot would have zero rights. The courts would have to assign those rights, as legal personhood is currently afforded only to Homo sapiens.
 
Absolutely, you're right in a legal sense - but if, while that sentient robot was arguing its case in court, you asked a bystander 'would it be wrong of me to hurt that robot?', they would almost certainly answer that it would be. In theory at least, we like to believe that our laws are founded on what is right, so we would regard the robot's lack of personhood as a temporary sign that the system hadn't caught up, rather than the right way for it to be.
 
Depends on who you talk to. I can think of one specific example off the top of my head - one poster here, who shall remain unnamed, who said that he wouldn't think twice about "killing" such an aware/intelligent/sentient robot.
 
Absolutely, you're right in a legal sense - but if, while that sentient robot was arguing its case in court, you asked a bystander 'would it be wrong of me to hurt that robot?', they would almost certainly answer that it would be.
I agree with your opinion, but I'm afraid it's wishful thinking. Apes and dolphins have been known to be sentient for decades and are still denied all rights and still not considered persons.
 
I'm not sure that's quite right, though - or, more accurately, I'm not sure we're using precisely the right terminology. I would argue that if somebody made a machine that was identical in all respects to me, practically everyone would consider it a human being, even though they knew it had actually been made in a factory rather than born. Most people think that there's something about apes and dolphins that makes them different, for which intelligence is usually (if problematically) put forward - but then most people also say that apes and dolphins are higher up the chain of moral consideration than cows and chickens.
 
What if Bob is a vegetarian? Another thing is that you are creating another mouth to feed - while the second Bob may experience a limited goodness, there may be a greater sadness in the form of overpopulation :D I would say it would only be a good situation if all Bobs could reach their maximum potential.
"Eating a steak" is an arbitrary pleasurable experience. It could be eating something else or watching a movie.

Also note that both scenarios create duplicates of Bob, so whatever issues there are in that regard should be the same.
 
I'm of the view that morality is self-evident, based on personal experience and the laws of nature. For example, taking a hammer and a chisel to my own skull can have "bad" consequences. By extension, it's not difficult to see that what would be a misfortune to me would also be a misfortune to others. If I were to, say, kill someone else and loot their valuables, that would be denying them the same right to existence which I grant to myself.

Based on the laws of nature? Nature is a very complex thing. Morality only comes into existence with the development of the mental element.
Self-evident? Everyone has a different degree of (self-)awareness, and everyone's perception of reality is different.


Morality seems to be something temporary, required for a certain stage of mental development. Before that stage there is no need for morality, and likely there is none after it either.
 
What Giftless is describing is the human power of empathy, which not all other creatures, intelligent or not, will necessarily possess. Empathy evolved for us due to various factors - it's not necessarily going to be something that an intelligent alien being has as well. And since that's where the majority of our morality comes from, I don't dare guess what the morality of an intelligent alien being might be. It's going to be impossible to predict.
 
Is human level mental nature something universal? If so, aliens may well be aware of it and communicate with us on that level.
 
Is human level mental nature something universal? If so, aliens may well be aware of it and communicate with us on that level.

I'm not sure what you mean exactly by "human level mental nature", but in the end, empathy is not universal, or at least doesn't seem to be. That's the important bit, I think. An alien species might very well have a morality based on something else entirely, and it might not be very compatible with what we consider to be "morally obvious".
 
Well, unless it's some type of hive mind, I cannot actually see how an intelligent alien species could become space-capable without empathy. Teamwork requires it, afaict.
 
Well, unless it's some type of hive mind, I cannot actually see how an intelligent alien species could become space-capable without empathy. Teamwork requires it, afaict.

I can't either, but I'm putting this under "I don't know", since we just don't know for sure whether empathy is a requirement for civilization or not.
 