Ultimate Goal of Humanity

I have often wondered if there is some ultimate goal we humans should all be working towards. It seems like some people support transhumanism, while others support colonizing space, others support simple living, and still others feel the most important thing is to spread their religion. So it seems as though either humans were not given a destiny, or that destiny is not obvious. However, I think it would make sense to work towards a goal where people are as happy as possible. But maybe that can't be engineered and instead comes through everybody doing what they want in a way they feel benefits society.

TL;DR: I don't think people have an inherent destiny, but that we get to create our own. What should we create?
 
Simple five step plan:

  • Build a Dyson shell (not a sphere) around our derpy system.
  • Further obfuscate our presence by moving to intergalactic space, creating a fold in space or whatever.
  • Increase security by creating our own separate universe and moving there.
  • For the same reason: Eliminate this one.
  • Live happily ever after.

Which brings us back to the question of why nobody else has already done this.

Yes, yes, a tiny bit of research would be necessary. But let's not count peas.
 

I could see this repeating into "smaller" and "smaller" subuniverses as humanity evolved into different species and decided to secure themselves from each other.
 
However, I think it would make sense to work towards a goal where people are as happy as possible. But maybe that can't be engineered

Here we founder over the word "happy", because in one sense, it can be engineered all too simply: by injecting dopamine into the brain. So at the very least, we need to elaborate what "happiness" means enough to explain why that doesn't count as "happy" in the intended sense.
 
We already have the universe to ourselves. We could overthrow all powers that be and live in peace and harmony.
 
Get to the next level.
 
What is a level? I am not convinced we got ourselves to any level.
 
The next level is heaven.
 
Godhood.

I.e. achieving such a level of technology that it makes us effectively immortal and all-powerful. Creating parallel Universes and all that jazz will follow. In the meantime, we should try to expand a bit around this Galaxy to avoid stupid accidents that could lead to our premature end.

(However, I am now about 50% convinced humanity will destroy itself before it ever leaves this planet in numbers big enough to ensure the survival of Terra-genes. Another statistic for the Cosmic Filter.)
 
Goal of humanity? Finding values and sticking to them.
 
Well, the first step is proving that we're smarter than a bunch of reindeer with tons of lichen to eat (for now) and no natural predators.

If we can't self-regulate here and now, there is no point in talking about an ultimate goal.
 
How is transhumanism evil?

How is it not?

It doesn't care about consequences; not only that, it denies that there are consequences.

Also, there is no ultimate goal for humanity, only the various goals of people.

Knowing is ultimate. You are born with it. You do it all your life. It drives you through life. Every human being shares this drive. It is in the very definition of conscious life.
 
How is it not?

It doesn't care about consequences; not only that, it denies that there are consequences.

I may agree with you that the idea of transhumanism can be evil; however, I humbly disagree with your argument here. A friend of mine gave a presentation on transhumanism and was also one of its activists back in the US. He did not deny that technology has bad consequences; he claimed that transhumanism recognizes them and tries to avoid them on the basis of its ethics.

What I don't agree with is the idea that we must make transhumanism the ultimate goal of humanity, because it can be put to the wrong purposes.

I remember them arguing about how many people die each day and how progress toward transhumanism could save humanity from dying. That can be turned into a utilitarian argument: for example, being lenient with medical experimentation for the sake of progress toward transhumanism, or accusing any restriction on medical experiments of holding humanity back from transhumanity, or of being conservative fear of advancement and change.

This notion of "transhumanism" can be used to legitimize nasty things that benefit a particular institution or research program by claiming it is all for the sake of reaching "transhumanism", since reaching transhumanism is supposedly for the betterment of all humanity.
 
I agree that there is no goal to humanity, but ending poverty is a noble pursuit. If we could make it so that everyone has minimal guarantees of food, water, shelter, and safety, the world would be a much better place.

Do that first, then worry about making a Dyson shell.
 