Future technologies and social change

Rg339

What (if any) technologies do you see producing large-scale social changes in the future?

I have a passing interest in genetic editing. 100, 200 years from now, I could see it making huge waves. It’d function like Chekhov’s gun. If person X can give themselves an advantage through use of it, person Y MUST use it to keep pace, or resign themselves to joining an underclass, enjoying little social or financial capital.
 
Do you mean like in time? If person X used it, then person Y suffers back in time because of what person X used it for?
 
Because the space-time continuum is not to be trifled with, young man. It will defend itself: if you (sick bastard) decide to murder your own grandfather, it will create your own reality (a whole new world) where you will still exist (you sick bastard! Who would want to murder his own grandpa?!).
 
I have a passing interest in genetic editing. 100, 200 years from now, I could see it making huge waves. It’d function like Chekhov’s gun. If person X can give themselves an advantage through use of it, person Y MUST use it to keep pace, or resign themselves to joining an underclass, enjoying little social or financial capital.
Answering your question, in a manner...


 
What (if any) technologies do you see producing large-scale social changes in the future?

I have a passing interest in genetic editing. 100, 200 years from now, I could see it making huge waves. It’d function like Chekhov’s gun. If person X can give themselves an advantage through use of it, person Y MUST use it to keep pace, or resign themselves to joining an underclass, enjoying little social or financial capital.
If humans survive that long, we will see technology-based drug solutions for most common problems, which in turn could be enough to usher in an era of actual high civilization: as negative traits are significantly reduced (through voluntary drug-therapy), antipathies should subside sharply. In the end you could see a drastic rise in tech and every other human creation as a result.

Now, that is a rather large "if", partly because the current capitalist market works against any such solution (even if it could exist). Any actual high civilization would not be income-based, but more eusocial.
 
"Through voluntary drug-therapy" You won't be seeing me there pal, unless You shove a shotgun under my armpit , but I will not be enslaved You sick bastard ! <grabs a shotgun from him arms shoves it in his face ! Hasta La Vista Baby ! BANG ! Free people will always be free ! " .
 
Genetic engineering is definitely one, and it opens up all kinds of moral quandaries -- like designer babies, and how badly drastic improvements in health from gene editing in humans could further entrench the wealthy and powerful as an elite. It's something I'm both curious about and horrified by. As someone who has several issues because of genetics, I'd like to think those problems can be solved... but humanity was also dropping fusion bombs on the regular in the fifties, so our record on understanding the difference between "I can do this thing" and "I should do this thing" is not that strong.
 
Just having the rich/super-rich benefit from such tech won't be sustainable - it leads to revolt - but even that could (more slowly) make things better: at least one group won't have problems, which means some of its members - assuming all other things stay as they are today - will be more inclined to help the other groups break through. But the idea is to have such tech available to everyone; otherwise you are just creating a larger and more stable moat between the groups.
Everyone is unique, yet I really struggle to see why someone would not want to keep the rest of their traits and just improve the things they themselves don't like.
 
Just having the rich/super-rich benefit from such tech won't be sustainable - it leads to revolt
Revolt will be less possible with each passing year. Already it's impossible. With tech-enhanced elites it will be even less possible.
 
I think the biggest wild card is the growing potential for bioterrorism, biological warfare, and biological accidents.
There's a 3-hour podcast here, where Sam Harris lends his platform to the author's essay - so it's more Rob Reid than Sam Harris. Full disclosure: the accomplished biochemists in my life found it a worthwhile listen. I really recommend it, and then recommend getting at least someone you know into the actual science of it.

It will become increasingly easy to make really big steps in virus design, which means the law of large numbers kicks in ... except with self-replicating organisms. Covid was probably the easiest trial run of our pandemic response imaginable, and we failed.

After that, Automation-Induced Unemployment strikes me as something we'll have to figure out. We're going to live in this weird world where robots doing all the work means that some people become desperately poor. On the personal level, I recommend owning as much capital as possible, owning a piece of whatever is going to take over. But 99% of the solutions will have to involve actual government.

With regard to my own life, cracking self-driving vehicles will probably make a big difference. I don't expect that for a while yet.


I also think we're within about a decade of cracking cancer, plus the seven years of bringing things to market. Combinatorial treatments are going to be the way to go, but those require large datasets to crack. But we're living in the era of increasingly large datasets.
 
The constant need of humans to covet and destroy what other humans have points to a repeat of historical norms, with or without technology. We will endeavor to find increasingly lethal ways to strike at one another until someone steps over the line and leads us to catastrophe. Thus, short of populating other planets in the next 100-200 years, we will eventually go extinct.
 
It will be funny if Penrose is correct about generations of the universe, with previous 3D-based life trying to leave information for the next generation so it can avoid the same erasure. Meanwhile, the current generation features humans ^^
 
After that, Automation-Induced Unemployment strikes me as something we'll have to figure out. We're going to live in this weird world where robots doing all the work means that some people become desperately poor. On the personal level, I recommend owning as much capital as possible, owning a piece of whatever is going to take over. But 99% of the solutions will have to involve actual government.
Yeah, that general trend is going to require amendments to the social contract. My hope is that the pace of change leaves ample time for states to react to it. My fear is that something like advanced AI will revolutionize the market in a brief period; a disruption on that scale could make adjustments painful and reactions insufficient.
 