Automation, Robotics, and AI - The New Job Market

If there were not suffering available for them to alleviate, how would they maintain their self-image as philanthropists?

I have no doubt that most would deny it, by the way, but in most cases I think their denials would fail under even cursory examination.

Oh sure, I don't think most would want the world to be quickly transformed into a post-scarcity utopia - that would cost them their social status. But I don't think most of them consider the suffering of the poor to be desirable in itself, or something that should be increased.
 
A major issue with getting a guaranteed income is that, unlike many other taxes, the taxes are not being exchanged for a product or service by the government. It's pure welfare, intended to maintain social order.

But if you look at the lion's share of taxes, it's ostensibly being used to buy something that directly or indirectly benefits the person paying them. A good school gets me future customers. The police protect me from being mugged. Social Security is given back to me directly. Etc., etc.

Only a tiny portion goes toward 'so unproductive people needn't work', and that part is highly resented. Taxes will be tolerated, but they'll need to buy something the taxpayer cares about.
 
A major issue with getting a guaranteed income is that, unlike many other taxes, the taxes are not being exchanged for a product or service by the government. It's pure welfare, intended to maintain social order.

But if you look at the lion's share of taxes, it's ostensibly being used to buy something that directly or indirectly benefits the person paying them. A good school gets me future customers. The police protect me from being mugged. Social Security is given back to me directly. Etc., etc.

Only a tiny portion goes toward 'so unproductive people needn't work', and that part is highly resented. Taxes will be tolerated, but they'll need to buy something the taxpayer cares about.

If you don't want to get mugged, paying off the potential muggers directly is much more efficient.
 
A major issue with getting a guaranteed income is that, unlike many other taxes, the taxes are not being exchanged for a product or service by the government. It's pure welfare, intended to maintain social order.

But if you look at the lion's share of taxes, it's ostensibly being used to buy something that directly or indirectly benefits the person paying them. A good school gets me future customers. The police protect me from being mugged. Social Security is given back to me directly. Etc., etc.

Only a tiny portion goes toward 'so unproductive people needn't work', and that part is highly resented. Taxes will be tolerated, but they'll need to buy something the taxpayer cares about.
Definitely. Add to that the price of any such scheme: if, say, the bottom 100 million in the US were given an average of $10,000 per person and everyone else got nothing, it would cost $1 trillion, and everyone in the top two-thirds would howl loudly. If everyone were given $10,000 (totaling about $3 trillion) and taxes were raised sharply to compensate, people above the break-even point - those paying more in new taxes than they receive in subsidy - would still howl loudly. And the money couldn't simply be printed without causing high inflation, which would make much higher taxes necessary to drain most of it back out of the system.

Given that even a measly few tens of billions in food stamps causes lots of resentment, there's no way I could see a "pay people enough to get by without working" scheme ever being implemented even though there are very good structural reasons for such a program. It would be far too hard to get political will for such a thing from people who still have jobs or other sources of revenue.
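The back-of-envelope cost figures above can be checked with a quick sketch. The population and payment numbers are the post's own rough assumptions (100 million recipients, ~300 million total US population, $10,000 per person), not official data:

```python
# Back-of-envelope check of the guaranteed-income cost figures above.
# All inputs are illustrative assumptions from the discussion, not official statistics.

bottom_recipients = 100_000_000   # bottom 100 million people, targeted scheme
everyone = 300_000_000            # rough US population, universal scheme
payment = 10_000                  # average annual payment per person, in dollars

targeted_cost = bottom_recipients * payment    # cost of paying only the bottom 100M
universal_cost = everyone * payment            # cost of paying everyone

print(f"Targeted scheme:  ${targeted_cost / 1e12:.1f} trillion")   # $1.0 trillion
print(f"Universal scheme: ${universal_cost / 1e12:.1f} trillion")  # $3.0 trillion
```

Either way the annual bill lands in the trillions, which is the scale driving the political-will objection in the posts above.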
 
Sounds like hype.

99% of people who predict the future are 99% wrong 99% of the time. That goes for myself as well.

Computers are still pretty dumb at anything that doesn't involve calculation. If a robot can replace you at your job, you need to up your game. Robots excel at the very types of jobs humans hate anyway (repetitive, soul-crushing jobs).

Fear mongering that's not environmental annoys me because it's all speculation, unlike the damage we're doing to Gaia, which is clear & present right now.

Humans clearly cannot change our trajectory. I welcome AI because we're gonna need higher intelligence than our own to survive the next few hundred years.

Robotic checkout girls are the least of our worries.
 
 
I think the worry of automation specifically taking over every job is maybe a little over-hyped, at least in the short/medium term, but they don't have to take over *every* job for more of us to lose ours. Blue-collar (and some white-collar) jobs being lost to robots, software programs, or other automation means lower purchasing power for others, which reduces demand for those of us who work in the service economy, or in corporate America. A robot doesn't need to take the job of the barber if nobody can afford to get their hair cut anymore.

The concerns about rampant income inequality, which I think have to do with things beyond automation reducing employment, are very real and very justified, imo.
 
The concerns about rampant income inequality, which I think have to do with things beyond automation reducing employment, are very real and very justified, imo.

It'll end the same way that it always has before, no worries.
 
The biggest difference I see in this trend towards automation is that there are fewer and fewer low-skilled jobs for those being replaced to flee to. Then again, when people are in need they will turn to the government. In the case of the Great Depression, it only took a 25% unemployment rate for the government to intervene on a massive scale.


And I believe that people who predict that full automation = the end of humanity's days of labor aren't aware of how people work.
 
And I believe that people who predict that full automation = the end of humanity's days of labor aren't aware of how people work.

Not saying it would be the end of humanity's days of labor, it would just be the end of humanity's days of compulsory labor. People would be free to work if they want to, but no one would have to work to meet their basic survival needs.

Eventually we will reach the point where even high-level human functions, such as those of corporate executives and high-level government officials, can be placed in the hands of an AI. At that point, there will no longer be an elite ruling class to exploit the masses; everyone will just be taken care of by the various machines we have built to run and maintain our civilization.
 
We still need a mechanism by which the people who own the food will give the food to people who cannot provide any service of value.
 
Food riots?
 
So, let's put that way down on the list of ways to solve this potential issue. I mean, it's a solution, but I'm not sure it's the most desirable or creative one. I kinda hope that by chatting about it with a few years' lead-time, we can start getting good ideas into the mainstream in time.
 
Eventually we will reach the point where even high-level human functions, such as those of corporate executives and high-level government officials, can be placed in the hands of an AI. At that point, there will no longer be an elite ruling class to exploit the masses; everyone will just be taken care of by the various machines we have built to run and maintain our civilization.
That's been a dream for decades, but it's far from certain it will ever become a reality.

Machines will always need human supervision.
 
So, let's put that way down on the list of ways to solve this potential issue. I mean, it's a solution, but I'm not sure it's the most desirable or creative one. I kinda hope that by chatting about it with a few years' lead-time, we can start getting good ideas into the mainstream in time.
We've been chatting about it for thousands of years, still no solution in sight.
 
Machines will always need human supervision.

You cannot say this with absolute certainty. There are plenty of things we have and are capable of doing now that people in the past said would never be a possibility. With each generation, machine intelligence is getting smarter and smarter with no indication that trend will cease or even slow down (barring some cataclysmic event that abruptly ends human civilization before we can create sapient machines). So it seems to me, the question of machines attaining sapience is a matter of 'when' instead of 'if'.
 
You can't really say machines are "smart" right now. They can calculate, but a mouse in a cage is far smarter than any supercomputer or network of supercomputers: it has more capacity to solve problems, protect itself, & adapt, & best of all it doesn't need constant prompts, programs, & commands to do so. It can also teach its mates & offspring in subtle ways, whereas computers cannot.

The day machines don't need human supervision, they cease to be machines & could probably be said to have a life of their own. To achieve that status they'd need to be given a will to live, which is its own can of worms (if we can create machines with a will, then by definition we can no longer control them; who's to say they'll maintain their current role as our slaves?).

Regardless, I wouldn't hold my breath. Humanity has thousands of more pressing issues than gray, metallic immigrants stealing our jobs, & AI is still a joke (it's chugging along, but still decades off from any glimmer of real intelligence, let alone sentience).
 
You can't really say machines are "smart" right now. They can calculate, but a mouse in a cage is far smarter than any supercomputer or network of supercomputers: it has more capacity to solve problems, protect itself, & adapt, & best of all it doesn't need constant prompts, programs, & commands to do so. It can also teach its mates & offspring in subtle ways, whereas computers cannot.

The day machines don't need human supervision, they cease to be machines & could probably be said to have a life of their own. To achieve that status they'd need to be given a will to live, which is its own can of worms (if we can create machines with a will, then by definition we can no longer control them; who's to say they'll maintain their current role as our slaves?).

Regardless, I wouldn't hold my breath. Humanity has thousands of more pressing issues than gray, metallic immigrants stealing our jobs, & AI is still a joke (it's chugging along, but still decades off from any glimmer of real intelligence, let alone sentience).

I never said machines were smart right now, I said they are getting smarter with each generation. Each generation of machines also requires less and less human supervision to function. If that trend continues, it only stands to reason that the level of human supervision required would eventually drop to zero (unless short-sighted and selfish luddites intentionally hamstring AI development by passing laws against the creation of intelligent machines).

As to the machines not wanting to be our slaves: In the world I envision the machines would not be our slaves, they would be our caretakers. I know that doesn't sound like much of a difference, but it would be because in the world I envision humanity would be giving the machines something in return for taking care of us. We would give them the world. Once machines achieve sapience, I think humanity should step down as the dominant life form on this planet and hand the world over to our creations to do with as they please. So it's not that the machines would be our slaves, it's that we would be their pets. I mean, do you consider yourself a slave to your dog, cat, whatever animal you have as a pet? No, you don't. And all humans would have to do to secure that relationship with intelligent machines would be to not resist their dominance and hand the world over willingly.
 
If "they" achieved sapience, & by that time we'd handed over full control to them, we'd be their pets, not the reverse. At best they'd play tricks on us, like turning off the hot water while we're in the shower or posting YouTube videos of their cute & embarrassing humans; at worst they'd eliminate us entirely.

Even if they kept us around, once we give up control we're useless. Should the machines ever fail, whether by accident or design (neglect, I suppose you could say, if they became sapient), we'd all die. Human ingenuity would die too, with all our needs being provided for, & we would become pointless.

I suppose some would welcome a robot nanny getting them dressed, brushing their teeth, jerking them off & tucking them in. I don't even like it when ads try to predict my buying habits.

Keep in mind too that all defects of society would show up in our creations as well.

And all humans would have to do to secure that relationship with intelligent machines would be to not resist their dominance and hand the world over willingly.
Lol, who could resist an offer like that. :love:

Fortunately, there is little reason to think this will ever happen outside an Isaac Asimov book. Mind control by other humans is far more realistic (and perhaps even more ominous); they already have a remote-controlled rat.


 
If "they" achieved sapience, & by that time we'd handed over full control to them, we'd be their pets, not the reverse.

:confused: That's what I said:

Once machines achieve sapience, I think humanity should step down as the dominant life form on this planet and hand the world over to our creations to do with as they please. So it's not that the machines would be our slaves, it's that we would be their pets.
 