Machines taking over in 2045?

Moore's Law is not a fact, and certainly not a physical law. It is not compatible with the laws of physics, at least if it is assumed to continue indefinitely. It will fail eventually, probably long before reaching the theoretical limit where the computer is a black hole.

Plus, the word "law" in science is used only to describe things that we're certain are true.

Moore's law is probably best described as a self-fulfilling prophecy.

That's an interesting way to describe it, I like it.

BTW, if someone in the early nineties declared the Law of China's GDP growth ("China's GDP will grow roughly by 9% every year, give or take one percentage point, indefinitely"), it would still be "valid" today, even though everybody knows that it cannot continue forever.
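
Just to put "indefinitely" in numbers, here's a minimal compounding sketch (the starting GDP and world-GDP figures are round-number assumptions, not sourced data):

```python
# Minimal sketch: why a permanent "9% per year" law has to break eventually.
# All figures are rough, round-number assumptions for illustration only.
start_gdp = 0.4e12        # assumed China GDP in the early nineties, in USD
world_gdp_today = 100e12  # assumed rough world GDP today, in USD

gdp = start_gdp
for year in range(1, 201):
    gdp *= 1.09  # the "law": 9% growth, every year, forever
    if gdp > world_gdp_today:
        print(f"After {year} years the 'law' exceeds today's entire world GDP.")
        break
```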

[attached chart: China's GDP growth]


A point that is rarely brought up in this debate is that our current increase in computational power relies only on miniaturization. We're already close to the orders of magnitude where forces other than the electromagnetic force we currently rely on come into effect - transistors can't be made as thin as we want and still work.

There are of course alternatives, but they're not even remotely close to practical application, and certainly won't fall under Moore's law.
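
To put the miniaturization point in rough numbers, here's a back-of-the-envelope sketch (the feature size and atom size are ballpark assumptions, not exact process data):

```python
import math

# Back-of-the-envelope sketch: how much linear shrinking is left before a
# transistor feature approaches atomic dimensions. All figures are ballpark
# assumptions for illustration only.
current_feature_nm = 20.0  # assumed rough feature size of a modern transistor
silicon_atom_nm = 0.2      # rough diameter of a silicon atom

shrink_factor_left = current_feature_nm / silicon_atom_nm   # ~100x linear
density_doublings_left = 2 * math.log2(shrink_factor_left)  # area ~ (linear)^2
print(f"About {shrink_factor_left:.0f}x of linear shrinking left, i.e. roughly "
      f"{density_doublings_left:.0f} more density doublings before features "
      "are literally one atom wide.")
```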

Exactly. In any case, there is this missing link between computational power and actual creative intelligence that we have no idea how to simulate.
 
I agree with Winner that this singularity stuff is nonsense, and that trying to put a date on it is extra nonsense.

But I disagree there is any reason for progress to slow down. This talk about "easy paths being exhausted" has been brought up since the early 20th Century. We will simply open up new paths, and I expect human progress to grow at a great pace as long as we don't kill ourselves.

And yeah, maybe one day we will have superior A.I.s and transplant our brains to machines and live forever. But it won't happen in 2045, that's just ridiculous. We have no way of knowing when that will happen; what we do know is that that day is very far off and that every single one of us is going to die. Yep, better accept that fact and live a happy, fun and good life. Because it is 100% guaranteed it won't last forever.
 
But I disagree there is any reason for progress to slow down. This talk about "easy paths being exhausted" has been brought up since the early 20th Century. We will simply open up new paths, and I expect human progress to grow at a great pace as long as we don't kill ourselves.

It must slow down, it's inevitable. Unless you want to contend that the human population could possibly grow on Earth forever or that when the population growth stops it will be replaced by a corresponding growth in average human intelligence, you must accept that the current rapid expansion of knowledge is just a temporary flare-up. For most of human history, the rate of progress was far, far slower.

"Easy paths" are indeed getting exhausted. We have a few cards up our sleeve still (biotechnology, nanotechnology, etc.) that promise a lot of applications, but the deeper we go, the harder it becomes to make "simple" inventions whose applications make one a lot of money. Eventually, the cost of doing cutting-edge research will grow so high that it will stifle it. At the same time, as the population stabilizes (and with it the customer base) and as we are forced to constrain our consumption by the limits set by the physical environment we live in, there will be no more room for expansion - at least not on Earth alone.

Growth and technological progress will of course not stop completely, but the periods "between inventions" will get longer and longer, as the rate of discovery gradually returns to pre-industrial levels. Picture it as a Gauss curve - we're getting close to the top.
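
To make that picture concrete, here's a toy sketch: cumulative knowledge follows an S-shaped (logistic) curve, and the yearly rate of discovery is then the bell-shaped hump I'm describing (all parameters are made up for illustration):

```python
import math

# Toy model: cumulative knowledge as a logistic (S-shaped) curve. The yearly
# rate of discovery is then a bell-shaped hump that rises, peaks and falls
# back towards pre-industrial levels. All parameters are made up.
def knowledge(year, midpoint=2050, steepness=0.03, ceiling=100.0):
    return ceiling / (1.0 + math.exp(-steepness * (year - midpoint)))

for year in range(1800, 2301, 50):
    rate = knowledge(year + 1) - knowledge(year)  # "discoveries per year"
    print(f"{year}: rate of discovery ~ {rate:.3f}")
```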
 
It must slow down, it's inevitable. Unless you want to contend that the human population could possibly grow on Earth forever or that when the population growth stops it will be replaced by a corresponding growth in average human intelligence, you must accept that the current rapid expansion of knowledge is just a temporary flare-up. For most of human history, the rate of progress was far, far slower.

"Easy paths" are indeed getting exhausted. We have a few cards up our sleeve still (biotechnology, nanotechnology, etc.) that promise a lot of applications, but the deeper we go, the harder it becomes to make "simple" inventions whose applications make one a lot of money. Eventually, the cost of doing cutting-edge research will grow so high that it will stifle it. At the same time, as the population stabilizes (and with it the customer base) and as we are forced to constrain our consumption by the limits set by the physical environment we live in, there will be no more room for expansion - at least not on Earth alone.

Growth and technological progress will of course not stop completely, but the periods "between inventions" will get longer and longer, as the rate of discovery gradually returns to pre-industrial levels. Picture it as a Gauss curve - we're getting close to the top.

I disagree with your premise that population growth is needed for technological advancement. The proportion of the human population currently doing research is ridiculously small, which means that the number of researchers can keep growing for a very, very long time even after the overall population has stagnated. Not only that, a single researcher today is more productive than a single researcher 50 years ago, because he has all previous technology at his disposal; hence, even if the number of researchers were to stagnate (which there is no reason to expect in the foreseeable future - and indeed for centuries to come), progress could still accelerate.

As for resources, we're still at an early level of utilization IMO. How much of the sun's energy (that reaches the Earth) are we currently capable of using? There are gigantic untapped opportunities out there that dwarf everything we've accomplished in 200,000 years as a species.
 
By Moore's law we can predict the power of computation over the decades, and by using simple arithmetic we can figure out that the functions of the human brain can be handled by a microscopic portion of that power. Thus, the power will be exponentially greater than that of the whole of humanity; it just needs the proper "self-aware algorithms".

Just need some algorithms? Those algorithms represent 90% of the puzzle! Processing power is *nothing* if you don't know how to use it.

That's why I don't like Kurzweil - he seems to focus on the processing-power part of the equation, when the simple fact is that this processing power isn't going to magically turn into a human brain one day. You need a lot more for that - and we don't even know where to begin. There's nothing to be excited about until we start figuring that part out - the processing power is just going to be a tool to help us, not the main piece of the puzzle at all.
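
For what it's worth, the "simple arithmetic" in the quote presumably looks something like the sketch below. The brain-equivalent figure is a commonly cited but very contested assumption, and the result says nothing about the missing algorithms, which is exactly the point:

```python
# Sketch of the back-of-the-envelope arithmetic behind the quoted claim.
# Both figures below are rough assumptions; raw operations per second say
# nothing about the "self-aware algorithms" the quote hand-waves away.
brain_ops_per_sec = 1e16    # assumed (and contested) estimate for one human brain
machine_ops_per_sec = 1e13  # assumed ~10 TFLOPS for a current machine
doubling_period_years = 2   # classic Moore's-law cadence, assumed to keep holding

years = 0
ops = machine_ops_per_sec
while ops < brain_ops_per_sec:
    ops *= 2
    years += doubling_period_years
print(f"Under these assumptions, hardware 'parity' in ~{years} years - "
      "with zero progress on the algorithms.")
```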
 
2045 won't have computing power that surpasses a single human brain, let alone all of them, let alone for anywhere near that price range. The singularity, and artificial intelligence, are subjects of exaggerated hype for the pseudo-scientific with no proper understanding of how computing works. A computer will not be made aware by human hands for a long, long time. And shame on you for using Watson as an example, it's almost as hollow an argument as mentioning CleverBot. It only serves to undermine your own position.

You don't understand Moore's law, which is fact.

Your first mistake was presuming Moore's Law is a law at all. It's a prediction, based on observation, made by a single man and has, mind you, slowed down over time. It is based on no rigorous scientific process whatsoever.
 
Your first mistake was presuming Moore's Law is a law at all. It's a prediction, based on observation, made by a single man and has, mind you, slowed down over time. It is based on no rigorous scientific process whatsoever.

I never claimed it is a scientific law, only that it has held as fact for decades, and the industry is continuing the trend and predicts it will continue for many decades to come. People are mistaken in thinking the growth of computational power will slump; Intel is moving into different types of processors, hell, even some with radical new designs, that already show promise of speeding up Moore's Law.
 
2045? Machines are becoming more like humans every day. Already they seem to have picked up our slacking....I remember when the takeover was scheduled for early 21st century. But it appears that that has been postponed.
 
2045? Machines are becoming more like humans every day. Already they seem to have picked up our slacking....I remember when the takeover was scheduled for early 21st century. But it appears that that has been postponed.

If you want to say machines take over the world, they already kind of have. We can't do anything without a computer of some scale now, and when they fail we are stuck in a ditch.
 
2045? Machines are becoming more like humans every day. Already they seem to have picked up our slacking....I remember when the takeover was scheduled for early 21st century. But it appears that that has been postponed.

:lol:
 
I disagree with your premise that population growth is needed for technological advancement.

That is not exactly what I am saying. In one of my previous posts I stated that what also matters is the fraction of the population with the education required to do useful research.

However, when the population stabilizes and stops growing, and most people around the world attain the educational standards common in the developed world today, there won't be any room for further expansion.

The proportion of the human population currently doing research is ridiculously small, which means that the number of researchers can keep growing for a very, very long time even after the overall population has stagnated. Not only that, a single researcher today is more productive than a single researcher 50 years ago, because he has all previous technology at his disposal; hence, even if the number of researchers were to stagnate (which there is no reason to expect in the foreseeable future - and indeed for centuries to come), progress could still accelerate.

I believe the contrary is true. Today's inventions are much more complicated and require much more initial investment (both in terms of money and education). 100 years ago, you could discover a new physical phenomenon in the lab you had in your basement. Today, you need something like the Large Hadron Collider, an enormous and very expensive instrument that took decades to set up.

I also don't believe that you can simply translate the number of researchers into the number of inventions. As I said, today's inventions take much more effort ("research points", in gaming terminology) to complete, so usually research is a collective effort, with large international teams working on one problem. Hence, to make a new invention today requires more people researching it. So far, we've been able to supply more researchers and more money to keep up the rate of discovery, but again, this won't continue forever.

Also, research is a general term which encompasses many different activities. I am sure thousands of research workers are trying to develop a luminescent lipstick or something equally extravagant, although with their qualifications they could just as well be working on a new chemotherapy for cancer or something equally beneficial. Worse, basic research is getting both more difficult and less attractive for the main investors. It's possible that in the future we'll focus more on developing applications of existing knowledge without really trying to go for breakthroughs that would open entirely new paths of research. (That is basically what the Middle Ages were about - people worked with existing knowledge and pretty much perfected its applications - the notion that the Middle Ages were times when no progress was being made is entirely false; it was just a lot slower than it is today.)

As for resources, we're still at an early level of utilization IMO. How much of the sun's energy (that reaches the Earth) are we currently capable of using? There are gigantic untapped opportunities out there that dwarf everything we've accomplished in 200,000 years as a species.

Actually, we're getting close to the limit of what we can squeeze from the environment. We're already using more than can sustainably be extracted.

But in any case, even if resources on Earth were unlimited, humans can't really consume infinite amounts of products and services. Already people in most developed countries are living pretty comfortable lives. Even if they could consume more, would they? In fact, progress in science and technology can make people consume less. For example, say a Matrix-like virtual reality network is created. How would this affect the airlines, the tourism industry, and other "luxury" services? Why would I pay a lot of money to spend two weeks in the Maldives, if I could just do the travelling from my home and the experience was almost the same? Actual physical travel would sharply decline then, so the overall impact on the economy, employment, etc. would be negative.

Now, in a world with limited resources (a much likelier scenario), your consumption will be restricted not only by the money you earn, but also by regulation designed to prevent a resource overdraw.

---

Really, we need to start getting used to the idea of a limited world with limited resources and try to make sure we can make the most of it. Pinning our hopes on the illusion of ever-increasing growth and ever-faster technological progress is folly.

Also, the sooner we expand beyond Earth, the better - it's the only way to reconcile the unlimited growth of humanity with the limitations of our environment.
 
Can't say that I'm overly impressed with advancements in processing power either - it's to be expected.

It's in the software part of the equation that something extremely remarkable needs to happen before we can figure out how to design and construct sentient-like machines, if that is possible at all. I'm pretty certain that the answers are not found in how 98% of the programmers and computer scientists in the world traditionally approach programming languages.
 
That seems to be very true, though it's well outside my field of knowledge. What I've heard is that the move towards multiple processors (quad core?) requires a different programming style than we're used to. Until then, we're making crude use of the increases in speed/processing power available. If there were some type of thinking revolution in this area, though, that could be a decent game-changer.

I tried to ask about this concept in this thread, but (like I say) I am not very familiar with the field.

luiz: keep in mind that Winner is saying that technological advancement will continue, even with a stabilizing population. It's just that certain exponential components will no longer be there (i.e., the increasing educated base). I think it's clear that if the number of educated workers is increasing, that will contribute to the total trend of exponential increase!

Winner: I like your comment about 'shiny lipstick'. This is so true. We get what we pay for. If we buy goods that require R&D, then we'll get R&D in those areas. So, it matters what we 'hire' the companies to produce, and so it matters what we buy. What's remarkable is the waste and inefficiency with regards to 'real' science. The beauty industry is a giant industry (obviously), but despite that, very little 'real' biological science is done as a proportion of the revenues. Game-changers in the beauty field would be ways of actually keeping skin young and muscles firm, right? But there's no spending by industry on these types of research.

What we buy matters. That said, that's part of the reason why I think there will be exponential growth in computing for some time. My family currently owns three laptops, three iPods, two cellphones, and a Wii. Each of these things gets replaced every 5 years or so. That's a lot of consuming power being directed at buying processing power. And I think that trend will continue for some time amongst the global population. By buying R&D goods that increase productivity, we create the trend of escalating returns.
 
Can't say that I'm overly impressed with advancements in processing power either - it's to be expected.

It's in the software part of the equation that something extremely remarkable needs to happen before we can figure out how to design and construct sentient-like machines, if that is possible at all. I'm pretty certain that the answers are not found in how 98% of the programmers and computer scientists in the world traditionally approach programming languages.

We really don't know how the human brain works (yet). It's hard to fault programmers for not translating our imperfect understanding of sentience into such machines.
 
IMO this is the best chance for reasonable AI development, coupled with this:

http://en.wikipedia.org/wiki/Blue_Brain_Project

http://io9.com/5832085/ibms-neurosynaptic-chips-are-the-closest-thing-to-a-synthetic-brain-yet

I agree that we need a socio-economic overhaul to adjust to empty/full-world economics, because we really can't grow something out of nothing (yet), but we haven't really come close to maximizing our energy and matter use on this planet alone, not to mention the solar system.
 
I never claimed it is a scientific law, only that it has held as fact for decades, and the industry is continuing the trend and predicts it will continue for many decades to come. People are mistaken in thinking the growth of computational power will slump; Intel is moving into different types of processors, hell, even some with radical new designs, that already show promise of speeding up Moore's Law.

You have no way of knowing how long it will continue or how accurately. This is like me walking down the street and saying that since I haven't tripped in the last fifty meters, I won't trip at all, and people who think I might are mistaken.

What I've heard is that the move towards multiple processors (quad core?) requires a different programming style than we're used to. Until then, we're making crude use of the increases in speed/processing power available. If there were some type of thinking revolution in this area, though, that could be a decent game-changer.

The thing with multicore programming is that each program will be executed just the same. The challenge with multicore programming is how you split up the work among the cores and keep them out of each other's hair. It doesn't affect programming style (other than programs using more kernel threads, which doesn't really have an effect on the AI problem either).
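
A minimal sketch of what "splitting up the work among the cores" looks like in practice, assuming an embarrassingly parallel task (the prime-counting workload is just a placeholder):

```python
# Minimal sketch of splitting an embarrassingly parallel workload across cores.
# The workload is a placeholder; each chunk runs exactly the same code, and the
# only real work is dividing the input and collecting the results.
from concurrent.futures import ProcessPoolExecutor

def count_primes(bounds):
    lo, hi = bounds
    def is_prime(n):
        return n > 1 and all(n % d for d in range(2, int(n ** 0.5) + 1))
    return sum(1 for n in range(lo, hi) if is_prime(n))

if __name__ == "__main__":
    chunks = [(i, i + 25_000) for i in range(2, 100_002, 25_000)]
    with ProcessPoolExecutor() as pool:  # one worker per core by default
        total = sum(pool.map(count_primes, chunks))
    print(f"primes up to 100,000: {total}")
```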

The technology that I've heard some people claim have an effect on AI research is memristor-related, but they've been more focused on machine learning than artificial intelligence.

I'm pretty certain that the answers are not found in how 98% of the programmers and computer scientists in the world traditionally approach programming languages.

It's not so much a problem for programmers or programming languages. If you want to do anything that really breaks with the traditional programming language, you'd need some kind of major organizational change, coupled with abandoning the Turing machine as the standard model.
 
Well, the more you go into the kernel, the more chance something messes up, which makes it more likely to BSOD, if I recall correctly.
 
That is not exactly what I am saying. In one of my previous posts I stated that what also matters is the fraction of the population with the education required to do useful research.

However, when the population stabilizes and stops growing, and most people around the world attain the educational standards common in the developed world today, there won't be any room for further expansion.
But the thing is, we are very, very far from a point where most people in the world attain the educational standards of the developed world. So there's plenty of room to grow even after the population stabilizes.

Not only that, there is no reason why the proportion of researchers in the developed world should remain constant. I think (and hope) that future economic pressures will greatly reduce the demand for lawyers, accountants, etc., and increase the demand for researchers. In short, I don't think a stagnation in the number of researchers need be a problem.

I believe the contrary is true. Today's inventions are much more complicated and require much more initial investment (both in terms of money and education). 100 years ago, you could discover a new physical phenomenon in the lab you had in your basement. Today, you need something like the Large Hadron Collider, an enormous and very expensive instrument that took decades to set up.

I also don't believe that you can simply translate the number of researchers into the number of inventions. As I said, today's inventions take much more effort ("research points", in gaming terminology) to complete, so usually research is a collective effort, with large international teams working on one problem. Hence, to make a new invention today requires more people researching it. So far, we've been able to supply more researchers and more money to keep up the rate of discovery, but again, this won't continue forever.
But empirically we have seen that returns on research are not diminishing. If what you said were true, then from the moment we started doing scientific research in the 17th century all the way to the present, returns should have been diminishing as innovations became more and more complex. But they have not!

The effects of "standing on the shoulders of giants" seem to compensate for the "exhaustion of easy paths".

It's undeniable that a researcher now is much more productive than in the past - this is an objective fact that can be measured by the number of patents per capita. How could he not be, with absurd processing power at his disposal, and all previous research readily available?

Also, research is a general term which encompasses many different activities. I am sure thousands of research workers are trying to develop a luminescent lipstick or something equally extravagant, although with their qualifications they could just as well be working on a new chemotherapy for cancer or something equally beneficial. Worse, basic research is getting both more difficult and less attractive for the main investors. It's possible that in the future we'll focus more on developing applications of existing knowledge without really trying to go for breakthroughs that would open entirely new paths of research. (That is basically what the Middle Ages were about - people worked with existing knowledge and pretty much perfected its applications - the notion that the Middle Ages were times when no progress was being made is entirely false; it was just a lot slower than it is today.)
You're underestimating the potential of breakthroughs while researching shiny lipstick! Throughout the history of science, major breakthroughs were achieved while researching something unrelated. You may start researching some fluorescent substance for lipsticks and end up discovering a cheap light-emitting substance with all sorts of practical applications.

And I see absolutely no risk of us moving towards mostly perfecting what we already have. Research in fields like physics is so theoretical and advanced these days that my head hurts just reading a paper, and I am an engineer.

Actually, we're getting close to the limit of what we can squeeze from the environment. We're already using more than can sustainably be extracted.
I don't think that can be demonstrated.

But in any case, even if resources on Earth were unlimited, humans can't really consume infinite amounts of products and services. Already people in most developed countries are living pretty comfortable lives. Even if they could consume more, would they? In fact, progress in science and technology can make people consume less. For example, say a Matrix-like virtual reality network is created. How would this affect the airlines, the tourism industry, and other "luxury" services? Why would I pay a lot of money to spend two weeks in the Maldives, if I could just do the travelling from my home and the experience was almost the same? Actual physical travel would sharply decline then, so the overall impact on the economy, employment, etc. would be negative.
A Matrix-like realistic virtual environment would be a major undertaking requiring huge investment and maintenance, both in hardware and software. A virtual trip to the Maldives might one day be cheaper than a real one, but it won't be free. A new economy would develop in the virtual world, and indeed total output would increase, not decrease.

As for the fact that the developed world has already reached comfortable standards of living and needs go no further, you have to keep in mind that comfort is a relative concept. A middle-class American family of the '50s also had a comfortable life. But the amount of resources needed to sustain their lifestyle - the amount of dollars needed per month - was vastly inferior to that of an American middle-class family of the present. A modern family will travel much more by plane, will own multiple flat-screen TVs with home theaters, laptops, videogames, iPads, kitchen appliances, etc. They will go to the gym, the kids will take jiu-jitsu and French and piano classes, and so on and so forth. Likewise, in the future all sorts of "new demands" will show up. Nobody needed a fast internet connection or an iPad 20 years ago, and life was already pretty good. But that doesn't mean people won't want them now.

Now, in a world with limited resources (a much likelier scenario), your consumption will be restricted not only by the money you earn, but also by regulation designed to prevent a resource overdraw.

Really, we need to start getting used to the idea of a limited world with limited resources and try to make sure we can make the most of it. Pinning our hopes on the illusion of ever-increasing growth and ever-faster technological progress is folly.

Also, the sooner we expand beyond Earth, the better - it's the only way to reconcile the unlimited growth of humanity with the limitations of our environment.
If you think about it, the one and only resource we truly need an ever-expanding supply of is energy. And as I said, there are gigantic untapped opportunities out there that can sustain our growth virtually forever.

luiz: keep in mind that Winner is saying that technological advancement will continue, even with a stabilizing population. It's just that certain exponential components will no longer be there (i.e., the increasing educated base). I think it's clear that if the number of educated workers is increasing, that will contribute to the total trend of exponential increase!
Sure, but as I said, there are two things to consider:

-We're still very, very far from a point where the whole world has developed-world levels of education, and thus there is great growth potential.

-There is no reason why the composition of the educated workforce should remain constant forever. In the present, a lot of people who go to college study law, political science, etc. Depending on the pressures of the future, more people could turn to research. In fact, if we look at rapidly developing nations like China and South Korea, we note that a very high proportion of their educated workforce is in technical careers, when compared to the West. To keep its competitive edge, the West will have to do the same.
 
We really don't know how the human brain works (yet). It's hard to fault programmers for not translating our imperfect understanding of sentience into such machines.
Agree completely - that was basically what I was trying to communicate. ;)

We basically don't have a clue how human (or any, for that matter) consciousness works, only a bunch of theories. Imo you can't mimic consciousness to a satisfying degree by 'traditional means' from the various disciplines of computer science.

Perhaps some young genius working in his parents' garage will stumble upon a moment of revelation at some date...
 
-We're still very, very far from a point where the whole world has developed-world levels of education, and thus there is great growth potential.
True! I think that, for tech development, the lowest-hanging fruit is developing-world education, health, and IQ. If we attend to childhood poverty, childhood disease, and education, we'll continue getting crop after crop of new researchers (and new people to buy the products of R&D). It's good bang for the buck for those looking to boost our tech curve.
-There is no reason why the composition of the educated workforce should remain constant forever. In the present, a lot of people who go to college study law, political science, etc. Depending on the pressures of the future, more people could turn to research. In fact, if we look at rapidly developing nations like China and South Korea, we note that a very high proportion of their educated workforce is in technical careers, when compared to the West. To keep its competitive edge, the West will have to do the same.
I'm not sure, but we live in different worlds. The 'developed' world has had a fairly stagnant ratio of R&D focus. Now, because the economies have been developing and growing (I like to think that some of this is due to R&D!), the 'total pie' has been growing, even if the ratio hasn't. I can't really argue for or against this intuition, though.

With regards to the Singularity, the idea is that the number of 'virtual people' can continue to grow (even exponentially) even if the population stagnates. Now, this needn't be actual persons (as AI), just that the computing power available will continue to increase and that the amount of work done by humans will (as a proportion) decrease. If I can buy processors that aren't much worse than a person (and I can find economic uses for those processors), then I can 'expand' my output even if there are no new people to hire.

The analogy to mechanical machines is pretty strong. As more and more work is done by machines, economic growth becomes dominated by improvements in machine technology. Adding people (without concomitant machinery) doesn't improve output nearly as much as buying better machines.

Right now, total computing in the world is (at the most conservative estimate) the same processing power as one person. Link. This is obviously not the way we think about the current contribution of processing power, but we can say there are 7 billion real people and 'one' virtual person as of 2010. If processing power keeps growing as it has (at 58% per year for general-purpose computing) for the next 20 years, there will be 7 billion (plus population growth) people and roughly 10,000 'virtual people'. Give the same trend another 25 years or so beyond that, and there will be a billion 'virtual people' worth of processing power. The 'virtual' population growth can then dominate.
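
A minimal sketch of that projection, taking the "one virtual person in 2010" baseline and the 58% yearly growth rate from above at face value (this is an extrapolation exercise, not a forecast):

```python
# Toy projection of the 'virtual population': start from the equivalent of one
# person's worth of processing power in 2010 and compound at 58% per year.
# Baseline and growth rate are taken from the post; everything else is a toy.
virtual_people = 1.0
growth = 1.58
for year in range(2011, 2061):
    virtual_people *= growth
    if year in (2030, 2045, 2055):
        print(f"{year}: ~{virtual_people:,.0f} 'virtual people'")
```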

I don't know if computing trends will continue for 30 years. But, as I said, I currently have 3 laptops, 2 cellphones, 3 iPods, and a Wii. I expect to replace them about every five years, and I expect other people to be like me as time goes on ...
 