Machines taking over in 2045?

Meh. I've heard this before. By then I'll be in my 50s, so it doesn't matter too much ;)
 
In the year twenty forty-five
The robopocalypse is going to arrive
All our thoughts and all of our dreams
Made obsolete by our washing machines
Whoa whoa

I like to think (It has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature, and
all watched over by machines of loving grace.

Richard Brautigan
 
I think it's only a matter of time before the Singularity, in some form or another, is reached. Whether it'll happen in my lifetime, I'll just have to wait and see. But the speed of technological development never fails to amaze me.

Today while fasting I actually found myself pondering this very topic. In short, I think an all-encompassing intelligence of the universe is impractical and unlikely, and the timeline quite exaggerated, but the basic idea is plausible, although far enough from my own lifetime that I can't really care much about it.

Also, I found this apt.

Link to video.

When watching this, I knew that there had to be some Gilbert and Sullivan in there.

I was right :D

(from 0:55)


Link to video.
 
I do (and it's far from being a "law"), and I don't see how it could be a relevant reply to what I said.

The tech singularity does not mean machines take over. Moore's Law, which has always been right, predicts that computational power will be astronomically high by that point, and with current software and A.I. trends it will happen. Probably sooner rather than later. I'm not seeing why you disbelieve in something that is certainly going to happen. It isn't like saying Jesus will return or any other religious mumbo jumbo. The tech singularity simply means an explosion of computational power that improves itself at speeds we cannot fathom.

The tech singularity you cry about is about as unlikely as the current technological level we have achieved, which as far as I'm concerned is fairly damn real.
 
The tech singularity does not mean machines take over. Moore's Law, which has always been right, predicts that computational power will be astronomically high by that point, and with current software and A.I. trends it will happen. Probably sooner rather than later. I'm not seeing why you disbelieve in something that is certainly going to happen. It isn't like saying Jesus will return or any other religious mumbo jumbo. The tech singularity simply means an explosion of computational power that improves itself at speeds we cannot fathom.

The tech singularity you cry about is about as unlikely as the current technological level we have achieved, which as far as I'm concerned is fairly damn real.

Computational power =/= thinking machines that will start inventing things (and even smarter thinking machines) on their own. You clearly don't understand what the "Church of the Singularity" claims will happen.

Also:

I'm not seeing why you disbelieve in something that is certainly going to happen.

:lol: We don't know for certain what will happen tomorrow, much less what will happen in decades. Extrapolating the future based on current trends can only get you so far. At best it gives you a possible vision of what the future might be.
 
I'm probably what people would consider a 'singularitarian', in that I expect this transition to artificial substrate to occur. This means that I perceive the possibility of seeing immortal humans as being viable, and so I'd like to see it sooner rather than later (in order to save more lives).

This is mostly because I buy into the 'law of accelerating returns', which basically states that we use science and technological improvement recursively, to increasingly build better R&D. I think that capitalistic competition helps drive this process as well as our current systems of scientific communication. As well, obviously, I believe that cognition can be carried on non-biological substrates.

I think Kurzweil is wrong in his timelines, because he's tacked on too many exponential trends without realising that some of those trends are sigmoidal (not exponential). The main one he's missing is the aging of the population. Right now, we have an expanding pool of 20-30 year olds, being drawn from the developing world. If we keep on adding to this group, it will look like an exponential increase in R&D. But this pool will eventually stabilise (and maybe decrease), and the tech-boost we get from 20-30 year olds will diminish. As more old fogies are added (in proportion), the rate of tech-adoption will decrease.
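
To make the difference between the two kinds of trend concrete: an exponential curve and a sigmoidal (logistic) curve look almost identical early on, and only diverge once the pool feeding the growth stops expanding. Here's a minimal sketch in Python, with made-up parameters chosen purely for illustration (not real demographic data):

Code:
import math

def exponential(t, x0=1.0, r=0.1):
    """Pure exponential growth: x0 * e^(r*t)."""
    return x0 * math.exp(r * t)

def logistic(t, x0=1.0, r=0.1, K=20.0):
    """Logistic (sigmoidal) growth towards a carrying capacity K.
    It tracks the exponential early on, then flattens out."""
    return K / (1 + (K / x0 - 1) * math.exp(-r * t))

# Early on the two curves are nearly indistinguishable;
# much later the logistic one has saturated near K.
for t in (0, 10, 20, 40, 80):
    print(f"t={t:2d}  exponential={exponential(t):9.1f}  logistic={logistic(t):5.1f}")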
 
Because older people are obviously incapable of innovation...?


Personally, I think the rate of technological progress will soon begin to slow down. By 2100, it will be much slower than it is today, as all "easily accessible" paths of research will be exhausted.

The rapid technological progress we perceive today is the result of a large increase of human population (which can't continue indefinitely; in fact, it has to stop in the next 30 years or we're screwed) coupled with the increase of the percentage of people who receive education.

As the population stabilizes and most people reach the educational standards of the developed world, there will be no more room for expansion. At the same time, the amount of knowledge one has to possess in order to produce innovation will get far bigger. 100 years ago, you only needed to be smart and have a basic education to come up with pretty revolutionary things and ideas. Today, you need a long, expensive education and access to state-of-the-art laboratories to produce new inventions. This in practice reduces the share of people in the educated population who are capable of producing innovation.

We've got used to fast-growing everything, but we'll soon find out that it was just a temporary phase. In fact, some claim we're already approaching the end of it, as the global economy is looking at a decade of near stagnation. I don't agree, I think there will be at least one additional period of rapid expansion, but when it's over, it will be over for good (at least unless we leave Earth and start expanding throughout the rest of the Universe).
 
Because older people are obviously incapable of innovation...?
They still can! We'll still have technological growth, just not the double-exponential growth Kurzweil expects. This next paragraph is basically the same theme as what I meant, though:
The rapid technological progress we perceive today is the result of a large increase of human population (which can't continue indefinitely; in fact, it has to stop in the next 30 years or we're screwed) coupled with the increase of the percentage of people who receive education.
We've currently got an expanding pool of people who're educated. The key word is 'expanding'. Once the expansion tapers off, we'll lose that exponential component to the trends. On this, we agree. We just talked about the same thing from different perspectives.
At the same time, the amount of knowledge one has to possess in order to produce innovation will get far bigger. 100 years ago, you only needed to be smart and have a basic education to come up with pretty revolutionary things and ideas. Today, you need a long, expensive education and access to state-of-the-art laboratories to produce new inventions. This in practice reduces the share of people in the educated population who are capable of producing innovation.

I don't know if this is true, because the tools by which we can make discoveries keep on getting better. If the tools improve, then it takes less time to teach people to make those discoveries. As well, we are getting better at bringing people 'up to speed' on a current science, because we excise old information and replace it with better information ... while the information itself stacks, the amount of total teaching required doesn't change much. In 1990, it took 4 years to give someone an advanced genetics education. In 2011, it still takes 4 years. The information is rather different. It's a more efficient packaging of information too.

Loosely, though, I agree. It does take more time to get someone's knowledge to 'the edge', with enough additional information to think of new discoveries. This might mean that we need to spend more time educating the inventors. But I don't know if this means society will need to put in a greater share of its effort, or if this increased investment (per inventor) is paid for out of the exponential trends.
 
Because older people are obviously incapable of innovation...?


Personally, I think the rate of technological progress will soon begin to slow down. By 2100, it will be much slower than it is today, as all "easily accessible" paths of research will be exhausted.

The rapid technological progress we perceive today is the result of a large increase of human population (which can't continue indefinitely; in fact, it has to stop in the next 30 years or we're screwed) coupled with the increase of the percentage of people who receive education.

As the population stabilizes and most people reach the educational standards of the developed world, there will be no more room for expansion. At the same time, the amount of knowledge one has to possess in order to produce innovation will get far bigger. 100 years ago, you only needed to be smart and have a basic education to come up with pretty revolutionary things and ideas. Today, you need a long, expensive education and access to state-of-the-art laboratories to produce new inventions. This in practice reduces the share of people in the educated population who are capable of producing innovation.

We've got used to fast-growing everything, but we'll soon find out that it was just a temporary phase. In fact, some claim we're already approaching the end of it, as the global economy is looking at a decade of near stagnation. I don't agree, I think there will be at least one additional period of rapid expansion, but when it's over, it will be over for good (at least unless we leave Earth and start expanding throughout the rest of the Universe).

Ah, but if we attain a number of BCI inventions, or even something to complement our wet and patchy memory, like implantable NOM (non-organic memory), you could download the current progress in a certain field in the blink of an eye.
 
Ah, but if we attain a number of BCI inventions, or even something to complement our wet and patchy memory, like implantable NOM (non-organic memory), you could download the current progress in a certain field in the blink of an eye.

Too bad we currently have no bloody idea how to do anything even remotely similar to that :) Since our memory is basically stored in the 3-dimensional structure of our brain - in the connections our neurons make with each other - it's very doubtful we'll be able to connect our brain to some external memory source, at least not in the foreseeable future.
 
Too bad we currently have no bloody idea how to do anything even remotely similar to that :) Since our memory is basically stored in the 3-dimensional structure of our brain - in the connections our neurons make with each other - it's very doubtful we'll be able to connect our brain to some external memory source, at least not in the foreseeable future.

I might go so far as to say that we already are connected to the largest external memory in the world, the internet.
 
I might go so far as to say that we already are connected to the largest external memory in the world, the internet.

That's hardly the same as what Kozmos was talking about. The Internet is a wonderful thing, but it's no different in principle from a library or any other external repository of information and knowledge.

If I want to learn something, I can Google or Wiki it, but I still have to read and absorb the information, which can be quite time consuming. If it was possible to just push a button and upload the info directly to my brain, it would be great, but I am afraid that's not possible and it won't be possible for a very, very, very long time.

(and maybe that's good - can you imagine what it would do to the world? Practically all skilled workers would suddenly lose the value they now have on the jobs market, because anybody could achieve their level of competence by merely uploading the skills to their brains. Schools would become obsolete, and with them the teachers. The consequences would be enormous.)
 
Moore's law is a fact for now, but what does that have to do with anything?

By Moore's law we can predict the power of computation over the decades, and by using simple arithmetic we are able to figure out that the functions of the human brain can be handled by microscopic portions of that power. Thus, the power will be exponentially greater than that of the whole of humanity; it just needs the proper "self-aware algorithms".
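
For what it's worth, the "simple arithmetic" behind this kind of claim usually looks something like the sketch below. Every figure in it is an assumption (the ~10^16 operations-per-second brain estimate is one commonly quoted but heavily disputed number, and the starting compute and doubling period are illustrative), so treat it as an extrapolation exercise rather than a prediction:

Code:
# All figures below are illustrative assumptions, not measurements.
current_flops = 1e16        # assumed: roughly a top supercomputer circa 2011 (~10 PFLOPS)
brain_ops_per_sec = 1e16    # one commonly quoted (and disputed) brain-equivalent estimate
doubling_years = 2.0        # assumed Moore's-law-style doubling period

def compute_after(years):
    """Naive exponential extrapolation of available computational power."""
    return current_flops * 2 ** (years / doubling_years)

for years in (0, 10, 20, 34):  # 34 years out from 2011 lands on 2045
    ratio = compute_after(years) / brain_ops_per_sec
    print(f"+{years:2d} years: ~{ratio:,.0f} times the assumed brain estimate")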
 
By Moore's law we can predict the power of computation over the decades, and by using simple arithmetic we are able to figure out that the functions of the human brain can be handled by microscopic portions of that power. Thus, the power will be exponentially greater than that of the whole of humanity; it just needs the proper "self-aware algorithms".

We will also shave ourselves with 14-blade razors by 2100!!!

14razorblades.gif
 
We will also shave ourselves with 14-blade razors by 2100!!!

14razorblades.gif

Those figures mean nothing, as we all know four blades is optimal. Computers, on the other hand, HAVE been following the same path and will continue to do so thanks to new innovations. There is no "if" when it comes to the magnitudes of increased computational power; there is only the time between now and then.
 
A point that is rarely brought up in this debate is that our current increase in computational power relies only on miniaturization. We're already close to the scales where forces other than the electromagnetic force we currently rely on come into effect - transistors can't be made as thin as we want and still work.

There are of course alternatives, but they're not even remotely close to practical application, and certainly won't fall under Moore's law.
 
Moore's Law is not a fact, and certainly not a physical law. It is not compatible with the laws of physics, at least if it is assumed to continue indefinitely. It will fail eventually, probably long before reaching the theoretical limit where the computer is a black hole.


Moore's law is probably best described as a self-fulfilling prophecy. It sets the goalposts for researchers and engineers to try to meet rather than predicting the inevitable. Strictly speaking, it has not consistently held true. There have been periods when developments were faster than expected, and others when they were slower. The rate of increase in processing capacity has actually been decreasing since its peak back in 1998.
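
As a back-of-the-envelope check on what the long-run average looks like, here is the rough arithmetic using two widely cited transistor counts (the figures are approximate and only meant to illustrate the calculation, not to settle whether the pace is currently speeding up or slowing down):

Code:
import math

# Rough, widely cited transistor counts (approximate, for illustration only):
year_a, transistors_a = 1971, 2_300            # Intel 4004
year_b, transistors_b = 2011, 2_300_000_000    # a high-end desktop CPU of the early 2010s

doublings = math.log2(transistors_b / transistors_a)
span = year_b - year_a
print(f"{doublings:.1f} doublings over {span} years "
      f"=> roughly {span / doublings:.1f} years per doubling on average")
# A long-run average like this says nothing about whether the pace is
# currently accelerating or slowing, which is the point made above.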
 