Computers in 2100 AD

2020:
Computers will still piss me off.

2030:
Computers will still piss me off.

2040:
Computers will still piss me off.

2050:
Computers will still piss me off.

2060:
Computers will still piss me off.

2070:
Computers will still piss me off.

2080:
I die and get a break from the goddamn machines.

2090:
Computers will resurrect me from secret government brain-tapping recordings.

2100 - the heat death of the universe:
I am continuously pissed off at my AI overlords.
 
At some point we will start merging with computers and have to exert a degree of willpower to not spend the rest of our lives living in shared, perfect-immersion fantasy worlds that cater to our second-deepest desires.

Also, computing will possibly get so powerful that we could simulate entire universes, which could mean the simulated universes might be able to evolve species that make computers that simulate universes. At some point in the future, gigantic AI computers could mine whole solar systems for materials and build massive planet-sized computers that could then simulate dozens, or infinitely many, or at least a few layers of universes, which of course leads to the obvious conclusion that if such a thing is possible, then statistically we are already a simulation.
 
At some point we will start merging with computers and have to exert a degree of willpower to not spend the rest of our lives living in shared, perfect-immersion fantasy worlds that cater to our second-deepest desires.

Also, computing will possibly get so powerful that we could simulate entire universes, which could mean the simulated universes might be able to evolve species that make computers that simulate universes. At some point in the future, gigantic AI computers could mine whole solar systems for materials and build massive planet-sized computers that could then simulate dozens, or infinitely many, or at least a few layers of universes, which of course leads to the obvious conclusion that if such a thing is possible, then statistically we are already a simulation.

The Wachowskis agree.
 
Also, computing will possibly get so powerful that we could simulate entire universes, which could mean the simulated universes might be able to evolve species that make computers that simulate universes. At some point in the future, gigantic AI computers could mine whole solar systems for materials and build massive planet-sized computers that could then simulate dozens, or infinitely many, or at least a few layers of universes, which of course leads to the obvious conclusion that if such a thing is possible, then statistically we are already a simulation.
The problem with this is information density. There is way more information in the universe than can be packed into a small, computer-sized area.
 
Is there? We have more neuron connections in our brains than there are atoms in (insert massive cosmic value). Surely with the right computing this could be done!
 
Is there? We have more neuron connections in our brains than there are atoms in (insert massive cosmic value). Surely with the right computing this could be done!
From wiki:
The adult human brain is estimated to contain from 10^14 to 5 × 10^14 (100-500 trillion) synapses.

That's about as many as there are atoms in an ovum...
 
Is there? We have more neuron connections in our brains than there are atoms in (insert massive cosmic value). Surely with the right computing this could be done!

It doesn't matter. The point is that in order to simulate the outside world, you have to simulate the computer - because the computer also influences the outside world. And you're not allowed any approximations, because of chaos theory: if you make even a tiny error, a short time later all of your predictions are completely wrong. This cartoon explains it: http://www.smbc-comics.com/index.php?db=comics&id=2108#comic
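To make that concrete, here's a toy Python sketch (my own example using the logistic map, not anything from the comic): two runs that start out differing by one part in a trillion are completely different after a few dozen steps.

```python
# Toy demonstration of sensitive dependence on initial conditions:
# iterate the chaotic logistic map x -> r*x*(1-x) from two starting
# points that differ by only 1e-12 and watch the trajectories diverge.
r = 4.0                      # r = 4 puts the logistic map in its chaotic regime
x_a, x_b = 0.4, 0.4 + 1e-12  # "true" state vs. a simulation with a tiny error

for step in range(1, 61):
    x_a = r * x_a * (1 - x_a)
    x_b = r * x_b * (1 - x_b)
    if step % 10 == 0:
        print(f"step {step:2d}: true={x_a:.6f}  simulated={x_b:.6f}  "
              f"error={abs(x_a - x_b):.2e}")

# By around step 40-50 the error is of order 1, i.e. the prediction is
# no better than a random guess. That's the chaos-theory objection.
```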
 
Quantum computers will take over way before 2100. Even though at the moment they are still at the early research stage, they are simply too good to ignore.

If you ignore Moore's Law for a moment, then transistors are linear. If you want a computer to run twice as fast you need twice as many bits or transistors (in reality, more like 2.5x due to inefficiencies in rerouting calculations). Moore's Law states that every 18 months you double the number of transistors on a given chip, so every 18 months a computer doubles in speed.
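To put rough numbers on that (a quick sketch assuming the 18-month doubling simply continues, which it almost certainly won't for 90 straight years):

```python
# Rough Moore's-Law arithmetic: speed multiplier after a given number of
# years if transistor count (and hence, per the argument above, speed)
# doubles every 18 months.
def speedup(years, doubling_months=18):
    return 2 ** (years * 12 / doubling_months)

for years in (10, 30, 89):   # 89 years = roughly the gap from this thread to 2100
    print(f"{years:2d} years -> about {speedup(years):.3g}x faster")
```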

Quantum computers, on the other hand, are exponential. Due to complicated physics and the fact that they do everything in parallel, adding an extra qubit (the quantum equivalent of a bit) to a quantum computer doubles its speed. (In reality it's more likely that 6-12 extra qubits are needed per doubling, due to even more complicated physics.) But what that means is that once they take off, quantum computers really take off. If you go from a 100-qubit to a 200-qubit quantum computer you don't make it run twice as fast, you make it run over 1,000 times faster. If you go from a 200-qubit to a 400-qubit qumpter (can we coin this as a new term for a quantum computer?) you make it run a million times faster.
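Back-of-the-envelope version of that claim, taking roughly 10 extra qubits per speed doubling as a stand-in for the 6-12 figure:

```python
# Speedup from adding qubits, under the assumption stated above:
# roughly one speed doubling per ~10 extra qubits.
QUBITS_PER_DOUBLING = 10

def quantum_speedup(old_qubits, new_qubits):
    doublings = (new_qubits - old_qubits) / QUBITS_PER_DOUBLING
    return 2 ** doublings

print(f"100 -> 200 qubits: ~{quantum_speedup(100, 200):,.0f}x faster")  # ~1,024x
print(f"200 -> 400 qubits: ~{quantum_speedup(200, 400):,.0f}x faster")  # ~1,048,576x
```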

If you say that a Moore's Law type of relationship exists for qubits as well as transistors, then what you get is that every 18 months the number of operations a computer can do squares. That is a ridiculously, massively huge rate of increase. Once they take off, quantum computers really take off.
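Chaining those two assumptions together (qubit counts doubling every 18 months, one speed doubling per ~10 extra qubits), the "squares every 18 months" growth looks like this:

```python
# If qubit count doubles every 18 months and speed doubles per ~10 qubits,
# then speed ~ 2**(qubits/10), so doubling the qubit count squares the speed.
qubits, speed = 100, 2 ** (100 / 10)   # starting point: 100 qubits, ~1000x baseline

for period in range(1, 5):             # four 18-month periods = 6 years
    qubits *= 2                        # Moore's-Law-style doubling of qubits
    speed = speed ** 2                 # doubling the exponent squares the speed
    print(f"after {period * 18} months: {qubits} qubits, ~{speed:.2e}x")
```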


Not only that, but for some specific tasks quantum computers require fewer steps than a normal computer would - for example, database mining or (famously) breaking certain types of encryption. But even if qumputers require more steps than normal computers for some tasks, because the rate of growth of quantum computers is so massive, it's only a matter of time before they can do even those tasks better than normal computers.
It's also worth pointing out that not many algorithms have been found that work to quantum computers' advantage, but it is almost inevitable that more will be found once we get closer to a working prototype.


In short, I don't know when quantum computers will appear in the 'mainstream', but when they do, they'll completely revolutionise everything.

Actually, it slowed down to 24 months, if I remember correctly. By the way, let's throw nanotechnology into the mix.
 
Actually, it slowed down to 24 months, if I remember correctly. By the way, let's throw nanotechnology into the mix.

At that rate we still get to atom-sized transistors by 2100.
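Rough check with made-up-but-plausible numbers (a ~32 nm process around the time of this thread, silicon atoms roughly 0.2 nm across, and density doubling every 24 months, so feature size shrinks by sqrt(2) per period):

```python
import math

# When does feature size hit atomic scale if density doubles every 24 months?
# Doubling density halves the area per transistor, so the linear feature
# size shrinks by a factor of sqrt(2) each 24-month period.
start_year, feature_nm = 2011, 32.0   # assumed ~32 nm process node around 2011
atom_nm = 0.2                         # rough diameter of a silicon atom

year = start_year
while feature_nm > atom_nm:
    feature_nm /= math.sqrt(2)
    year += 2

print(f"atom-sized features around {year} ({feature_nm:.2f} nm)")
# Prints a year around 2040 -- comfortably "by 2100", ignoring the small
# matter of whether single-atom transistors are physically practical.
```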
 
Also, computing will possibly get so powerful that we could simulate entire universes, which could mean the simulated universes might be able to evolve species that make computers that simulate universes. At some point in the future, gigantic AI computers could mine whole solar systems for materials and build massive planet-sized computers that could then simulate dozens, or infinitely many, or at least a few layers of universes, which of course leads to the obvious conclusion that if such a thing is possible, then statistically we are already a simulation.

That would be like The Sims, but bigger! :lol:
 
It doesn't matter. The point is that in order to simulate the outside world, you have to simulate the computer - because the computer also influences the outside world. And you're not allowed any approximations, because of chaos theory: if you make even a tiny error, a short time later all of your predictions are completely wrong. This cartoon explains it: http://www.smbc-comics.com/index.php?db=comics&id=2108#comic

Who says such a thing would be used for predictions?
 
The end of personal computing:

Nvidia begins designing a GPU that requires a nuclear power plant to run, but runs out of energy mid-design.
 
The end of personal computing:

Nvidia begins designing a GPU that requires a nuclear power plant to run, but runs out of energy mid-design.
I don't think Nvidia will make a GPU that uses more than 10 gigawatts (I think...)
 
Well, consider: the energy you put in, you have to dissipate as heat. So GW in, GW out. You'd need a huge heatsink to get rid of all that heat, and we're talking cooling-tower size just for idle temps.
 
Well, consider: the energy you put in, you have to dissipate as heat. So GW in, GW out. You'd need a huge heatsink to get rid of all that heat, and we're talking cooling-tower size just for idle temps.

I'm pretty confident Nvidia isn't going to breach 1 kilowatt soon actually, but you never know
 
Any sufficiently advanced technology is indistinguishable from magic. Fifty years ago they would have restarted the witch trials if you showed someone a cellphone.

No, they wouldn't. Besides the fact that it would probably run out of battery before you could show anyone, it wouldn't work anyway due to the lack of cell networks :D
 
Well, consider: the energy you put in, you have to dissipate as heat. So GW in, GW out. You'd need a huge heatsink to get rid of all that heat, and we're talking cooling-tower size just for idle temps.

Yes, but refrigeration tech would also improve, so the increased heat would not be a problem.
 
No, they wouldn't. Besides the fact that it would probably run out of battery before you could show anyone, it wouldn't work anyway due to the lack of cell networks :D

It was hyperbole. Also, charging it would not be too much of an issue; I could probably wire together an adapter using 50-year-old parts just fine.

Yes, but refrigeration tech would also improve, so the increased heat would not be a problem.

You would still need to move 10 GW of heat away from the processor. There's no material that can transfer that much heat across something the size of an average GPU die. Not only that, but 10 GW is a helluva lot of energy per hour: that works out to 36 terajoules every hour, whereas the Hiroshima A-bomb released about 60 terajoules in total.

You can't just say "oh well, someone will come up with something" when in reality, the solution would have to break some laws of physics.
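Quick sanity check on those numbers, taking the Hiroshima yield as roughly 15 kilotons of TNT:

```python
# Energy dissipated by a hypothetical 10 GW GPU over one hour,
# compared with the Hiroshima bomb (~15 kilotons of TNT).
power_watts = 10e9                         # 10 GW
energy_per_hour_j = power_watts * 3600     # watts * seconds = joules

tnt_joules_per_kiloton = 4.184e12
hiroshima_j = 15 * tnt_joules_per_kiloton  # ~6.3e13 J, i.e. ~63 TJ

print(f"GPU, one hour : {energy_per_hour_j / 1e12:.0f} TJ")      # 36 TJ
print(f"Hiroshima bomb: {hiroshima_j / 1e12:.0f} TJ")            # ~63 TJ
print(f"ratio         : {energy_per_hour_j / hiroshima_j:.2f}")  # ~0.57
```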
 
I'd have to wonder what screen resolution that 10 GW of power could run in, say, the current generation. Is there a max limit on the number of GPUs that can be linked together in an SLI-like format?
 
I'd have to wonder what screen resolution that 10 GW of power could run in, say, the current generation. Is there a max limit on the number of GPUs that can be linked together in an SLI-like format?

Only with current technology, but we're talking about 89 years in the future!
 