Why did Moore's Law Stop?

Narz
Just lack of motivation to be more efficient? Hard drives on new laptops should be like 256TB.
 
It's hard to make things arbitrarily small, like transistors on a <10 nm scale. Eventually even quantum tunneling gets in and starts messing with your electrons. Computing power is still going up fairly quickly, but not quite at the doubles-every-2-years level anymore. I still can't rule out some sort of innovation keeping it going for yet another decade, though, FWIW.

The more interesting question is something like: why did Moore's law continue for as long as it did? There are no other fields, AFAIK, where a doubling rate of 2 years has been sustained for over half a century. A variety of techs built on each other like stepladders. Wiki has an overview of some of the techs that enabled this to keep going.
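
To put that doubling rate in perspective, here's a minimal back-of-the-envelope sketch in Python (the 50-year span is just an assumed round number for illustration, not a precise figure):

# Back-of-the-envelope: what a 2-year doubling time compounds to.
# The 50-year span is an assumed round number, not a measured figure.
years = 50
doubling_time = 2  # years per doubling, the classic Moore's law rate

doublings = years / doubling_time   # 25 doublings
growth_factor = 2 ** doublings      # 2**25 = 33,554,432

print(f"{doublings:.0f} doublings -> ~{growth_factor:,.0f}x growth")
# prints: 25 doublings -> ~33,554,432x growth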

Contrast with other technology. Jet travel for the masses got as fast as it was ever going to get by the 1960s; from then until now, the fastest speed to get from point A to point B has been about Mach 0.85. Supersonic transport was the "inevitable" future tech that never happened in any commercially viable way, despite UK and French efforts to keep Concorde in the air. Even solar power, while it has been growing impressively in part due to parallel developments in computing, has not grown at anything like Moore's Law rates because you have to build out a bunch of expensive infrastructure too. Things like cancer research do happen, but mostly at an incremental pace with occasional breakthroughs, which sometimes turn an untreatable cancer into a treatable one but with only incremental impact on cancer mortality overall.

One interesting question to ponder: if electronics had turned out to be just another type of tech, and progress had slowed to a linear and incremental pace from about 1990 on, how much of a folk belief in progress would there still be? Almost everything "high-tech" we see is that way because of the impressive growth in electronics. Were it not for Moore's Law, we might really think that progress was fastest in the mid-20th century and was fairly stagnant since then, with incremental but non-revolutionary change in most fields except genomics and a few others.
 
Just lack of motivation to be more efficient? Hard drives on new laptops should be like 256TB.
I think lack of motivation may be part of it, on a commercial level. When it comes to computer memory, people just don't have a use for so much. Take portable music players: when I last owned an iPod, I didn't even come close to filling it with music, and I was a not-quite-obsessed music collector who wanted a huge library in his pocket. Today's devices actually have less memory than that one did. I don't know what people would do with a 256TB laptop, so it wouldn't have much added value over a typical laptop.
 
Games and applications development has reached a plateau where you don't need more CPU power, RAM, or disk capacity. In other words, hardware has surpassed content. However, Moore's law is still working with graphics cards, at least. The current Nvidia 10 series is about twice as powerful as the 9 series from 2-3 years ago. Reasons? 4K and VR.
 
Games and applications development has reached a plateau where you don't need more CPU power, RAM, or disk capacity. In other words, hardware has surpassed content. However, Moore's law is still working with graphics cards, at least. The current Nvidia 10 series is about twice as powerful as the 9 series from 2-3 years ago. Reasons? 4K and VR.
New games need ~8GB of RAM, and RAM, like storage requirements, is being bottlenecked by consoles. CPU core requirements have been bottlenecked for a long time because true hex-core+ CPUs have been out of most people's price range, since Intel has had no competition; expect that to change with Ryzen.
 
One interesting question to ponder: if electronics had turned out to be just another type of tech, and progress had slowed to a linear and incremental pace from about 1990 on, how much of a folk belief in progress would there still be? Almost everything "high-tech" we see is that way because of the impressive growth in electronics. Were it not for Moore's Law, we might really think that progress was fastest in the mid-20th century and was fairly stagnant since then, with incremental but non-revolutionary change in most fields except genomics and a few others.


I entirely agree.

My father used to cut out science reports from the UK Times newspaper for me to read in the 1960s.

It was quite clear to me then that 80% of the more promising advances were in two broad fields: firstly solid-state electronics and secondly molecular research in biology, supported by an occasional bit on astronomy or geology.
 
I think CPUs have reached the limit of what conventional transistors can do; to make them much faster they'd have to go to laser-based chips (chips that use lasers instead of electrical currents).
 
Games and applications development has reached a plateau where you don't need more CPU power, RAM, or disk capacity. In other words, hardware has surpassed content. However, Moore's law is still working with graphics cards, at least. The current Nvidia 10 series is about twice as powerful as the 9 series from 2-3 years ago. Reasons? 4K and VR.

Somewhat; CPUs have outpaced games. You can still run the newest games at the best settings on CPUs 4-5 years old. But on GPUs you are right: it's mostly because of higher resolutions, but also MSAA, which basically doubles or quadruples your resolution and then shrinks it back down to smooth out lines.

RAM: games are starting to use more, but it's kept pace. Two 8GB sticks are around $100; back in ~2013 two 4GB sticks were about that much, and in 2010 two 2GB sticks were about that price. Hard drives: no one needs that much space, I think, so there's no consumer demand, but SSD tech has been dropping drastically in price. I don't know about doubling every 2 years, but a 256GB top-end drive like a Samsung EVO is now around $100. A few years ago it would've been ~$200, and a few years before that it would've been maybe $500+. A tiny 60GB drive was like a couple hundred.
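
As a rough check on that trend, a quick Python sketch (the ~$500-to-~$100 drop and the ~6-year window are just the approximate figures from the post above, treated as assumptions):

# Rough check of the SSD price trend quoted above. The ~$500 -> ~$100 drop
# and the ~6-year window are the post's approximate figures, not exact data.
import math

old_price, new_price = 500, 100
years_elapsed = 6

halvings = math.log2(old_price / new_price)   # ~2.3 halvings
halving_time = years_elapsed / halvings       # ~2.6 years per halving

print(f"~{halving_time:.1f} years per price halving")
# prints: ~2.6 years per price halving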
 
Just lack of motivation to be more efficient?

Complete opposite. Moore's law isn't about efficiency - for that, look at performance per watt, which is still steadily increasing.

Hard drives on new laptops should be like 256TB.

Moore's law was about processing, not storage space.

I think lack of motivation may be part of it, on a commercial level. When it comes to computer memory, people just don't have a use for so much. Take portable music players: when I last owned an iPod, I didn't even come close to filling it with music, and I was a not-quite-obsessed music collector who wanted a huge library in his pocket. Today's devices actually have less memory than that one did. I don't know what people would do with a 256TB laptop, so it wouldn't have much added value over a typical laptop.

Not really applicable to enterprise storage.

A low-cost 256TB drive would save a ton of resources over building, powering and maintaining warehouses full of 60-drive 480 TB pods: https://www.backblaze.com/blog/open-source-data-storage-server/
 
One thing that's slowed us down is the lack of intuitions about how to program for multi-core processors. Once we learn how to tap that potential, and teach others how to, then things will zoom up again.
 
I don't think that's really true, modern programming languages have very strong concurrency and parallel processing capabilities. The limitations are really intrinsic computational ones - in the overhead that synchronization between parallel processes adds, and as a result of Amdahl's law.
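
To make the Amdahl's law point concrete, a minimal sketch in Python (the 95% parallel fraction is an assumed, illustrative number, not a measurement of any real workload):

# Minimal sketch of Amdahl's law: speedup(n) = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that can run in parallel.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup on `cores` cores for a given parallel fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

# Even a workload that is 95% parallel (an illustrative number) tops out
# around 20x, no matter how many cores you throw at it.
for n in (2, 4, 8, 64, 1024):
    print(n, round(amdahl_speedup(0.95, n), 1))
# prints: 2 1.9 / 4 3.5 / 8 5.9 / 64 15.4 / 1024 19.6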

From the human side, the limitations are really a) legacy code that isn't parallelized yet (see the massive effort it's taking for Firefox) and b) borderline quality developers that are simply never going to grok solving/debugging problems in parallel.

GPUs are essentially massively-parallel processors, but the problem sets that GPUs can be used for are limited.
 
That's a variant of how expanding an educated population can increase our tech growth, but we get less-than-exponential returns.
 
Complete opposite. Moore's law isn't about efficiency - for that, look at performance per watt, which is still steadily increasing.
My laptop now takes the same wattage as my laptop when I first joined CFC (15 years ago :ack: ). And it's still slow, I can't even play slither.io without lag. :(
 
Your laptop isn't really indicative of industry trends. Idle power and load power are both trending significantly downwards at every form factor, with smaller form factors showing larger performance gains, amplifying the overall performance per watt that's available.

My laptop draws about 4 watts while in use, and lasts over a month in standby.
 
So what happened to the bragging that the 2-year figure was no longer in effect, because the supreme American engineers were doubling capacity in less than a year and a half?

Yes, it was 18 months, because that's the exact midpoint between a year and two.
 
My laptop lasts 0 seconds in standby; I need to replace the kaput battery. However, it is very old and I don't know if it's worth it. On the other hand, it is a rock-solid Lenovo... Question: for the same price, do current laptops have the power of desktops from 4, 5, or 6 years ago?
 
Same price as what?

Comparing performance across years is tricky. If you're comparing long multithreaded workloads, it's basically the worst-case scenario for a laptop, as it will be thermally limited, so you're probably looking at something comparable to pre-Sandy Bridge (i.e. 2010) desktop performance. If you're looking at lightly-threaded, bursty workloads (e.g. web browsing), you're going to be looking at whatever single-core turbo clock your chip can do. Even 5-watt ultra-mobile chips that go into fanless laptops burst to 3.6 GHz, so you're looking at performance comparable to a similarly clocked desktop chip. (Plus ~10% IPC gains per generation.)
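
As a rough, assumed model of that lightly-threaded comparison (clock speed times ~10% IPC gain per generation, compounded; the specific chips and generation count below are hypothetical examples, not from the post):

# A rough, assumed model: single-thread performance as clock speed times
# ~10% IPC gain per generation, compounded.
def single_thread_perf(clock_ghz: float, generations_ahead: int,
                       ipc_gain_per_gen: float = 0.10) -> float:
    return clock_ghz * (1 + ipc_gain_per_gen) ** generations_ahead

# Hypothetical example: a 3.6 GHz fanless ultra-mobile chip roughly six
# generations after a 3.4 GHz Sandy Bridge-era desktop chip.
new = single_thread_perf(3.6, 6)   # ~6.4
old = single_thread_perf(3.4, 0)   # 3.4
print(f"~{new / old:.1f}x single-thread advantage")
# prints: ~1.9x single-thread advantage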

Furthermore, the storage bottleneck is super important. Any PC from the past decade with an SSD will deliver a better user experience than any other with spinning rust. PCIe storage is nice, but only up to ~5x faster (in the best case) than SATA, versus the orders of magnitude improvement you get by going from HDD to SSD.
 