MarineCorps said:
What exactly is so bad about raw GHz?
Well, raw GHz and nothing else... does absolutely nothing!
In fact, processing power has rarely been the bottleneck in computing; memory has. If you cranked up the CPU frequency but kept cache and memory speeds the same, overall performance would not change much.
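To put a rough number on that waiting, here is a back-of-the-envelope sketch. The 4 GHz clock and 100 ns DRAM latency are assumed, typical ballpark figures for illustration, not measurements of any particular chip:

```python
# Back-of-the-envelope: cycles a CPU core stalls on one main-memory access.
# Both figures below are assumed "typical" values, for illustration only.
cpu_ghz = 4.0            # core clock in GHz, i.e. cycles per nanosecond
dram_latency_ns = 100.0  # assumed round-trip latency of a DRAM access

# Every nanosecond at 4 GHz is 4 clock cycles, so one access that goes
# all the way to DRAM costs the core this many cycles of doing nothing:
stall_cycles = cpu_ghz * dram_latency_ns
print(stall_cycles)  # 400.0 cycles stalled per DRAM access
```

Note that doubling the clock to 8 GHz would simply double the stall to 800 cycles: the faster core just waits faster.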
In an ideal world, the memory would perform calculations without a standalone CPU - combining logic and storage into one unit would slash the waiting time between operations.
The closest consumers have ever come to that is large CPU caches.
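The usual way to see why caches close that gap is the average memory access time formula, AMAT = hit time + miss rate × miss penalty. A minimal sketch with assumed, purely illustrative numbers:

```python
# Average memory access time (AMAT) with a big vs a small cache.
# All latencies and miss rates are assumed, illustrative figures.
hit_cycles = 4             # assumed latency of a cache hit
miss_penalty_cycles = 200  # assumed cost of going out to DRAM
big_cache_miss_rate = 0.05   # large cache: 5% of accesses miss
small_cache_miss_rate = 0.50 # small cache: half of all accesses miss

amat_big = hit_cycles + big_cache_miss_rate * miss_penalty_cycles
amat_small = hit_cycles + small_cache_miss_rate * miss_penalty_cycles
print(amat_big, amat_small)  # 14.0 vs 104.0 cycles per access
```

With these (assumed) numbers, the bigger cache makes the average access roughly seven times cheaper without touching the clock speed at all.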
AMD emphasise caches more than Intel do. I guess Intel want to avoid memory becoming the focus of attention because it isn't their area of expertise, and AMD haven't much choice: they have never really outperformed Intel in raw GHz.
We are bound by the agenda of these companies, and it looks unlikely that anyone outside academic circles is going to be interested in producing self-sufficient memory anytime soon.
Incidentally, SGRAM is a variant of SDRAM used in some graphics cards. It can perform very simple operations on data bits (block writes and per-bit write masks), freeing the graphics processor to do other work. This improves overall performance by about 10% to 20% (at the same frequencies), but it is expensive and thus very rare, especially given that graphics cards have a shelf life of only a few months. I have only once seen DDR-SGRAM!