Nvidia RTX 5000 Rumors Picking Up Speed as Leaks Claim to Reveal Price and Specs
You may need to empty the piggy bank to get Nvidia's next graphics card.
The Nvidia GeForce RTX 5080 is looming on the horizon, and while we don't have official word on next-generation graphics cards yet, rumors are already swirling about specs and pricing.
Take this with a grain of salt, but right now the Nvidia RTX 5090 is rumored to cost between $1,999 and $2,499, according to a video from popular leaker Moore's Law is Dead. The same video suggests the RTX 5080 may cost as much as $1,499. Compared to the RTX 4090 and RTX 4080, which launched at $1,599 and $1,199, respectively, this would be a massive price jump if the leaks are true.
For the RTX 5090 at least, this potentially high pricing could be due to inflated specs. The RTX 5090 is rumored to be a 600W part with 21,760 CUDA cores and 32GB of GDDR7 memory on a 512-bit bus, according to hardware leaker kopite7kimi on Twitter. If accurate, the RTX 5090 would be the first consumer GPU to hit the market with GDDR7 memory, which would be much faster than the GDDR6X memory found on the RTX 4090.
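For a rough sense of what that 512-bit GDDR7 bus could mean, here's a minimal back-of-the-envelope sketch of peak memory bandwidth (bus width divided by 8, times the per-pin data rate). The RTX 4090 figures (384-bit bus, 21Gbps GDDR6X) are its published specs; the 28Gbps GDDR7 data rate is purely an assumption, since the leak doesn't mention memory speed.

def peak_bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Theoretical peak memory bandwidth in GB/s:
    # (bus width in bits / 8 bits per byte) * per-pin data rate in Gbps
    return bus_width_bits / 8 * data_rate_gbps

rtx_4090 = peak_bandwidth_gb_s(384, 21.0)  # published GDDR6X spec -> 1,008 GB/s
rtx_5090 = peak_bandwidth_gb_s(512, 28.0)  # rumored bus width, assumed GDDR7 rate -> 1,792 GB/s
print(f"RTX 4090: {rtx_4090:.0f} GB/s, rumored RTX 5090: {rtx_5090:.0f} GB/s "
      f"({rtx_5090 / rtx_4090 - 1:.0%} more)")

Even at that conservative assumed GDDR7 rate, the wider bus alone would put the rumored card well ahead of the RTX 4090 on paper.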
We still don't know when Nvidia is going to grace us with the RTX 5090's presence, but it's increasingly likely that Nvidia's next graphics cards will launch early next year. After all, Nvidia CEO Jensen Huang is hosting a CES 2025 keynote, giving Team Green ample opportunity to announce a new generation of gaming graphics cards – which would be a first for CES.
I've reached out to Nvidia for comment, and I'll update this story if and when I hear back.
Could the RTX 5090 really cost $2,499?
Nvidia's top GPUs have been getting more expensive with each generation, so it wouldn't be a huge surprise if the RTX 5090 costs more than the RTX 4090. However, if the top-end of the leaked price range is accurate, it would mean nearly a $1,000 jump in price for the next-generation enthusiast graphics card.
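To make that comparison concrete, here's the simple launch-price arithmetic behind the claim, using only the figures quoted above (the RTX 50-series numbers are still leaks, not confirmed prices).

# Launch prices cited in this article: RTX 40-series are official, RTX 50-series are rumored.
launch_prices = {
    "RTX 4090": 1599,
    "RTX 4080": 1199,
    "RTX 5090 (rumored high end)": 2499,
    "RTX 5080 (rumored)": 1499,
}
print("RTX 5090 jump:", launch_prices["RTX 5090 (rumored high end)"] - launch_prices["RTX 4090"])  # 900
print("RTX 5080 jump:", launch_prices["RTX 5080 (rumored)"] - launch_prices["RTX 4080"])  # 300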
Assuming the rumors are accurate, that price would get you both more memory and faster memory than the RTX 4090 offers, which would be a massive boost to performance. Plus, the RTX 5090 would feature all the architectural improvements Nvidia introduced with Blackwell when it launched for data centers earlier this year.
Things look more problematic for the lower-end graphics cards, however. The Moore's Law is Dead video cites an anonymous source describing an internal Nvidia briefing that claimed the RTX 5070 would be competitive with the RTX 4070 Ti. Right now, the leaked pricing suggests the RTX 5070 may launch at $699 with 12GB of VRAM – $100 more than the RTX 4070 Super, which also comes with 12GB of VRAM.
Keep in mind that if Nvidia really is saving the RTX 5090 for CES, rather than its usual fall launch window, these leaks are very much subject to change. CES 2025 is still three months away, and a lot can change between now and then, especially when it comes to pricing.
Very hard to analyze your exact comparison case. But the bottom line is that you are GPU-limited in many games. Exactly what you said.
As this thread has some people whose knowledge goes far deeper than mine, can I ask for some advice?
What's the best CPU (~<200€) to go with my RX 6650 XT? I upgraded to that card last year (along with a new M.2 SSD) and it's time to do the same for my ancient i5-6600K CPU. It's not even just for Civ; it's struggling to run Unity at a reasonable speed these days. Looking around, it seems like my best options are the 5700X3D or the 7600X. The latter is better generally (although it will require new, more expensive RAM too), but the former is best for gaming, so it's what I'm leaning towards. My main worry is that the gaming advantage it provides will be bottlenecked by the GPU anyway. Is that likely to be the case? Or will I get better gaming performance with the 5700X3D over the 7600X even with an RX 6650 XT? CPU benchmarks always seem to use a way better GPU, so it's hard to judge which combos complement each other well.
On top of that, any recommendations for motherboards? That's something I'm also truly ignorant about. Is there any reason not to go with the cheapest one that's compatible with what I will put on it? Looking around, I think that's the B550 series for the 5700X3D and the B650 for the 7600X – is that correct?
On a purely technical comparison, the recommendation does not change. The 7600X will be the better CPU for games overall.
Thanks for the information!
I'm really not planning to upgrade again for at least 5 years, so I'm not sure having an AM5 platform is all that helpful (it might have moved on to AM6 by that point). I should clarify that I'm not chasing 120fps at 4K resolution on max settings here. It's enough for me to run current releases at 60fps on my 1080p monitor at highish settings, and to be able to play new releases (at minimum settings and 30fps) as they come out. Even with my ancient CPU, the only game I've really struggled with at a hardware level is Jedi: Survivor, which stutters a lot in some specific places. Things like Elden Ring or Cyberpunk run as smoothly as I'd like already. So I'm only really upgrading now because November is a good time of year to do so, and because I think I might be completely unable to run things coming out in the next year.
Do my very humble goals change your recommendations at all?
Even 1440p feels dated to me; I can never go back from 4k. But judging by the results of the Steam survey you posted, only 3.9% of people are using 4k on their primary display. That's a lot lower than I expected. Go adopt 4k everyone! And get an OLED display while you're at it. See the light!
I suspect it's that while the step down from 1440p might be very noticeable to you, when you stay at 1080p you don't notice it. That's the problem with upgrading (not just PCs, but much of life in general): habituation. You spend a bunch of cash, appreciate the improvement for a short while, then it becomes the new normal and the old way looks bad.
4K is outdated. Dual UHD wide screen is the way.
From a platform standpoint, the reason to go with the higher-end X series would be if you were planning to build a PC with lots of expansion (PCIe) cards, and IIRC many of the X series boards have more/faster M.2 slots for SSDs as well. But for the average person, no, it wouldn't make a difference at all in how you play Civ.
I've been on 1200p for close to 13 years because no one would make a 16:10 27" monitor, which was the next logical upgrade from 16:10 24", and the 16:10 30" options (at 2560x1600, which would be a real upgrade over 1920x1200) were $1000. It's still hard to find 16:10 upgrade options, and at a certain point I quit bothering to look for them. Though I've considered adding a second 16:10 24" monitor occasionally.
I have a hard time understanding how so many people are still playing on 1080p. It feels absolutely ancient to me.
Still in 1080p just because it's always felt sufficient to me and I've never known anything else. I have been hemming and hawing about picking up a bigger monitor for months now, though, so I might upgrade the resolution as well if I do so.
What really helps me in the 1080 - 1440 - 4K monitor comparison is having eyes so old that I can't tell the difference any more. Makes any decision so much easier . . .
Well... I'm stuck at 2560 x 1440 because my pc's in the living room, my monitor is 27 inches and won't get any bigger because it would then 'clash too much with the decor' according to my significant other. My pc could take 4k easily, but...
Went up to 1440p some 3 years ago and can't for the life of me imagine living with merely a 1080p screen now. It's just too old and poor a resolution to look at.
I believe you are approaching this just backwards. Same-sized images at higher resolutions (higher ppi) help the brain interpret the visual information. So a person suffering from presbyopia (like me), or other eyesight problems, really wants the best resolution that any given monitor size can offer.
I believe you didn't recognize it as a joke.
I am a serious person.
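For anyone who wants to put numbers on the ppi point above, here's a quick sketch using the standard diagonal pixel-density formula; the 27-inch panel size is just an example for illustration, not something anyone here specified.

import math

def ppi(width_px, height_px, diagonal_inches):
    # Pixels per inch = diagonal resolution in pixels / diagonal size in inches
    return math.hypot(width_px, height_px) / diagonal_inches

for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}.items():
    print(f"{name} on a 27-inch panel: {ppi(w, h, 27):.0f} ppi")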
I've used a 4K monitor on my iMac for years and years, and regularly play games at the highest resolution/settings my machines can handle. I have, however, found that after years on 4K I can't see much difference between 1080 and 1440 on my PC, which is why I splurged on a 4K monitor a few years ago for that machine as well.
The only question left for me is whether I will feel it necessary to upgrade the PC for Civ VII, or whether I will be able to wait until Anno 117 comes out much later in 2025. My experience with Anno 1800, the previous game in that series, suggests it will be an absolute CPU/RAM hog, so even if my GPU can handle it, I will probably have to upgrade at launch or soon afterwards.