Anyone Else Buying New PC for Civ7?

Nvidia's new card:

Nvidia RTX 5000 Rumors Picking Up Speed as Leaks Claim to Reveal Price and Specs

You may need to empty the piggy bank to get Nvidia's next graphics card.


The Nvidia GeForce RTX 5080 is looming on the horizon, but while we don't have official word on next-generation graphics cards, rumors are swirling about specs and pricing.

Take this with a grain of salt, but right now the Nvidia RTX 5090 is rumored to cost between $1,999 and $2,499, according to popular leaker Moore's Law is Dead. The video also suggests the RTX 5080 may cost as much as $1,499. Compared to the RTX 4090 and RTX 4080, which cost $1,599 and $1,199 at launch, respectively, this would be a massive price jump if these leaks are true.

For the RTX 5090 at least, this potentially high pricing could be due to inflated specs. The RTX 5090 is rumored to be a 600W part with 21,760 CUDA cores with 32GB of GDDR7 memory on a 512-bit bus, according to hardware leaker kopite7kimi on Twitter. If accurate, the RTX 5090 would be the first GPU to hit the market with GDDR7 memory, which would be much faster than the GDDR6X memory found on the RTX 4090.
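For a back-of-the-envelope sense of what that change would mean, peak memory bandwidth is simply the per-pin data rate times the bus width. Note the 28 Gbps GDDR7 speed below is an assumption for illustration; only the 512-bit bus comes from the leak.

```python
# Peak memory bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8.
# The 28 Gbps GDDR7 figure is an assumed speed; only the 512-bit bus is from the leak.
def peak_bandwidth_gbs(gbps_per_pin: float, bus_width_bits: int) -> float:
    return gbps_per_pin * bus_width_bits / 8

print(peak_bandwidth_gbs(21, 384))  # RTX 4090, GDDR6X: 1008.0 GB/s (matches its official spec)
print(peak_bandwidth_gbs(28, 512))  # rumored RTX 5090, GDDR7 (assumed speed): 1792.0 GB/s
```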


We still don't know when Nvidia is going to grace us with the RTX 5090's presence, but it's increasingly likely that Nvidia's next graphics cards will be launching early next year. After all, Nvidia CEO Jensen Huang is hosting the CES 2025 keynote, giving Team Green ample opportunity to announce a new generation of gaming graphics cards – a first for CES.

I've reached out to Nvidia for comment, and I'll update this story if and when I hear back.

Could the RTX 5090 really cost $2,499?

Nvidia's top GPUs have been getting more expensive with each generation, so it wouldn't be a huge surprise if the RTX 5090 costs more than the RTX 4090. However, if the top-end of the leaked price range is accurate, it would mean nearly a $1,000 jump in price for the next-generation enthusiast graphics card.
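For perspective, here is the generational jump those leaked figures would imply, using the launch MSRPs quoted above (a quick sanity check on the rumors, nothing more):

```python
# Implied generational price jumps, using the launch MSRPs quoted in the article.
rtx_4090_msrp, rtx_4080_msrp = 1599, 1199

for label, old, new in [
    ("RTX 5090 (low leak)", rtx_4090_msrp, 1999),
    ("RTX 5090 (high leak)", rtx_4090_msrp, 2499),
    ("RTX 5080 (leak)", rtx_4080_msrp, 1499),
]:
    print(f"{label}: +${new - old} ({(new - old) / old:.0%} over the previous gen)")
```

The high end of the leaked range works out to a $900, roughly 56% increase over the RTX 4090's launch price.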

Assuming the rumors are accurate, that price will get you both more and faster memory than what's on the RTX 4090, which would be a massive boost to performance. Plus, the RTX 5090 would also feature all the architectural improvements made to Blackwell when it launched for data centers earlier this year.


There's more of a problem when it comes to lower-end graphics cards, however. That Moore's Law is Dead video cited an internal Nvidia briefing – via an anonymous source – claiming the RTX 5070 would be competitive with the RTX 4070 Ti. Right now the leaked pricing suggests the RTX 5070 may launch for $699 with 12GB of VRAM. That's $100 more expensive than the RTX 4070 Super, which also comes with 12GB of VRAM.

Keep in mind that if Nvidia really is saving the RTX 5090 for CES, rather than the fall release window it usually targets, these leaks are very much subject to change. CES 2025 is still three months away, and a lot can change, especially when it comes to pricing.
 
So my PC is relatively new outside of the old 1070 I was gifted. I'll definitely need to replace it at some point, but I'm very glad it should still run Civ 7, as I won't have money for that for a good while.
 
As this thread has some people whose knowledge goes far deeper than mine, can I ask for some advice?

What's the best CPU (~<200€) to go with my RX 6650 XT? I upgraded to that card last year (along with a new M.2 SSD), and it's time to do the same for my ancient i5 6600K. It's not even just for Civ; it's struggling even to run Unity at a reasonable speed these days. Looking around, it seems like my best options are the 5700X3D or the 7600X. The latter is better generally (although it will require new and more expensive RAM too), but the former is best for gaming, so it's what I'm leaning towards. My main worry is that the gaming advantage it provides will be bottlenecked by the GPU anyway. Is that likely to be the case? Or will I get better gaming performance with the 5700X3D over the 7600X even with an RX 6650 XT? CPU benchmarks always seem to use a far better GPU, so it's hard to judge which combos complement each other well.

On top of that, any recommendations for motherboards? That's something I'm also truly ignorant about. Is there any reason not to go with the cheapest one that's compatible with what I'll put on it? Looking around, I think that's the B550 series for the 5700X3D and the B650 for the 7600X; is that correct?
Your exact comparison is very hard to analyze, but the bottom line is that you are GPU-limited in many games, exactly as you said.
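As a very rough mental model (with invented numbers, purely to illustrate the point): the frame rate you see is capped by whichever component runs out of headroom first.

```python
# Toy bottleneck model: the displayed frame rate is capped by the slower stage.
# Both FPS caps below are invented numbers, purely to illustrate the point.
def effective_fps(cpu_fps_cap: float, gpu_fps_cap: float) -> float:
    return min(cpu_fps_cap, gpu_fps_cap)

# If the RX 6650 XT tops out at ~70 FPS in a given game at your settings,
# a 90 FPS-capable CPU and a 140 FPS-capable CPU deliver the same result:
print(effective_fps(90, 70))   # 70
print(effective_fps(140, 70))  # 70 -- the faster CPU's advantage is wasted here
```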

However, there are scenarios where you benefit from the CPU.
The 5700X3D has 8 cores and the 7600X has 6. Civilization VI could make good use of extra cores for turn calculations, and Civ 7 probably will even more. But then the 7600X runs at higher clocks, which again improves the same things.
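To see why both core count and clock speed matter, here's a quick Amdahl's-law sketch. The 60% parallel fraction is purely an assumption (nobody outside Firaxis knows the real number), and the boost clocks are approximate:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel fraction.
# The 60% parallel fraction is an assumption; boost clocks are approximate.
def relative_turn_speed(p: float, cores: int, boost_ghz: float) -> float:
    speedup = 1 / ((1 - p) + p / cores)
    return speedup * boost_ghz  # clock used as a crude proxy for per-core speed

print(relative_turn_speed(0.6, cores=8, boost_ghz=4.1))  # 5700X3D-ish: ~8.6
print(relative_turn_speed(0.6, cores=6, boost_ghz=5.3))  # 7600X-ish:   ~10.6
```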

Then there are games with heavy reliance on memory bandwidth. It's hard to get the 5700X3D performing well with DDR4; faster memory clocks increase internal bus speeds and make the whole system faster. With the 7600X it's easy to find reasonably fast DDR5 and run it at reasonable speeds.
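The raw numbers behind that difference (theoretical dual-channel peaks; the specific kit speeds are just typical examples):

```python
# Theoretical dual-channel bandwidth (GB/s) = transfer rate (MT/s) * 8 bytes wide * 2 channels / 1000.
def dual_channel_gbs(mega_transfers: int) -> float:
    return mega_transfers * 8 * 2 / 1000

print(dual_channel_gbs(3200))  # DDR4-3200 on AM4: 51.2 GB/s
print(dual_channel_gbs(6000))  # DDR5-6000 on AM5: 96.0 GB/s (a typical EXPO kit)
```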

Then again, it's the 3D V-Cache that helps enormously with memory accesses and makes the 5700X3D a good gaming platform despite the DDR4 handicap.

But as a personal opinion: AM4 with the 5700X3D is a dying platform, while AM5 with the 7600X has years left. On that platform you'll keep the option of upgrading to a fast or very fast CPU later on.

So get an MSI B650 Tomahawk and a 7600X plus good AMD EXPO-supported DDR5. Upgrade your GPU at some point, and maybe the CPU later.
 
Thanks for the information!

I'm really not thinking of upgrading for 5 years or more at least, so I'm not sure having an AM5 platform is all that helpful (it might have moved on to AM6 by that point). I should clarify that I'm not chasing 120 fps at 4K resolution on max settings here. It's enough for me to run current releases at 60 fps on my 1080p monitor at highish settings, and to be able to play new releases (at minimum settings and 30 fps) as they come out. Even with my ancient CPU, the only game I've really struggled with at a hardware level is Jedi: Survivor, which stutters a lot in some specific places. Things like Elden Ring or Cyberpunk already run as smoothly as I'd like. So I'm only really upgrading now because November is a good time of year to do so, and I think I might be completely unable to run things coming out in the next year.

Do my very humble goals change your recommendations at all?
On the purely technical comparison, the recommendation does not change: the 7600X will be the better CPU for games overall.

The only thing that could change it is a very tight budget, where we would start sourcing the cheapest possible (even pre-owned) parts to make a decent computer. In that case the AM4 platform could win.
 
I have a hard time understanding how so many people are still playing on 1080p. It feels absolutely ancient to me.
 
57% of gamers are, and the share is increasing, per the Steam hardware survey.

I suspect that while the step down from 1440p might be very noticeable to you, when you stay at 1080p you simply don't notice what you're missing. That's the problem with upgrading (not just PCs, but much of life in general): habituation. You spend a bunch of cash, appreciate the improvement for a short while, then it becomes the new normal and the old way looks bad.
Even 1440p feels dated to me; I can never go back from 4k.

But judging by the results of the Steam survey you posted, only 3.9% of people are using 4k on their primary display. That's a lot lower than I expected.

Go adopt 4k everyone! And get an OLED display while you're at it. See the light! :bowdown:
 
Even 1440p feels dated to me; I can never go back from 4k.

But judging by the results of the Steam survey you posted, only 3.9% of people are using 4k on their primary display. That's a lot lower than I expected.

Go adopt 4k everyone! And get an OLED display while you're at it. See the light! :bowdown:
4K is outdated. Dual UHD wide screen is the way.

 
Is there any reason not to go with the cheapest one that's compatible with what I'll put on it? Looking around, I think that's the B550 series for the 5700X3D and the B650 for the 7600X; is that correct?
From a platform standpoint, the reason to go with the higher-end X series would be if you were planning to build a PC with lots of expansion (PCIe) cards, and IIRC many of the X series boards have more/faster M.2 slots for SSDs as well. But for the average person, no, it wouldn't make a difference at all in how you play Civ.

Technically there are lower-end A520 and A620 platforms (for AM4 and AM5 respectively). These support fewer peripherals (USB, SATA), and drop multi-GPU (Crossfire) support, which virtually no one uses these days. But the savings over the B series aren't great, so there aren't many A-series motherboards out there.
Even 1440p feels dated to me; I can never go back from 4k.

But judging by the results of the Steam survey you posted, only 3.9% of people are using 4k on their primary display. That's a lot lower than I expected.

Go adopt 4k everyone! And get an OLED display while you're at it. See the light! :bowdown:
I've been on 1200p for close to 13 years because no one would make a 16:10 27" monitor, which was the next logical upgrade from 16:10 24", and the 16:10 30" options (at 2560x1600, which would be a real upgrade over 1920x1200) were $1000. It's still hard to find 16:10 upgrade options, and at a certain point I quit bothering to look for them. Though I've considered adding a second 16:10 24" monitor occasionally.

I did have a 5K monitor at work pre-pandemic. It was all right I suppose. But I wasn't doing photo editing or gaming or anything particularly intensive on it, just writing code, and it turns out that 100 dot-per-inch font is just as legible to me as 225 dot-per-inch font.
 
I have a hard time understanding how so many people are still playing on 1080p. It feels absolutely ancient to me.
Still in 1080p just because it's always felt sufficient to me and I've never known anything else 😅 I have been hemming and hawing about picking up a bigger monitor for months now, though, so might upgrade the resolution as well if I do so.
 
4K is outdated. Dual UHD wide screen is the way.


Nice. I used a triple-screen setup for a very long time; I've been on a 49" dual-QHD for a few years now and would never go back. That beast looks like the next step.

But while it's absolutely fantastic for various simulations, FPS, or RPGs, for 4X a normal screen is enough IMHO; ultrawide can be a pain for some UIs. OTOH it's great for modding when playing windowed with tools on the left/right (as you'd do on a multi-screen setup).
 
Still in 1080p just because it's always felt sufficient to me and I've never known anything else 😅 I have been hemming and hawing about picking up a bigger monitor for months now, though, so might upgrade the resolution as well if I do so.
What really helps me in the 1080 - 1440 - 4K monitor comparison is having eyes so old that I can't tell the difference any more.

Makes any decision so much easier . . .
 
Even 1440p feels dated to me; I can never go back from 4k.

But judging by the results of the Steam survey you posted, only 3.9% of people are using 4k on their primary display. That's a lot lower than I expected.

Go adopt 4k everyone! And get an OLED display while you're at it. See the light! :bowdown:
Well... I'm stuck at 2560 x 1440 because my PC's in the living room; my monitor is 27 inches and won't get any bigger because it would 'clash too much with the decor' according to my significant other :dunno: My PC could take 4K easily, but we do make sacrifices for peace's sake :hatsoff: I still feel the image quality I get is quite sufficient anyway!
 
Still in 1080p just because it's always felt sufficient to me and I've never known anything else 😅 I have been hemming and hawing about picking up a bigger monitor for months now, though, so might upgrade the resolution as well if I do so.
Went up to 1440p some 3 years ago and can't for the life of me imagine living with merely a 1080p screen now. :D It's just too old and low a resolution to look at.

Then again, I somehow ended up with not 1, not 2, but 3 screens. :D
 
What really helps me in the 1080 - 1440 - 4K monitor comparison is having eyes so old that I can't tell the difference any more.

Makes any decision so much easier . . .
I believe you are approaching this exactly backwards. Same-sized images rendered at higher resolutions (higher PPI) help the brain interpret visual information. So a person suffering from presbyopia (like me), or from other eyesight problems, really wants the highest resolution a given monitor size can offer.
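For reference, pixel density is straightforward to compute for any monitor you're considering (standard geometry; the example sizes are arbitrary):

```python
import math

# Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

print(f"27-inch 1080p: {ppi(1920, 1080, 27):.0f} PPI")  # ~82
print(f"27-inch 1440p: {ppi(2560, 1440, 27):.0f} PPI")  # ~109
print(f"27-inch 4K:    {ppi(3840, 2160, 27):.0f} PPI")  # ~163
```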
 
I believe you are approaching this exactly backwards. Same-sized images rendered at higher resolutions (higher PPI) help the brain interpret visual information. So a person suffering from presbyopia (like me), or from other eyesight problems, really wants the highest resolution a given monitor size can offer.
I believe you didn't recognize it as a joke.

I've used a 4K monitor on my iMac for years and years, and regularly play games at the highest resolution/settings my machines can handle. I have, however, found that after years on 4K I can't see much difference between 1080 and 1440 on my PC, which is why I splurged on a 4K monitor a few years ago for that machine as well.

The only question left for me is whether I will feel it necessary to upgrade the PC for Civ VII, or will I be able to wait until Anno 117 comes out much later in 2025: my experience with Anno 1800, the previous game in that series, is that it will be an absolute CPU/RAM hog so that even if my GPU can handle it, I will probably have to upgrade at launch or soon afterwards.
 
I believe you didn't recognize it as a joke.

I've used a 4K monitor on my iMac for years and years, and regularly play games at the highest resolution/settings my machines can handle. I have, however, found that after years on 4K I can't see much difference between 1080 and 1440 on my PC, which is why I splurged on a 4K monitor a few years ago for that machine as well.

The only question left for me is whether I will feel it necessary to upgrade the PC for Civ VII, or will I be able to wait until Anno 117 comes out much later in 2025: my experience with Anno 1800, the previous game in that series, is that it will be an absolute CPU/RAM hog so that even if my GPU can handle it, I will probably have to upgrade at launch or soon afterwards.
I am a serious person.

Waiting for Anno 117 should serve you well if GPUs are of any interest at that time. In early 2025 Nvidia is expected to come out with new high-end cards, moving toward the mid-range during the year. AMD is expected to "strike" in the mid-range, as they can't compete at the high end. Then Intel is expected to be on the market with the new Arc B series, but it's a mystery where it lands and how it performs. Still, the most competition in the mid-range in many years.

The CPU side will be more stagnant. Nothing new is coming except high-end AMD >8-core 3D-cache gaming CPUs. Unless Intel tries a price war, as its new Core Ultra series was not very good in gaming.
 