Civ 6 Multicore performance

Status
Not open for further replies.
As for hyperthreading

Negative, Ghost Rider... Hyper-threading is the illusion of an additional core presented to the OS. The only time hyper-threading comes close to doubling speed is when two operations are in flight at the same time that use different parts of the core. The original premise of hyper-threading was that Intel found only about 60-80% of the core was in use at any one time, so fooling the OS into seeing extra cores was a way to put that idle capacity to work.

If hyper-threading were really that good I would have spent the extra $100 on the 4770K, but it's not worth it. In fact I specifically didn't get the hyper-threading model because of issues with stuttering in gaming. Also, I suggest you look up hyper-threading for gaming. I can list titles with known performance degradation when hyper-threading is turned on, if you want a list.

That's not how hyper-threading works at all. A hyper-threaded core can run two threads at once. Any time one is stalled the other can be active. And threads are constantly stalled waiting for memory reads/writes. So for most games, two threads can run on one hyper-threaded core almost as fast as they could run on two cores. This is a fact, not an opinion.
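The stall-overlap argument can be illustrated with a toy Python sketch (my own illustration, not anything from the games discussed): `time.sleep` stands in for a memory stall, and because a stalled thread gives up the core, two mostly-stalled threads finish in roughly half the serial time — the same effect a hyper-threaded core exploits.

```python
import threading
import time

def stalled_worker(stall_s: float, iters: int) -> None:
    # Each "iteration" mostly waits, standing in for a thread
    # stalled on memory reads/writes.
    for _ in range(iters):
        time.sleep(stall_s)

# Two workers running concurrently on shared hardware...
start = time.perf_counter()
threads = [threading.Thread(target=stalled_worker, args=(0.01, 20)) for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()
concurrent = time.perf_counter() - start

# ...versus the same two workers run back to back.
start = time.perf_counter()
for _ in range(2):
    stalled_worker(0.01, 20)
serial = time.perf_counter() - start

# Because the threads spend most of their time stalled, running them
# together takes about half the serial time.
print(f"serial: {serial:.2f}s, concurrent: {concurrent:.2f}s")
```

If the workers were pure computation instead of stalls, the overlap would shrink — which is exactly why HT helps stall-heavy workloads most.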

And by the way, that list of titles is entirely made up of games that don't run enough threads to take advantage of hyper-threading for the systems in question.

Yes, a CPU processes a single thread 2-3% faster with hyper-threading turned off. So if you have a game which only really takes advantage of 4 cores, you're better off turning off the logical cores that aren't being used.

To be fair, *most games* don't run enough threads to take advantage of all 8 hyper-threads on a 4-core. Yet.

But this is changing as we speak, primarily due to the PS4 and XB1 giving developers access to 6-7 logical cores. Developers are starting to support multithreaded rendering, and in general, game engines are being developed with parallelism in mind.

There are a lot of examples. Tomb Raider relies heavily on multithreading, and so does Civ6. The fact that the game runs 30 threads is strong evidence that they're headed in that direction.
 
True. You have to be selective about what information you're ok with being out-of-date. But games are merely simulations, and the only thing that matters is the approximation of intelligence, so there's some leeway.

Simultaneous decision-making with correction is perfectly valid. Players do it all the time. You look at the cards in your hand when you draw and start planning your next action. By the time your turn comes around, some things may have occurred that will change your final decision, but parts of your internal decision tree are still valid. Hence the correction element. Just because events have invalidated parts of the tree doesn't mean I have to recalculate everything.
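That "correction element" can be sketched as a tiny plan cache (all names here are mine, purely illustrative — nothing from Civ's actual code): each planned sub-decision records which state facts it read, and when an event changes a fact, only the dependent plans are thrown away and recomputed.

```python
# Toy sketch of plan-with-partial-invalidation, under assumed names.
compute_calls = 0

def evaluate(decision: str, state: dict) -> tuple:
    """Compute a sub-decision; return (choice, facts it depended on)."""
    global compute_calls
    compute_calls += 1
    if decision == "build":
        return ("settler" if state["gold"] > 100 else "warrior", {"gold"})
    return ("advance" if state["enemy_near"] else "fortify", {"enemy_near"})

cache: dict = {}

def plan(decision: str, state: dict) -> str:
    if decision not in cache:
        cache[decision] = evaluate(decision, state)
    return cache[decision][0]

def apply_event(changed_fact: str) -> None:
    # Drop only the cached plans that read the changed fact.
    for d in [d for d, (_, deps) in cache.items() if changed_fact in deps]:
        del cache[d]

state = {"gold": 150, "enemy_near": False}
plan("build", state)
plan("move", state)            # 2 evaluations so far
apply_event("enemy_near")      # invalidates only the "move" plan
state["enemy_near"] = True
plan("build", state)           # still cached
plan("move", state)            # recomputed
print(compute_calls)  # 3, not 4: the "build" plan survived the event
```

The point is just that invalidation is selective: the event forced one recomputation, not a full replan.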

This would be true if Civ used a stochastic method to process the End Turn step, but it doesn't. I would imagine this is in large part because of how they wanted multiplayer to work. In the current, deterministic engine, each player in a multiplayer game computes the End Turn locally, and the clients only compare hashes afterward to check for desync. In this model it is therefore critical that any given input to the engine produces a predictable (pre-determined) outcome. This is accomplished by simulating randomness with a constant seed fed through the engine: only when one action completes does the next "random" number become known to the engine for use in the following action. What you are suggesting likely isn't possible without a game engine of a completely different design, which would require a centralised server/client architecture rather than a peer-to-peer one.
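A minimal sketch of that deterministic-lockstep idea (function names and the toy "simulation" are mine, not Civ's): every peer runs the same computation from the same seed, then compares a hash of the resulting state rather than the state itself.

```python
import hashlib
import random

def simulate_end_turn(seed: int, actions: list) -> list:
    # Fixed seed: the "random" numbers come out in a reproducible order,
    # one per action, just as the deterministic model requires.
    rng = random.Random(seed)
    state = []
    for action in actions:
        state.append(rng.randrange(100) + len(action))  # toy state update
    return state

def state_hash(state: list) -> str:
    # Peers exchange this digest instead of the full game state.
    return hashlib.sha256(repr(state).encode()).hexdigest()

actions = ["move_unit", "attack", "found_city"]
peer_a = state_hash(simulate_end_turn(42, actions))
peer_b = state_hash(simulate_end_turn(42, actions))

# A peer whose inputs diverged produces a different state -> desync detected.
desynced = state_hash(simulate_end_turn(42, ["move_unit", "attack", "pillage"]))

print(peer_a == peer_b, peer_a == desynced)  # True False
```

Identical seed plus identical inputs gives identical hashes; any divergence in inputs shows up as a hash mismatch, which is all the engine needs to flag a desync.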
 

If this is true it is rather unfortunate, given that (personally) I find multiplayer Civ incredibly annoying. I'd *much* rather have a good AI opponent and play on my own schedule. But that's just me.

The only multiplayer variant of Civ I liked was Civ 4's simultaneous turns mode, because it removed the tedium of waiting for your turn. Which would actually map really well to this, ironically.
 
BTW, since the subject of Skylake vs Westmere came up, I went and found a benchmark online for the i7 6700k:

(Online benchmark) Skylake 4.5GHz 6-core 12HT: 12.5ms (34ms 99th percentile)
(My benchmark) Westmere 4.4GHz 6-core 12HT: 12.4ms (25ms 99th percentile)

I'm actually surprised the Skylake didn't at least have a slight advantage. It could just be within the margin of error. More likely it's the Westmere's triple-channel RAM outperforming the Skylake's dual-channel memory.
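A back-of-envelope check on the memory theory (assuming DDR3-1333 on the X58/Westmere board and DDR4-2133 on the Skylake — both assumptions, actual kits vary): peak bandwidth is channels × 8 bytes per transfer × transfer rate. The peaks come out close, so any real Westmere edge would be more about latency or sustained behavior than raw peak numbers.

```python
def peak_gb_s(channels: int, mega_transfers: int) -> float:
    # channels * 8 bytes/transfer (64-bit bus per channel) * MT/s
    return channels * 8 * mega_transfers * 1e6 / 1e9

westmere = peak_gb_s(3, 1333)   # triple-channel DDR3-1333
skylake = peak_gb_s(2, 2133)    # dual-channel DDR4-2133
print(f"Westmere ~{westmere:.1f} GB/s, Skylake ~{skylake:.1f} GB/s")
# Westmere ~32.0 GB/s, Skylake ~34.1 GB/s
```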

So there's your moment of zen. A 6700K costs $340 on Newegg right now. A Westmere runs $110. ;-)

EDIT: Correction, my X5670 was $110 when I bought it this summer. Since then the price has skyrocketed. $220 right now on NewEgg!

Wow, a 2010 "Potato" CPU doubling in price in the last six months..... hmmmmm. ;)

Nothing like a Q1 2010 CPU matching a Q3 2015 CPU. This would be why Intel started blocking overclocking on Xeons. :P
 

That's very interesting, but in games does it perform like a 1st-gen i7 or a current-gen one?

People I've spoken to tend to think that unless you have specific non-gaming uses, Xeons are kind of a waste. I have no experience with them myself.
 

Honestly I got the Xeon primarily because I compile on my gaming rig. But I've noticed the difference when gaming. The extra cores and OC potential improved hitching in both Doom and RotTR.

(Westmere was 32nm vs Nehalem's 45nm, so the reduction in TDP increased OC headroom.)

Honestly, I was happy with the CPU performance of my old i7-920 until this year when I upgraded my GPU. A 4GHz (OC) 4-core 8HT chip is still perfectly valid.

But after the GPU upgrade I noticed my CPU was keeping me from holding a smooth 60 FPS in games. Prior to that I had been content with 30 FPS.

So far all the games I've played on the Westmere run at a smooth 60. But that list is short: Fallout 4, Doom, Rise of the Tomb Raider, Civ6, Divinity: Original Sin. I can't speak for games I haven't tried playing.

I'm running a GTX 1070 at 2560x1440 for all the games mentioned.
 
Well I don't have time to keep trying to teach you anything.

Go look it up. As for the other patrons here...

Let's begin.

Alan Wake American Nightmare



There is no difference between i7 mode and i5 mode.

Arma 3



There is no difference between i7 mode and i5 mode in the most demanding FPS I've ever tested.

Batman Arkham Origins



I've deleted the maximum-FPS bar, since it was pulling over 300 and was irrelevant.

Battlefield 4



There is almost no difference between i7 mode and i5 mode.

Bioshock Infinite



There is no difference between i7 mode and i5 mode.

Call of Duty Advanced Warfare




HT slightly decreases performance.

Company of Heroes 2



HT clearly hurts performance in this very demanding RTS.

Crysis 3



There is quite a notable performance drop with HT on.

Dragon Age Inquisition



HT hurts minimum frame rate performance, while average and maximum remain the same.

F1 2015



HT slightly decreases performance.

Far Cry 4



There is no difference between i7 mode and i5 mode.

Hard Reset



Once again HT hurts minimum frame rate performance, while average and maximum remain comparable.

Hitman Absolution



HT only improves maximum frame rate performance in this game - the same pattern was observed with Core i7 3770.

Max Payne 3



HT slightly decreases performance.

Metro Last Light Redux



Once again HT hurts minimum frame rate performance the most in this demanding game - the same pattern was observed with the Core i7 3770.

Rainbow Six Siege



There is a very small decrease in performance with HT on.

Serious Sam 3



HT decreases performance slightly.

Starcraft 2 Legacy of the Void



HT decreases performance slightly, but consistently.

Tomb Raider



Like in Hitman Absolution, HT slightly increases maximum frame rates.

Watch Dogs



The performance drop with HT in this game is just too big to justify Core i7 over Core i5.

Witcher 3 Wild Hunt



Let's call it a draw.

-------------------------------------------------------------------

I've run these benchmarks 5 times in a row and they are as real as you can get.

CONCLUSIONS

Ever since I got my first Nehalem Core i7 920, I've noticed no performance improvement in games with hyper-threading turned on. I have "cementified" these observations by testing my Core i7 3770, and now I've done the same with a Core i7 6700K at my friend's place. It's a pattern that has continued for 6 years now... HT is not worthless in games, however - it delivers awesome performance in Core i3 processors, just not in Core i7. HT might also significantly improve online game performance, but that is not my domain.

1. Intel Core i7 HT offers no improvement in single player games.

2. Intel Core i7 HT slightly hurts gaming performance in most of the tested single player games.

I wish HT would improve gaming performance, but it actually hurts! It's a big disappointment, and my friend made a mistake by replacing his Core i5 3570K with a Core i7 6700K, because all he does is game, and nothing more.
 

I thought this issue was partly due to those AAA games being designed mostly for consoles with weaker and fewer cores (until recently - and even now the multicore consoles on the market have relatively weaksauce individual cores), and/or designed for the lowest-common-denominator rigs, which were 2-core machines. I think 2 cores still barely edges out 4 physical cores as the most common processor type on Steam.

Civ6 having Core i5 4th gen as a recommended spec was surprising to a lot of people.
 
Yes, you're right in that respect, as some developers have not made strides to let their games benefit from more cores. I also figure that if you design a game to use, say, 6 cores, then anyone with a 4-core system could see reduced performance.

The Xbox 360 had a triple-core processor, so developers could use all 3, but console-to-PC ports are trashy at best. They have loads of performance issues and bugs.
 
Well I don't have time to keep trying to teach you anything.

Go look it up. As for the other patrons here...

Wow, that's incredibly rude. Luckily, I don't need you to teach me anything... because my *actual job* is optimizing games for multi-core architectures. Now of course, we all have things we can learn, myself included, but having done it for years, I feel I can say without arrogance that this is something I'm both very knowledgeable about and quite good at.

I should probably assume from your tone that there's no point in me trying to argue this further, but I will try anyway, because, well, you never know.

You posted a long list of titles that don't benefit from hyperthreading, and actually perform worse with it. I do not dispute that this is the case. But none of those games are well-optimized for large core counts, so are, by definition, not hyperthread-friendly.

Here is a benchmark that actually demonstrates the effect of hyperthreading:
[Chart: Ashes of the Singularity Beta 2 - DX12 average CPU frame rate vs. core count, high quality, 1920x1080]


As you can see, once a certain number of cores (8) is reached, there is no benefit from hyper-threading, because there is no more work to be distributed, and lo and behold, the performance is worse with hyperthreading turned on.

But at every other core count, the performance is *significantly* better, tapering off somewhat at 6 cores, likely because there is a bottleneck thread.

At 8 cores, that bottleneck thread is holding everything back.

For all the games you've listed (without exception) there is a bottleneck thread holding the game back from better performance *at the core count tested*.

The simplest example of this is a 1-thread game. It *cannot* perform better on a 2-core, or on a 1-core /w hyperthreading. However, a 2-thread game is *guaranteed* to perform better on a 1-core w/ hyperthreading. (Assuming both threads do a significant amount of work)

The reality of most games is that there is a single thread that is by far the most expensive. Once you expose that thread (put it on a core by itself) it's impossible for performance to improve with more cores, or with hyper-threading.
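That bottleneck effect is easy to model numerically (a toy model of my own, not a real profiler): treat each thread as a fixed cost and pack threads onto cores greedily. Total frame time can never drop below the cost of the most expensive thread, no matter how many cores you add.

```python
def frame_time(thread_costs: list, cores: int) -> float:
    # Greedy longest-first assignment of threads to cores; the frame
    # is done when the most-loaded core finishes.
    loads = [0.0] * cores
    for cost in sorted(thread_costs, reverse=True):
        loads[loads.index(min(loads))] += cost
    return max(loads)

game = [10.0, 2.0, 2.0, 2.0]    # ms: one dominant thread + three helpers
print(frame_time(game, 1))      # 16.0 -- everything on one core
print(frame_time(game, 2))      # 10.0 -- the bottleneck thread is now exposed
print(frame_time(game, 4))      # 10.0 -- extra cores can't help at all
```

Once the 10 ms thread sits on a core by itself, the model is pinned at 10 ms - which is why splitting that thread (Main/Render, physics, etc.) is the only way more cores or hyper-threads ever pay off.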

With the advent of the X360 & PS3 (a 3-core and a 1-core respectively, both with hyper-threading), developers started to really focus on splitting that bottleneck thread into two threads. (Main & Render)

I won't get into the mess that was PS3 development, but 360 developers often went a step further, adding a physics thread, a particle thread, a network thread, and there are a large number of games that utilize all the hyperthreads on the 360.

It is a fact that 360 games which fully utilized all available hyper-threads would not have run anywhere near as fast with hyper-threading turned off.

With the advent of the XB1 and PS4, even more titles started targeting 6 cores. Neither of those platforms is hyper-threaded, because AMD is super-late to the party. Their first chip with SMT (their version of hyper-threading), Zen, will launch in Q1 2017.

AMD's decision not to go with hyper-threading almost put them out of business. But go ahead, believe it doesn't help. :p

The reason I keep mentioning consoles is that multi-platform games get the most optimization attention on console, simply because it is usually the worst performer. If the CPU performance of a title is worse on a person's PC than on console, it's a *really* crappy CPU.

So, XB1 and PS4 have pushed more and more games to be 6-core friendly. However, rendering thread performance is almost always the bottleneck. So devs started focusing on multi-threaded rendering.

The push for this was driven largely by console, so naturally what you see is that most games don't benefit from more than 6 logical cores. What this means is that a 3-core w/ Hyperthreading is about the most that a lot of games really need. Also, a lot of developers didn't bother to make the PC version multithreaded at all, because the XB1 and PS4 are so underpowered relative to a modern PC.

Which is why an i7-920 is still relevant. It is comparable to, if not faster than, an XB1 or PS4, and has more logical cores available to developers. So no one has been targeting anything better.

To recap: there is one simple reason why games often don't benefit from hyper-threading. They're usually poorly multi-threaded. One thread is usually such a bottleneck that it runs longer than all the other threads combined, meaning that a 2-core machine without hyper-threading is all you need.

But that is changing as we speak.

DX12 in particular is changing that. DX11 helped. Civ6, even with only 2 render threads (I assume) in DX11 mode, is already one of the most multi-core friendly games ever made (if the benchmark tool can be believed) and it still has a lot of room for improvement, as I mentioned earlier in the thread. This trend will only continue.

So, yes, unfortunately, most existing games don't benefit much from large core counts, whether they be logical (Hyper-thread) or physical cores. But that is because the game itself was not optimized.

Every game that is multi-core friendly benefits dramatically from hyper-threading until a certain core count is reached.

This is not an opinion. This is a fact.

However, although your conclusions were wrong, we actually agree on many things you said. As I mentioned, I only switched from an i7-920 this year, because this was the first year that having more than 8 logical cores helped *any game*. Civ Beyond Earth didn't, despite using Mantle. It only really used 6 logical cores if I recall. Rise of the Tomb Raider (DX12) and Doom are the first games that I've seen benefit from it.

Not that many games took advantage of Mantle; Doom is really the first optimized Vulkan game, and RotTR was one of the first optimized DX12 games (AotS being the other obvious one).

So, yes, in the past, it was often true that turning on hyper-threading hurt performance, but only because those games didn't need the extra cores. But, going forward, this is changing. You'll see soon enough. :)

It's a chicken-and-the-egg thing. Developers won't target 8-core/16HT machines as long as people who have them are in the minority. People won't buy them until they help performance. Hence, XB1 and PS4 have largely driven multi-core utilization.

But right now, this is changing. Now that *most* PC owners have 8+ logical cores, it will quickly become a competitive disadvantage for a title to not take advantage of them. Now that Windows 10 is starting to take over, DX12 has enough momentum to justify developers targeting it.

Now, if you're still not convinced, I'll do some benchmarking when I have time, with my 12-core/24HT, with a variety of configurations (# of cores enabled, hyper-threading enabled, etc.)

I'll even make a little chart, showing definite evidence of just exactly how much hyper-threading (at each core count) helps Civ6 performance.

Cheers,
Cro
 
Love the Top Gun reference. Considering the "Need for Speed" nature of this thread, I must say: ironically well done :thumbsup:
 
Now that *most* PC owners have 8+ logical cores

It wasn't until October 2016 that most PC owners had more quad-core processors than dual-core... Where are you getting your information from? The non-hyper-threading i5 4690K is still the most popular processor in builds today, but the most popular purchased processor is the i7 6700K. Remember though that we're talking about purchases, not entire builds, so the people buying it as a part are likely people like you and me.

A hyper-thread is a logical core. At best I'd say you get a 30% boost under ideal conditions. I would rather have 8 physical cores than 4 cores plus 4 hyper-threads. On another note, 64-bit has been around for 12 years in consumer-grade processors, but gaming is only now starting to take advantage of it, even though the consoles had 64-bit support. Really, I think developers are holding themselves back. I believe they should make games that push people to buy better hardware, but within reason. In the same breath, though, processor speed hasn't really increased since the days of your processor. Look up Skylake vs Haswell; it makes me sad.

Also, you post stats about a game that most PC gamers can't run maxed out. Ashes is just a game with no limits on what can be done with cutting-edge technology.



Anyway, since you have so many cores, I would like to see how they affect Civ 6 performance.

I'm starting to think that with my next build I may just go all out, get an 8-core or higher, and keep it for years longer. On the flip side, motherboard-based technology is advancing faster than processor speed, so you may run into situations where you need or want a new connection or faster speed for something, and the processor outlives the motherboard's useful life.
 
Interesting debate going on here. I'm still stuck on a dual-core CPU. I wonder how many of those cores were also busy with other software running in the background?

I note the GTX 1070 has 8GB of RAM - how much of it do you think Civ6 is using? How much does the system have to go back to the DDR2/DDR3/DDR4 RAM on the motherboard for basic game tasks? I note the new Vega chips will have 16GB of graphics memory and use HBM2 - is this something Civ 6 can use? Would any game really use 16GB of graphics memory?

Looking forward to the AMD Zen chips, which now appear to have hyper-threading. So AMD will have an 8-core, 16-thread CPU for gaming customers? Assuming all the hype here is not just that, is this something that would work well with Civ 6? I know many who still rate Broadwell chips highly due to their 6 cores, but here in the UK the cost is prohibitive.

My present 2.4GHz E6600 Core 2 Duo is not really good enough for Civ 6. Graphics too. I'm looking to upgrade in the new year and very interested in thoughts on the new chips and what you think a PC needs to be future-proofed, especially if more games are starting to use all cores. If that's the case, it's quite frustrating that Intel's Skylake is still only 4 cores (8 with HT). When you buy a machine you want it to last at least 5-6 years.

Out of interest, are you running Civ 6 from a RAM disk? I know a few people who install Civ onto a RAM disk instead of an SSD. RAM is so much faster than a hard drive.
 

So far my 1070 uses about 3.5-4GB, and that's with everything maxed out - but I have my FPS limited to 60 and am running at 1080p.

I had an E6550 and upgraded that to a Q9550. That was a big jump, and then I upgraded that to the i5 4670K. I'm still running it at stock, but I've pushed it as far as 4.6GHz just to see if it was stable at 1.3 volts. It wasn't, so my chip isn't that good, but it's decent/average.

I still use my E6550 and Q9550. They are still fast.

I have Civ6 installed on my SSD, but it's going to be moved to my HDD. It doesn't load as much as you'd think when loading a save; the main thing affected would be autosaves, and maybe loading in the character models when a trade deal comes in.

If you wait and get a Kaby Lake (7-series) processor, you're still looking at keeping it for 5 years. I'll have had my processor for 4 years next year, and I'm wondering when I'll get to upgrade it. I'm having a REALLY hard time justifying this GTX 1070 over a GTX 680.
 
Well, I've had this system for nearly 10 years now - albeit maybe I'm not a hardcore gamer. I'm mainly a Civ 4 player, but it would be nice to try Civ 6 and other games in the future.

Kaby Lake is interesting, but it's frustrating that it's still stuck as a quad-core. From what I've read, the main bonus seems to be that it can run faster at the same TDP. Not sure what new instructions Kaby Lake can handle. The new AMD chip is supposedly a new design which can apparently scale well - for Naples they are talking about 32 cores. Guessing some of the Xeon server chips will scale highly too.

I think a lot of these new parts will be out Jan/Feb; not sure about the Vega chips. I may just settle for an RX 480 for graphics and see what the new parts can do. If AMD CPUs are still lagging 10-20%, that will sway me to Kaby Lake. The main bonus of AMD might be price and future upgrade paths.

Certainly the ability to use multiple cores in the future is a key part of any new machine for me.

Do we know how well Civ 4 stacks up core-wise? I assume it was designed too early to use multiple cores. It is nearly 11+ years old now.
 
lol - yeah, Civ 4 may be able to work on a dual-core, but really the 2nd core would just be handling other computer tasks, with the game on the 1st.

[Benchmark chart comparing CPU generations]


Wrap your heads around this... the new processors are actually slower.

I wouldn't worry too much about the number of cores just yet. I'm hoping AMD Zen will put a hurting on Intel and competition will fire up again. Intel has zero competition right now and they are just releasing lackluster processors.

What are you mainly doing with your PC? Game developers have had 10 years to scale their games up to higher core counts and have not done so. Shoot, Paradox games are still using 32-bit engines, and their games are AWESOME in-depth strategy. Game developers strive to have their games work on the most popular hardware, so that means you've got at least another 5 years with quad cores being the norm. The only way I see this changing is if Intel releases a quad-core as the base processor and 12-16 cores at the high end (NOT including hyper-threading cores here). I would LOVE to see AMD release a 12-core or so as the mainstream model. It's not like an octo-core costs much more to make than a quad-core - it's still the same wafer - though it is easier to get yields up on dual cores because each wafer effectively produces more of them.
 
At present, mainly the older games - Civ 4 and Warcraft 3, etc. - while looking towards some of the newer games: Divinity, Heroes of Might and Magic, among others. Mainly into war and RTS games, not so much MMORPGs. The key thing is to future-proof. One thing to note about the older chips: many of the new instructions/features the new chips support won't be available on the older chips. Very much looking for a rig that can last 8-10 years and maybe be upgraded CPU-wise if something better comes along in 4-5 years. AMD with a new platform could do that; with Intel, Coffee Lake/Cannon Lake may need a new motherboard. Kaby Lake is the dead end of the current socket as far as I can see. I do agree about some of the older Intel chips beating the current Skylake chips (depending on OC too). Always take reviews with a pinch of salt, as not all drivers are up to date when the chips/graphics cards are first reviewed.

Although I am also very keen to improve graphics - my current card is not up to the job: an AMD Radeon HD 5670 with at best 1GB of memory, limited by a Dell PSU and a case that won't fit larger graphics cards. This rig has served me well, but I need to embrace the new games coming out.
 
One thing about PCs is that there is NO such thing as future-proofing. A system isn't going to last you 8 years, and it shouldn't. I upgrade whenever I need to - typically once every 5 years or so - and I play everything maxed out and always have. Plus it gives me the parts for my 2nd and 3rd computers if I ever need them. It also costs way less than what everyone thinks.

I don't see AMD's new platform lasting more than a few years until the next processors come out. It doesn't make them money, and that's what they need now more than ever, along with some dang good engineers.

If I were you, I would look at buying used parts. If you like Civ 4, then I think you would love Civ6 - I put it up there together with Civ4.
 
I hope 2017 will be the year when dual-AMD computers (as in both processor and graphics card being AMD) become common again. Gaming rigs are quite overpriced right now because there doesn't seem to be enough competition from AMD. If AMD Zen is as good as advertised, then maybe we can get a very decent machine to play Civ VI at the highest settings at a reasonable price.
 
Ok, I tried benchmarking Civ6 on my 12C/24HT and it's pointless right now. The benchmark is a hybrid, if I'm interpreting the results right: during heavy processing, high core counts dominate performance, with 16-20 logical cores pegged, but quite frequently the CPU utilization drops dramatically and the 2 render threads dominate performance.

So, until they patch a CPU perf improvement for DX11, or add DX12 support, the game will run just as well on 4 logical cores (2C/4HT, or 4C with HT off) as it would on anything else - until you hit "Next Turn", at which point it falls apart. ;-)

I tried the benchmark with 2 cores and it had max frame times of almost 1 second. It kind of throws off the results when the average FPS is much better without HT but the spikes are much worse. :P
 