Interesting article about DX11 and Civ 5 at Anandtech

Moving from an NVIDIA 8800 GT to a GTX 480, I can say the game feels like "it's finally breathing properly". For the first time in many, many years I feel like I just moved from an S3 Trio to a 3dfx Voodoo 1. It's that much of a difference.

Obviously people were complaining back then as well: "D'oh, I just bought my new S3 GFX card"... lol.
 
I have a top-end system and it looks and runs great on it.

Anyway, I think the impression that dev time spent on a graphics engine like this = dev time not spent on gameplay is probably mistaken. Graphics engineering time and gameplay/AI engineering time are not fungible - you can only have so many people productively working on gameplay or AI engineering at a time, so a partnership with NVIDIA or whoever probably means the engineering support to build out a new graphics engine without really affecting the rest of development (a snag here and there, sure, but it sounds like they had a solid process for dealing with it). What they needed was more months of dev time, probably NOT more developers over the same amount of time. And I doubt it got pushed out the door because it ran out of money - I find it more likely that 2K just needed some extra revenue and Civ V was the sure hit that was closest to ready to go.
 
I have a fairly high-end system: i7 860 2.8GHz, NVIDIA GTX 580 with 3GB of RAM, plus an Asus 3D monitor, & I must say Civ V is one of the most beautiful games I've ever seen. It's always smooth, never crashes. & honestly Civ in 3D is so much better than I expected it to be.

Nice work, Firaxis & nvidia!!
 
10%? yeah right. I would guess that not even 1% of civ V buyers have the necessary video card and system specs to take advantage of this technology. What on earth were they thinking? For a small fraction of the budget, they could have made a lower-tech game which runs properly on all systems. My computer is less than a year old and has a fairly good video card, but it still lags at least 10 seconds on each turn.

I have an ATI 4870x2 card, and except for tessellation (which is used only for leaderheads) and other DX11 features (which seem to affect performance more than graphical appearance), I can have all details set to maximum.
Considering that my GPU is 3 years old, the engine is proving to be really scalable.
I also have friends who run the game on a single-core CPU.

Taking into account that this game could have a potential lifespan of 5 years, I'm quite sure that this was a really wise decision.

Where this game partially fails is in its gameplay, certainly not in its technical proficiency.
 
ATI will hopefully catch up with the 11.4 driver update in this month... :)

Something tells me Lenovo is going to wait for win 7 sp2 to incorporate updated drivers into the switchable graphics software. They don't seem to care to do more than the bare minimum for the w500 these days.
 
I, too, have a computer that can run optimized settings. Never have had a crash and the visuals really are stunning (except for rivers, what's up with that?).
 
Radeon 5870 and I crash regularly. (If you care about other relevant components: Phenom X4 955 3.2GHz, 4GB Mushkin O/C'd to 7/7/7/20, ASUS M4A79 XTD. And no, other games don't crash with the O/C. Well, except Fallout: New Vegas, but you expect that.)

Here's hoping that ATI catches back up to NVIDIA in the driver race. They're the clear leader in the price/performance silicon war at the moment, but NVIDIA has often had the better drivers.
 
Maybe this post from February is now more relevant
I think it points to how much the marketing team (whether 2K or Firaxis, I don't know) are trying to emphasise the graphical appeal of the game:
I have to say, I can't help but laugh at this...

steamciv5ad.png


Checking this quote, one finds: (link)

Graphics may be superficial in a game like this, but it has to be said that Civ V is indisputably the best looking turn-based strategy game ever made.

For the record, I use an ATI HD4850 (which to be fair is a bit outdated nowadays) and I find the graphical performance of the game, mainly when scrolling around the map, unacceptably sluggish.
 
I have an ATI 4870x2 card, and except for tessellation (which is used only for leaderheads) and other DX11 features (which seem to affect performance more than graphical appearance), I can have all details set to maximum.
Considering that my GPU is 3 years old, the engine is proving to be really scalable.
I also have friends who run the game on a single-core CPU.

Dual GPU does seem to work better for this game if you go ATI. I built a rig for my in-laws in March with an X6 and a dual 5770 setup (not that they use Crossfire; they want to run a half-dozen monitors for trading purposes), and things ran a lot smoother under Crossfire than they do with my single GPU solution.
 
I wonder if some of the stability problems could be due to reasons other than the GPU technology installed. I run an ATI DX11 card (Radeon 5670) and have never had problems with crashes. And this on a circa-2007 Core2 duo.

If your cooling is inadequate, you don't update drivers regularly, or you are running substandard memory then I imagine you might get crashes using any software that puts a strain on your hardware. It'd be interesting to see if crashes correlated to component temperatures in the player's box (as if Steam didn't already collect too much info).
 
They made the engine future-proof so they can continue building on it, and most systems will benefit from it in the long run. That is a good thing; consider how many years Civ IV ran on the same engine and then evaluate their choice of focusing on DX11.

So they divert resources necessary to make the game good to cosmetics, in order to make the game look good on some Nvidia DX11 chipsets. I get that.

But where you lose me is where you announce that this is a good thing. Making a worse game that looks good sometimes is better for the game company, how? Please help me out here, I cannot see the logic in your position.

Edit: @The Quasar, you fail basic statistics. Look up confidence and sample size before you start quoting percentages again.

But tl;dr: saying that 35% of people get the game working perfectly graphics-wise, based on a sample size of 10, is about as useful as asking a billionaire for advice on how to stop people from evading taxes - totally useless.
 
Radeon 5870 and I crash regularly. (If you care about other relevant components: Phenom X4 955 3.2GHz, 4GB Mushkin O/C'd to 7/7/7/20, ASUS M4A79 XTD. And no, other games don't crash with the O/C. Well, except Fallout: New Vegas, but you expect that.)

Here's hoping that ATI catches back up to NVIDIA in the driver race. They're the clear leader in the price/performance silicon war at the moment, but NVIDIA has often had the better drivers.

While this won't help with your crashing issue, I have my Phenom II 955 C3 undervolted to 1.2V, and it reduced my CPU temp by 8-10 degrees C at full load on the stock heatsink. Just FYI.

I just upgraded to an HD 5770 from a 9500GT and I too really hope they catch up with the drivers.
 
I have a Radeon 5770 and have had the scroll crash happen to me rarely, and only when I play huge maps.

If I play large maps and below, the game runs flawlessly and is beautiful. It's unfortunate it took them 6 more months to get the game right gameplay-wise, but the other peeps are right: this engine will serve them well over the next 5-6 years.
 
So they divert resources necessary to make the game good to cosmetics, in order to make the game look good on some Nvidia DX11 chipsets. I get that.

That's just not how game development studios work. There are guys at Firaxis who work on the engine, there are people who work on art, and there are people who work on gameplay. Just look at the game credits: the engine programmers and gameplay programmers are different people, because those are two very different skills.

Since they put so much effort into the engine (first multi-threaded DX11 graphics, etc.), it's pretty clear they did so because they plan to make a few more games with it. They hired extra folks because they expect to keep using the engine. You can't hire extra gameplay folks in the same way; gameplay code isn't a shared resource.
 
So they divert resources necessary to make the game good to cosmetics, in order to make the game look good on some Nvidia DX11 chipsets. I get that.

But where you lose me is where you announce that this is a good thing. Making a worse game that looks good sometimes is better for the game company, how? Please help me out here, I cannot see the logic in your position.

Edit: @The Quasar, you fail basic statistics. Look up confidence and sample size before you start quoting percentages again.

But tl;dr: saying that 35% of people get the game working perfectly graphics-wise, based on a sample size of 10, is about as useful as asking a billionaire for advice on how to stop people from evading taxes - totally useless.

Come on, this really comes across as sour grapes. One big angry rant against good graphics; it's just hard to take you seriously. If your system is inadequate, then please understand that PC gaming has moved into DX11 territory, and that's a good thing.

The problem with the AI is a problem with the AI. It's not some sort of dirty tradeoff made by CEOs, now is it?
 
That's just not how game development studios work. There are guys at Firaxis who work on the engine, there are people who work on art, and there are people who work on gameplay. Just look at the game credits: the engine programmers and gameplay programmers are different people, because those are two very different skills.

Since they put so much effort into the engine (first multi-threaded DX11 graphics, etc.), it's pretty clear they did so because they plan to make a few more games with it. They hired extra folks because they expect to keep using the engine. You can't hire extra gameplay folks in the same way; gameplay code isn't a shared resource.

Come on, this really comes across as sour grapes. One big angry rant against good graphics; it's just hard to take you seriously. If your system is inadequate, then please understand that PC gaming has moved into DX11 territory, and that's a good thing.

The problem with the AI is a problem with the AI. It's not some sort of dirty tradeoff made by CEOs, now is it?

Well, based on the product we were given (a broken game filled with bad mechanics) and the shenanigans going on behind the scenes, what I wrote is the only possible logical conclusion.
 
Well, based on the product we were given (a broken game filled with bad mechanics) and the shenanigans going on behind the scenes, what I wrote is the only possible logical conclusion.

What shenanigans, specifically?
 
That's just not how game development studios work. There are guys at Firaxis who work on the engine, there are people who work on art, and there are people who work on gameplay. Just look at the game credits: the engine programmers and gameplay programmers are different people, because those are two very different skills.

Since they put so much effort into the engine (first multi-threaded DX11 graphics, etc.), it's pretty clear they did so because they plan to make a few more games with it. They hired extra folks because they expect to keep using the engine. You can't hire extra gameplay folks in the same way; gameplay code isn't a shared resource.

Everything is a budget. You budget a certain amount of money for engine programmers, and you hire that many engine programmers. You budget a certain amount for art, and you hire that many artists. And so on.

So, yes, in a sense, the game studio can determine how many engine people and how many gameplay people they want to hire for the game. And this does affect the quality of the game in different ways.
 