Multi-core/64-bit support

Do you not have enough RAM or a decent GPU? My Win 7 machine (8 GB RAM, Intel i7 920, Radeon 5800) runs Civ 4 great.

50 civ Rise of mankind :drool:
:drool: I want your computer
There's plenty of other reasons you should get Windows 7.

Your Macs are all 64-bit and multicore because, until fairly recently, Apple had an absolute grip on the hardware used. Therefore, it could very easily dictate that all Macs be 64-bit and multicore, and voila, they were. PCs, on the other hand, are enormously varied. You have 3 CPU manufacturers, 3 primary graphics card manufacturers, a dozen-plus motherboard manufacturers, etc. There are magnitudes more combinations. Microsoft probably did not want to alienate a large part of its market when it released Vista and Win 7.

But yeah, when XP was released, I believe OS X didn't even exist. And even after OS X was released, it supported both 32- and 64-bit for quite a while.



Your machine is not indicative of the average computer used for Civ 4. But even then, your point about not having enough RAM stands. If you are trying to play huge maps with 2GB of RAM, then you yourself are in the wrong. Get at least 3 and give the OS some breathing room if you want to do that.
"OSX" was first released in 2001, ah, Cheetah.
Apple ruled with an iron fist, which it has now SLIGHTLY loosened; still, it allows vigorous quality control.
OSX had 32- and 64-bit in 2005 with Tiger.
 
That's something I don't get.

Win7 x64 would have been advantageous to you if you had more than ~3.25 GB (maybe 3.5, depending on your system). So going back to XP x86 literally means throwing the additional RAM into the garbage bin. And vice versa.
So, what did you really do?


You may stay with small maps and XP, where's the problem? :)

I only have 2 GB of RAM, so I didn't put anything in the garbage bin. :P
 
To address some misconceptions:

Multicore support: you pretty much have to have this built into the game engine from the ground up. There's always something to parallelize, especially in a simulation style game like Civ.

32-bit vs. 64-bit: the difference between 32-bit and 64-bit is the size of the addresses. Every byte of memory has its own address. With 32 bits, you can specify 4GB of addresses. Problem is, hardware components that the computer interacts with also get addresses. Keyboard and mouse? Addresses. Graphics memory? Addresses. Ethernet? Addresses. The OS needs to reserve some addresses too, so that's why there's a 2-3GB limit.

The downside of 64-bit is that the addresses are twice as long as in 32-bit, so if you want to load a list of addresses (say, all your catapults in your SoD) you need twice the memory to store it and twice the bandwidth to access it.

The upside, though, greatly outweighs this. The most obvious benefit is more RAM, but another is that everything gets calculated in 64 bits, so you don't need to spend extra time working with very big or small numbers, and there are more registers available (the CPU's internal uber-fast memory that all the processing acts upon). You can't say that a 64-bit application will always outperform its 32-bit equivalent, but you can say it generally will, even with the exact same amount of memory.

64-bit is pretty trivial to build for if you're using modern libraries and good programming techniques. Architecture-specific hacks are pretty common in older engines for sure, because they made the games run faster but the code less portable. If Firaxis has good coders, then 64-bit is just a button away.

Oh, and 32-bit will not be obsolete for a while. There are a lot of old programs where we generally only have access to the binary and not the source code (so you can't just rebuild them for 64-bit). Until these programs are replaced, 32-bit on the desktop will be here with us. But no worries! Modern 64-bit processors can easily run 32-bit and 64-bit programs, even at the same time. :)
 
Good post DPyro.

That's exactly what I mean: if they do have 64-bit, it will be a simple matter to support both 32- and 64-bit. Therefore, I don't see what the issue is with having the 64-bit support people are clamoring for.
 
I've had a dual-core/64-bit CPU for 5 years now... but I'm still running WinXP/32... why? Because I tried out Win7/64 last October for about a month and absolutely HATED it...

I don't think Firaxis should support XP at all; it's an ancient OS from 2001, and supporting it would limit their possibilities.
If you want to stick with an ancient OS like that, you should stick to ancient games as well.
 
So it has to be a dilemma? If I want to play classics such as the Thief or Baldur's Gate series, let alone the early Elder Scrolls series, I'm not allowed to play newer games? That's barmy.
 
So it has to be a dilemma? If I want to play classics such as the Thief or Baldur's Gate series, let alone the early Elder Scrolls series, I'm not allowed to play newer games? That's barmy.
It's called OSX, dude. Besides, on Windows 7 you can have a second hard drive with XP. I fail to see the problem.
 
So it has to be a dilemma? If I want to play classics such as the Thief or Baldur's Gate series, let alone the early Elder Scrolls series, I'm not allowed to play newer games? That's barmy.

It's not "barmy"; new systems gradually become incompatible with older software. You can still probably get it to work via patches and/or emulation, but it seems far more "barmy" to assume that brand-new games are going to work on ancient systems.

We're talking about an OS that's almost a decade old here. It's like using Windows 3.1 when XP first came out.

It's also not like your current PC is going to evaporate if you buy a new one, so you'll still be able to play your old games on there even if you can't get it to run via emulation on a modern system.
 
So it has to be a dilemma? If I want to play classics such as the Thief or Baldur's Gate series, let alone the early Elder Scrolls series, I'm not allowed to play newer games? That's barmy.

ROFLOL! I have TES I, II, III, and IV on my Mac, in addition to Baldur's Gate I & II.
They work fine on Snow Leopard.
 
I don't think Firaxis should support XP at all; it's an ancient OS from 2001, and supporting it would limit their possibilities.
If you want to stick with an ancient OS like that, you should stick to ancient games as well.

Well, there's a much simpler way to do that, and that is to make the game engine DX10-only. But then you have to look at how much of the market XP currently has, which Firaxis will most certainly do. Then they will ask themselves, "do we want to cut off so many people for the sake of progress?" I don't think they'll be doing that. According to wiki, XP still has at least a 55% market share, which is a huge number.

Also consider that Civ 5 won't be released only in the US. XP still has a rather large market share in other countries. I know it's still fairly large in Russia, because upgrading to something else is an unneeded expense.
 
So it has to be a dilemma? If I want to play classics such as the Thief or Baldur's Gate series, let alone the early Elder Scrolls series, I'm not allowed to play newer games? That's barmy.

Dosbox & or VirtualPC would cover you, or do what I do...maintain an older machine running in isolation ( the web is definitely nogo on it) using Win98 SE.
 
So really, isn't 7 years long enough to wait?

Well, this is where the Mac users cry out in pain and frustration. All current machines are 64 bit, all current machines are multicore, and Snow Leopard goes out of its way (so I am told by people who can code "Hello, world" without a handbook) to make parallel programming easy. But Firaxis continues to pretend that the world is made of Windows users only ...

Oh well. StarCraft 2 will probably be out first, in both Mac and PC versions.
 
The CPU has very little performance impact on games nowadays. The real bottleneck is (and has been for years) the video card.
 
64-bit support? You can count on it.
Multi-core support? Depends on what is meant. Will it run on a multi-core processor? Sure. Will it run on a multi-processor system? Of course. Will it exploit the multiple cores and/or processors in a sophisticated fashion? Aye, there's the rub. Namely, the extra time and expense of development to write code explicitly for multiple cores/processors.
 
Well, this is where the Mac users cry out in pain and frustration. All current machines are 64 bit, all current machines are multicore, and Snow Leopard goes out of its way (so I am told by people who can code "Hello, world" without a handbook) to make parallel programming easy. But Firaxis continues to pretend that the world is made of Windows users only ...

Oh well. StarCraft 2 will probably be out first, in both Mac and PC versions.

Macs capable of gaming have a very small market share. It makes little economic sense to devote resources to a version that will in all likelihood not sell nearly the same numbers as a Windows one. Blizzard is not a good example. They have money coming out the wazoo thanks to WoW. Additionally, it makes sense for them to release the sequel to the best-selling game ever on as many platforms as possible. They have one of the few titles that have a market anywhere.

The CPU has very little performance impact on games nowadays. The real bottleneck is (and has been for years) the video card.

That's still dependent on the game. Your statement is pretty much a regurgitation of what all the benchmark sites say, but if you notice, they mainly test with real-time games where a lot of graphics have to be recalculated on the go, and there are a lot more effects on the screen. In a TBS such as Civ, the CPU and RAM capacity are still quite important.
 
Macs capable of gaming have a very small market share. It makes little economic sense to devote resources to a version that will in all likelihood not sell nearly the same numbers as a Windows one. Blizzard is not a good example. They have money coming out the wazoo thanks to WoW. Additionally, it makes sense for them to release the sequel to the best-selling game ever on as many platforms as possible. They have one of the few titles that have a market anywhere.

If you stay off the Microsoft bandwagon and use crossplatform libraries such as SDL and OpenGL, you're already most of the way to being able to port to Mac OSX, Linux, etc...

Epic Games has released versions of Unreal Tournament on both Mac OSX and Linux, and there's no technical reason why it can't do the same for the UT3 engine.

That's still dependent on the game. Your statement is pretty much a regurgitation of what all the benchmark sites say, but if you notice, they mainly test with real-time games where a lot of graphics have to be recalculated on the go, and there are a lot more effects on the screen. In a TBS such as Civ, the CPU and RAM capacity are still quite important.

I play way too much Team Fortress 2. It's a fast-paced multiplayer shooter, and it is almost entirely CPU-dependent once you get a dedicated graphics card.
 
64-bit won't do much, but good threading is definitely needed.

The CPU has very little performance impact on games nowadays. The real bottleneck is (and has been for years) the video card.
If I recall, the #1 bottleneck for Civ 4 is the L2 cache, which is part of the CPU. The screenshots do not look like such a major leap forward that this should stop being the case.

Also, from observation Civ 4's greatest problem was long AI turns in the late game. That's certainly a CPU issue, not graphics.
 
If you stay off the Microsoft bandwagon and use crossplatform libraries such as SDL and OpenGL, you're already most of the way to being able to port to Mac OSX, Linux, etc...

Epic Games has released versions of Unreal Tournament on both Mac OSX and Linux, and there's no technical reason why it can't do the same for the UT3 engine.

I play way too much Team Fortress 2. It's a fast-paced multiplayer shooter, and it is almost entirely CPU-dependent once you get a dedicated graphics card.

Ah, with your username I figured you played TF2. Starting up your own server is very CPU-intensive, but when I play regular MP I get maybe 10% utilization of my CPU. Granted, it's an Intel Q9550 overclocked to 3.4 (or 3.6, can't quite remember), but even then, it's far from CPU-limited. Even my old P4 can play it when paired with a decent enough graphics card.

As for being cross-platform: yes, I hope to the gods that they make it possible to easily port to Linux and Mac; I would be much happier finally getting some gaming use out of Ubuntu. At the same time, though, it looks like the engine is similar to Civ 4's, which probably means DirectX. We can only hope.
 
Ah, with your username I figured you played TF2. Starting up your own server is very CPU-intensive, but when I play regular MP I get maybe 10% utilization of my CPU. Granted, it's an Intel Q9550 overclocked to 3.4 (or 3.6, can't quite remember), but even then, it's far from CPU-limited. Even my old P4 can play it when paired with a decent enough graphics card.

As for being cross-platform: yes, I hope to the gods that they make it possible to easily port to Linux and Mac; I would be much happier finally getting some gaming use out of Ubuntu. At the same time, though, it looks like the engine is similar to Civ 4's, which probably means DirectX. We can only hope.
They already said it's gonna run with DX11 support, so you are probably stuck with Windows.
 
That's still dependent on the game. Your statement is pretty much a regurgitation of what all the benchmark sites say, but if you notice, they mainly test with real-time games where a lot of graphics have to be recalculated on the go, and there are a lot more effects on the screen. In a TBS such as Civ, the CPU and RAM capacity are still quite important.

I would often play Civ4 on my laptop, which has switchable graphics, and when on battery I would experiment with different configurations. While lowering the CPU clock affected performance a little, switching to the slower video card (with full CPU clock) had a huge effect. It's anecdotal, but it mirrors my experience upgrading video cards and CPUs in my desktop.

That could all change if developers start making games utilize the CPU and extra cores more, of course. I don't think they do right now though.
 