the graphics are too good

SO You understand now?

Look, even if it takes me just 2 minutes, I'm not gonna waste 2 minutes of my life reading your rant about cedars or whatever. If you want to get your point across, then you need to do 3 things:
1. put the main ideas of your post into bullet points
2. improve your sentence structure
3. get rid of the double capitals

If your post has anything to do with turn time, then apologize for wasting so much space, because this thread has nothing to do with turn time.
 
I don't know anything about the Cedar thing. But it doesn't surprise me that on certain games dual or quad core systems would not be any faster than old single cores.

The argument, though, was that they shouldn't make the game so graphically intensive. It's not the sort of game that NEEDS eye candy.

I have no problem reading TA Jones' messages, nor do I have a problem comprehending the issue at hand.

I run Civ IV on a single core P4 at 3.46 GHz or whatever. The graphics card is totally weak compared to the rest of the equipment. It's a laptop as well. They skimped on RAM and the video card. Why must manufacturers do this? I see it ALL the time.

I had major lag running Civ IV at first, so I went out and bought 1 gig of RAM to give me a grand total of 1.5. Problem solved. Now it only gets slow in the endgame on large and huge maps. And I have found that if I save and reboot, I regain some speed for some reason.

However, this is all just ramblin'. I agree with the initial poster. Some games just don't need all the eye candy. This is one of them.

Still, I have no complaints. My laptop is now 4 years old and running strong. The only other thing I've had to do is replace the thermal grease on top of the CPU. Apparently the original stuff was cheap and it had hardened and cracked, causing overheating issues. So I bought some of that pure silver stuff and slapped it on. Problem solved.

Umm...I ramble again. :crazyeye:

Ghostwind
 
There are two solutions to this problem: Firaxis could never bother upgrading civ's graphics and end up going out of business, or the people with prehistoric computers could suck it up and shell out a few bucks for a decent graphics card.
 

Far too many people think of computers just like they do their TVs. You buy one and it will be good enough until it stops working several years later. But that's not how the technology works. Moore's Law is still in effect and any computer technology essentially becomes obsolete every two years. If someone is using a computer that's 4 years old or more, they're basically working with a dinosaur.
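Just to put rough numbers on the "obsolete every two years" idea, here's a minimal back-of-the-envelope sketch. It assumes raw capability doubles every two years (a Moore's Law-style assumption of mine that tracks transistor counts, not anyone's actual game performance):

# Back-of-the-envelope only: assumes capability doubles every 2 years.
def relative_capability(age_years: float, doubling_period: float = 2.0) -> float:
    """Fraction of a brand-new machine's capability an older machine has."""
    return 0.5 ** (age_years / doubling_period)

for age in (2, 4, 6, 10):
    print(f"{age}-year-old machine: ~{relative_capability(age):.0%} of a new one")
# 2 -> ~50%, 4 -> ~25%, 6 -> ~12%, 10 -> ~3%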
 

I really disagree. If a computer did the job you wanted 4 years ago, then there is no reason it will not do it now, except that "lazy" programmers assume everyone upgrades their computers often.

Of course, as far as Civ is concerned, it does not worry me much. This attitude is much more damaging in OS development, where it is taken to much greater lengths, and the lack of support for older systems can force you to upgrade your hardware.
 
...except that "lazy" programmers assume everyone upgrades their computers often.

That's because they do. Even if an individual may not upgrade every two years, new systems are constantly being purchased and the power of the average system gets higher every year. Developers can't afford to ignore this and keep targeting products at 5-10 year old systems, especially in the game industry with the competition that consoles add. If they don't stay relevant they will soon be out of business.

And even if people don't upgrade their entire systems, they will most likely buy new components for them, like new video cards. There was a while when I was buying a new video card every year, until I finally found one powerful enough to handle the games I like to play. And even though the one I have now is a little over a year old, I would like to upgrade it again once I can afford to.
 

A computer that is 5-10 years old is obsolete. Other consumer products are mature. Computers are not.

I believe you are both talking about gaming computers.

I would agree that game developers will tend to leave anyone using a computer 4 years old or so behind as they can safely assume people who want their games are going to upgrade their machines every now and then.

But I think you are exaggerating if you believe any computer is obsolete after 4 years. I have seen computers at least 10 years old that are perfectly adequate for the functions they perform. You don't need a $2000 computer with the latest CPU and graphics card to run a few spreadsheets and user programs in a science lab. Many small businesses would not upgrade their computers every 4 years unless they had to. I would suspect they upgrade their computers only when the damn things break, or when they actually need to overhaul the information system they are using (very expensive).

My point is, when you are talking applications that don't need a lot of processing power, there is little point in restricting their use to the most powerful systems. When you are talking about video editing, code compiling, gaming, etc. - things where technology is more important - then it's more sensible to rely on the improving average of the power of systems you described.
 
Well, if all someone does is work with office documents, surf the web, and get email, then they don't really care about Civ's graphics, do they?

Civ wasn't even that demanding when it came out. A system would have to have really low specs or the user would have to not know how to lower the graphics settings to have a problem at anything other than huge maps and/or lots of civs.
 
I think it's mainly people who don't have a graphics card or more than 512MB who will have issues running Civ 4. I agree Civ4 is not all that demanding but it at least takes a computer which is capable of gaming. The older civ games could be run on any machine really with the necessary OS. Civ3 and earlier I played on the office computer.
 
My comp's relatively old (the motherboard dates back to '03, but it was a good one then, and I have a not-bad graphics card and plenty of memory), but it works fine with Civ. Since it can play Medieval 2: Total War well (really graphics intensive), it can't be that bad. If you are playing on a dinosaur computer (like the ancient Commodores they use at my school for doorstops), it is upgrade time. If you are playing on an old comp, dish out a few bucks for memory, update your video drivers, make sure your graphics card is at least minimum quality, and play on Duel maps the rest of your life.
 
Well, the first time I got Civ4, I thought it sucked (*hides from angry CivFanatics:D*) because my PC got very laggy. I had recently been on Civ3, and... OMG, so much lag:mad:...

However, once I could get past the lag, I realized it was cool...

My point is, for a game like Civ, you don't need such fancy graphics. The cool thing about Civilization is its unparalleled gameplay. Graphics are important, but not as much as gameplay is. Some game developers just don't understand that.

I'm not referring to Firaxis here. Civ4 has excellent gameplay. I just think that Civ3 graphics were fine, and there was no need to jump to Civ4 graphics...
 

I can't believe it took 20+ posts for someone to agree with me.

I would not consider any of the Civ games really hardcore; they are not the kind of games you should have to buy expensive computers/video cards/crap for. Obviously, if you are running Windows 95 or something, you can't expect Civ 4 to work very well. But they should at least make the game so the average, non-hardcore gamer can play without having to upgrade.

I mean, come on, you've got to display tiles, units, cities, and some interface. Civ is a strategy game, not a first-person shooter or an RPG where the gameplay is so bland that you need graphics to make up for it (shooting people aimlessly gets boring after a while if there are not pretty graphics to look at). Strategy games, especially turn-based ones, are fun because of the strategy, not the graphics. The people who made Civ can't never upgrade the graphics (please excuse the double negative); they will go out of business. But at least they could keep the will of the average player more in mind.
 
lordmacroer, you just said it... "at least they could keep the will of the average player more in mind."

What you really want is to keep the will of the not-so-average player in mind.

The computer I bought in 2002 ran Civ4 very well. This was not a gaming computer at all. It did have a graphics card but it was cheap and nasty.

The computer I use now also has a very modest graphics card and while it doesn't work perfectly for some games, Civ4 is somewhere near the bottom in terms of graphics card load. Turn times are more dependent on the CPU and this is something that cannot be avoided unless you want to sacrifice AI.
 
My comp is far from a 'gamer computer'. When I tried to run BioShock and Fallout 3, my computer started spitting silicon at me. The Civ graphics are nice. Crysis, this game is not. (My friend who had an awesome gaming comp had to get a new one to fully handle Crysis.) Before I got more memory and a better graphics card (the package was around $60, and that was in '07; if you shop around on Woot and sites like that, you should be able to find the parts you need relatively cheaply), the game struggled on my machine too. As opposed to complaining, tell us your comp specs, and then the tech people here who know what they are talking about can help you.
 
Yeah, Civ4 is playable with a reasonable PC.

The problem is, though, when I got the game, I hadn't got a reasonable PC:D

BTW, I'm alive:D (even having said I used to think that Civ4 sucked:D)

@lordmacroer

What are your specs? When I used to think Civ4 sucked (I'm testing your patience, guys:D), I had the following specs:

2.66 GHz processor (plenty of processor, come to think of it:crazyeye:)
512 MB RAM
GeForce 5500 GPU (64 MB, if I recall correctly?)
Windows XP SP3:king:

I also have to mention that I hadn't actually patched the game, and I'm aware that there was a memory usage error in version 1.0.

If you haven't patched the game, patch it to the latest version: 1.74.
If you have patched it, but it's still laggy... perhaps your specs aren't good enough.
 
I just think that Civ3 graphics were fine, and there was no need to jump to Civ4 graphics...

Civ 3 graphics were 2D. That is now an obsolete format. Not only does it cost more to produce a game that way, but the general public expects new games to be produced in 3D these days. For one thing, it's the only way to handle the plethora of different monitor resolutions out there, as 2D graphics don't scale but 3D does. So yes, there was a need to jump to Civ 4 graphics. As far as 3D standards go, Civ 4's graphics are pretty basic. If your computer can't run them, then it's time for a new system, or at least a new video card. You people who keep moaning about Civ 4's graphics are starting to sound like a bunch of old fogeys: "Well, back in the good old days..." It's time to get in touch with the modern world.
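To make the scaling point concrete with made-up numbers (not Civ's actual tile sizes): with fixed-size 2D art, how much of the map fits on screen changes with the resolution, while a 3D camera just re-renders the same view at whatever resolution you pick. A rough sketch:

# Hypothetical numbers, only to illustrate why fixed-size 2D art doesn't scale.
TILE_PX = 128  # an imaginary fixed-size 2D tile, in pixels

for width, height in [(1024, 768), (1600, 1200), (1920, 1200)]:
    print(f"{width}x{height}: {width / TILE_PX:.1f} fixed tiles fit across the screen")
# 1024 -> 8.0 tiles, 1600 -> 12.5 tiles, 1920 -> 15.0 tiles: the visible area
# changes with resolution unless you redraw the art. A 3D renderer keeps the
# camera's field of view fixed and simply rasterizes to whatever viewport size
# you have, so only the pixel density changes.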
 
I can't believe it took 20+ posts for someone to agree with me.

So rather than acknowledging that the other 20+ posts might have a valid point, you jump on the only one that supports your view. :rolleyes:

But they should at least make the game so the average, non-hardcore gamer can play without having to upgrade.

Why should they try and appeal to the lowest common denominator? That's just a recipe for financial disaster. The average "gamer" does have a rig that can easily handle the game, as they are also playing games like Oblivion and Neverwinter Nights 2, or even Crysis. Only the fringe player is too paranoid to upgrade their system, and they're not the ones spending the most money on games. If the consumer who is buying 8 or more games per year thinks a particular game looks like crap compared to the others he/she is also playing, it's not going to sell. Period. People who only go out and buy a new game every 5 years aren't the ones keeping the video game industry alive. If a developer were to bend over backwards to cater to them, they're not going to stay in business, pure and simple.
 
Hi, I'm Camikaze, and I used to use a pretty bad computer. I always got that 'your machine is below minimum requirements' message every time I opened Civ. But since then, I've turned my life around with an all new and improved new computer plan. And you at home can too. This gives you the highest resolution and levels of graphics in everything, greatly enhancing your Civ experience, making it more aesthetically pleasing for your own personal enjoyment. I've done it and so can you. Join the let's get a new computer revolution today.
End of horrible attempt at parody.
You get the idea.
 
I think it's mainly people who don't have a graphics card or more than 512MB who will have issues running Civ 4. I agree Civ4 is not all that demanding but it at least takes a computer which is capable of gaming. The older civ games could be run on any machine really with the necessary OS. Civ3 and earlier I played on the office computer.

My main issue is that the "recommended" specs on the box are deficient.

My computer at home (not the laptop, the desktop) has the recommended RAM, a slightly better video card, and a better processor. This computer moves units painfully slowly on the lowest graphics settings possible. It's bad enough that it isn't tolerable. There shouldn't be movement lag and long waits between turns when using the RECOMMENDED, NOT MINIMAL specs on the lowest settings! That's sour.
 