graphics card

Civfan333

What's a good graphics card that's below $100? I currently have an NVIDIA GeForce 6150 LE and think I might want to upgrade. What's a good one? :confused:
 

I don't think you can get a card better than your current one for $100, but if you were looking in the $100-150 price range you could get a 7*** for a decent amount.
 
The X1650 Pro will definitely do you fine if you have AGP. I have the card, and while it can't run Oblivion on max, I managed to run World in Conflict on low settings at a decent speed (it only lags when I drop nukes or carpet bomb).
 
Thanks for the replies!!

OK, new question.

Is the Nvidia GeForce 8500 GT a good graphics card?
 
Any nVidia card below x6xx is crappy and meant for low-end to low-end mainstream PCs.
 
But is it better than an Nvidia 6150 LE?
 
Can it run Medieval II: Total War well?

Or Rome: Total War?

Run well, I mean.
 
Most likely not very well at all. As I said, you want x6xx or higher. NEVER, EVER, buy anything lower than that as it is a complete waste of money.
 
I have a 6600GT that handles Rome: Total War pretty well, but it choked on the demo of M2:TW. 7600s and 7900s should be cheap by now; the 8600 is pretty good too.
 
7900s, surprisingly, aren't too cheap. Much better to go with an 8600, or even an 8800 GT if you have $250.
 
How about getting an 8600 GT instead of an 8500? Would that play games better than an 8500? What's the difference anyway? :confused:
 
Any nVidia card below x6xx is crappy and meant for low-end to low-end mainstream PCs.

My 5200 is still going well. It can run Supreme Commander if you turn off the specs check.

7900s, surprisingly, aren't too cheap. Much better to go with an 8600, or even an 8800 GT if you have $250.

They are €140 here.

EDIT: Just noticed it's in $, which makes it about $210.


I have a 6600GT that handles Rome: Total War pretty well, but it choked on the demo of M2:TW. 7600s and 7900s should be cheap by now; the 8600 is pretty good too.

I'd say that's another part of your system (probably your RAM) causing the problems, as my 5200 can run M2TW on the second-highest setting.
 
I dunno, the numbers only really apply to the 6 series and above, as before that they had weird naming conventions (FX 5200, 5500, 5700, 5900, but nowhere did you see a 5800). Either way, buying an FX today would be a bit counterintuitive.
 
I dunno, the numbers only really apply to the 6 series and above, as before that they had weird naming conventions (FX 5200, 5500, 5700, 5900, but nowhere did you see a 5800). Either way, buying an FX today would be a bit counterintuitive.

An FX 5200 isn't massively different from an 8800 GT today. No huge gains in speed. The difference is that the new cards support better pixel shading and other modern graphics features. Speed/memory is not much of an improvement.
 
I dunno, the numbers only really apply to the 6 series and above, as before that they had weird naming conventions (FX 5200, 5500, 5700, 5900, but nowhere did you see a 5800). Either way, buying an FX today would be a bit counterintuitive.
http://www.nvidia.com/page/fx_5800.html

An FX 5200 isn't massively different from an 8800 GT today. No huge gains in speed. The difference is that the new cards support better pixel shading and other modern graphics features. Speed/memory is not much of an improvement.
That's not true at all - graphics cards are continually getting faster, as well as gaining more features. A quick glance at benchmarks (http://www.tomshardware.com/2003/12/29/vga_charts_iii/index.html) shows the FX 5200 to be at the slow end even compared to other old cards.
 
The speed gains are major when comparing those two. For instance, an FX 5200 has a 250 MHz core clock, while the 8800 GT has a 600 MHz core clock. The FX has no separate shader clock listed, while the 8800 GT has a 1.6 GHz one. The FX 5200 cannot support more than 256 MB of memory, which is only DDR1 and runs at a max of 400 MHz. The 8800 GT, on the other hand, uses GDDR3 running at an effective 1800 MHz.

The two cards also have severely different buses: one has AGP/PCI support, while the other is PCIe 2.0. Even the memory bus width is double on the 8800 GT. The overall fillrate is also much, much higher on the 8800 GT.

In other words, there is a major difference. What you're saying is like comparing a 3.0 GHz P4 with a 3.0 GHz C2D/C2Q: sure, the numbers are the same, or similar, but the speed difference is huge.
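
To put rough numbers on it, here is a back-of-the-envelope sketch in Python. The clocks are the ones quoted above; the 128-bit bus / 4-pipeline figures for the FX 5200 and the 256-bit bus / 16-ROP figures for the 8800 GT are commonly cited values I'm assuming here, and exact boards can differ:

# Rough peak-spec comparison, FX 5200 vs 8800 GT.
# Clocks are from this post; bus widths and pipeline/ROP counts are assumed
# common values and may differ on specific boards.

def mem_bandwidth_gb_s(effective_clock_mhz, bus_width_bits):
    # peak bandwidth = transfers per second * bytes moved per transfer
    return effective_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

def pixel_fillrate_gpix_s(core_clock_mhz, pixels_per_clock):
    # peak fillrate = core clock * pixels written per clock
    return core_clock_mhz * 1e6 * pixels_per_clock / 1e9

fx5200_bw = mem_bandwidth_gb_s(400, 128)      # ~6.4 GB/s
gt8800_bw = mem_bandwidth_gb_s(1800, 256)     # ~57.6 GB/s
fx5200_fill = pixel_fillrate_gpix_s(250, 4)   # ~1.0 Gpixel/s
gt8800_fill = pixel_fillrate_gpix_s(600, 16)  # ~9.6 Gpixel/s

print(f"memory bandwidth gap: {gt8800_bw / fx5200_bw:.0f}x")     # about 9x
print(f"pixel fillrate gap  : {gt8800_fill / fx5200_fill:.0f}x") # about 10x

Either way you slice it, the raw throughput gap is closer to an order of magnitude than "a little more than double", and that's before counting the shader hardware at all.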

EDIT: sniped. Also

Thanks for pointing that out. I've never really seen that card.
 
The speed gains are major when comparing those two. For instance, an FX 5200 has a 250 MHz core clock, while the 8800 GT has a 600 MHz core clock.

As far as computing power goes, that is pretty much nothing. Checking the specs on my cell phone, it has a clock of 1900 MHz. So basically you're buying the equivalent of 1/3 of a cell phone.

The FX has no separate shader clock listed, while the 8800 GT has a 1.6 GHz one.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814130319

That doesn't show any separate shader clock, much less one that runs at 1.6 GHz.

The FX 5200 cannot support more than 256 MB of memory, which is only DDR1 and runs at a max of 400 MHz. The 8800 GT, on the other hand, uses GDDR3 running at an effective 1800 MHz.

Which is, once again, relatively speaking not a huge improvement. A little more than double, but that is not saying a lot in terms of processing capability.

In other words, there is a major difference. What you're saying is like comparing a 3.0 GHz P4 with a 3.0 GHz C2D/C2Q: sure, the numbers are the same, or similar, but the speed difference is huge.

The speed difference is not "huge." They have barely doubled in the past few years. It's even worse when my little cell phone more than triples its clock. Graphics cards should not be rated by clock or memory speeds, because those numbers tell you very little about the card's actual performance. An 8800 GT would do better than "barely double" the performance of an FX in benchmarks.

As I said, the difference is better support for shaders, graphics technology, physics, and other features.
 