Your current graphics card?

Your graphics chipset?

  • Nvidia Card

    Votes: 27 56.3%
  • ATI Card

    Votes: 20 41.7%
  • Matrox Card

    Votes: 0 0.0%
  • 3dfx Card

    Votes: 0 0.0%
  • S3 Card

    Votes: 0 0.0%
  • Intel Card

    Votes: 0 0.0%
  • Integrated graphics... please state

    Votes: 1 2.1%
  • Other... please state

    Votes: 0 0.0%

  • Total voters
    48

HAND

Armchair Philosopher
Joined
May 18, 2003
Messages
661
Location
School of Esoterica
For a bit of fun, I'm just wondering what graphics chipsets everyone has in their systems... I just need to satisfy the little nerd in me. :) :scan:
 
nvidia GF 6800NU 128MB

It's overkill for civ3, but I like to play FPS games too.
 
Nvidia GeForce 4 MX. I want a better one - I'm starting to get stalling in games, and a faster one would make my unit-making faster.
 
Nvidia Asylum GeForce FX 5200 256 MB

Says on the box that it's a "Asylum Limited Edition!! Over 10% faster than the standard 5200!!"
 
nVidia GeForce, 64 MB

Not that great, but it was good two years ago, when I bought my current machine.
 
Radeon 9700 Pro + Arctic Cooling VGA Silencer for some o/c potential :goodjob:
 
DaEezT said:
Radeon 9700 Pro + Arctic Cooling VGA Silencer for some o/c potential :goodjob:

My Nvidia Geforce 5200 can beat the crap out of your wimpy Radeon any day! :lol:

On a more serious note, is there any real difference between the Radeon and Nvidia cards? If so, what are the main differences between them?
 
256mb Radeon 9600 Pro.

@Strider: There's not a lot of difference between the latest two from what I've been reading.
 
ATI 9800Pro with a Zalman heatsink.
 
Strider said:
On a more serious note, is there any real difference between the Radeon and Nvidia cards? If so, what are the main differences between them?
Yes and no. Just a quick history lesson. Back in the early days of computers, programs were written around a "true picture": if a graphics card did not draw a certain pixel in the defined color, it was not showing what the programmer intended. Basically the program (i.e. the CPU) did all the calculations and fed the result to the graphics card. Back then the graphics card only displayed what it was told.

However, with better technology came more options. Things like anti-aliasing and bump mapping are not done by the program; the graphics card enhances the picture before displaying it. The advantages are faster response times and a reduced load on the CPU, allowing for better AI and "behind the scenes" calculations.

So now programmers do not define the "true picture"; they only define the parameters to be met. For example, if I were to tell two people to draw a red firetruck, their pictures would be different. But would either be inaccurate? Nope. Same thing with video cards (though the differences are very small): they both draw the same thing, but in different ways.

Bottom line: the average user probably would not notice much difference and should buy based on price, customer service, and a little bit of company loyalty. =)
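To picture the "card enhances the image" idea, here's a toy sketch of supersampling anti-aliasing in Python. This is just an illustration of the concept, not how any of these cards actually implement it (real hardware uses dedicated multisampling circuitry): render at a higher resolution, then average each block of samples down to one final pixel, which softens jagged edges.

```python
def downsample(hires, factor):
    """Average factor x factor blocks of a 2D grid of brightness values."""
    h, w = len(hires), len(hires[0])
    out = []
    for y in range(0, h, factor):
        row = []
        for x in range(0, w, factor):
            block = [hires[y + dy][x + dx]
                     for dy in range(factor)
                     for dx in range(factor)]
            row.append(sum(block) / len(block))  # one smoothed output pixel
        out.append(row)
    return out

# A hard diagonal edge rendered at 4x4 "subpixels" per final pixel:
hires = [[1.0 if x > y else 0.0 for x in range(8)] for y in range(8)]
final = downsample(hires, 4)  # 2x2 image; edge pixels land between 0 and 1
```

The pixels the edge passes through come out as intermediate shades instead of a hard 0/1 step, which is exactly the smoothing you see when you turn AA on in a game.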
 
Nvidia Asylum GeForce FX 5200 256 MB
 
nVidia GeForce FX 5200, 128 MB
 
Yet another - nVidia GeForce FX 5200, 128 MB

I don't play shooters so this will do me for another couple of years and it is already long in the tooth.
 
Another GeForce 4 MX. Hey, I have all the games I could possibly want. :)
 
Strider said:
On a more serious note, is there any real difference between the Radeon and Nvidia cards? If so, what are the main differences between them?
CrackedCrystal covered the bases pretty well, but I just thought I would throw in that Nvidia has a history of better support for open source. A lot of Linux users would never even consider an ATI card.

OTOH, to install the proprietary Nvidia drivers, you have to recompile the kernel. To install the proprietary ATI drivers, you just have to compile a module and tell the kernel to use it.

On still another hand (:confused:), unless you really *need* top-of-the-line 3D support in Linux, the open-source community has reverse-engineered basic video drivers, so the default "nv" or "fg" drivers in Linux can handle 2D displays extremely well. (As a matter of fact, using the "fg" drivers, I get framerates of 1800+. When I tried installing a kernel with ATI support already linked in, I only got 1650. :eek: )
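For anyone curious, switching between the open-source and proprietary drivers is usually just a one-line change in the X server config (XF86Config-4 or xorg.conf, depending on your distribution). A rough sketch - the Identifier string and section layout vary from setup to setup:

```
Section "Device"
    Identifier "Graphics Card"
    Driver     "nv"       # open-source Nvidia driver; change to "nvidia"
                          # for the proprietary one once its module is built
EndSection
```

Restart the X server after editing, and check the X log if the screen comes up blank.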
 
Forgive me for bashing, but the worst two cards here are the GeForce 4 MX and the GeForce FX 5200 series.

They are simply crap. The first because it supports neither vertex nor pixel shaders; the second because it is slow and DirectX 9 brings it to its knees. The 256MB doesn't mean sh.. because the core is inadequate to handle anything more than 64MB.

That said, the difference between NVIDIA and ATI cards is insignificant compared to the difference between a good and a bad graphics card.
 
Strider said:
On a more serious note, is there any real difference between the Radeon and Nvidia cards? If so, what are the main differences between them?
Nvidia lost ground with the GF FX series due to its problems running DX9, but has recovered its reputation well with the latest range. The R9x00 was definitely the better option within the previous generation of cards.

With the latest line of cards, the GF 6x00 and RX x00, there is little difference at all performance-wise (provided you are comparing cards with equivalent price tags, of course).

The generally accepted opinion of the two companies is that whilst ATI produces better image quality with their cards, Nvidia drivers are much more stable.

Edit: Also, Nvidia cards give better performance in OpenGL games (eg. Doom 3) whilst ATI cards are better in DirectX (eg. Half Life 2).
 