Question about Quad cores.

If I get a computer with a quad-core processor, but it says it's 1.9 GHz, will it be good for modern gaming? The video card is an AMD Radeon HD 7640G, and it has 6 GB of RAM.
 
The graphics card is more important overall.

Quad/Hex/Whatever cores are definitely good for gaming, though. The problem is that, the way most game software is coded, it doesn't come close to taking full advantage of the multiple cores through things like multithreading.
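Just to make "taking full advantage of the cores" concrete, here's a toy C++ sketch (made-up names, not taken from any actual engine) of what a multithreaded update loop looks like: it splits per-entity work across however many hardware threads the CPU reports, whereas typical single-threaded game code would run the whole loop on one core.

[CODE]
// Toy sketch, not from any real engine: split per-entity update work
// across however many hardware threads the CPU reports. Single-threaded
// game code would just run the whole loop on one core.
#include <algorithm>
#include <cstddef>
#include <cstdio>
#include <functional>
#include <thread>
#include <vector>

static void update_entities(std::vector<float>& pos, std::size_t begin,
                            std::size_t end, float dt) {
    for (std::size_t i = begin; i < end; ++i)
        pos[i] += 1.0f * dt;  // stand-in for real per-entity work
}

int main() {
    std::vector<float> positions(1000000, 0.0f);
    unsigned cores = std::max(1u, std::thread::hardware_concurrency());

    std::vector<std::thread> workers;
    std::size_t chunk = positions.size() / cores;
    for (unsigned c = 0; c < cores; ++c) {
        std::size_t begin = c * chunk;
        std::size_t end = (c + 1 == cores) ? positions.size() : begin + chunk;
        workers.emplace_back(update_entities, std::ref(positions), begin, end, 0.016f);
    }
    for (auto& t : workers) t.join();

    std::printf("Updated %zu entities on %u threads.\n", positions.size(), cores);
    return 0;
}
[/CODE]

That's also why a faster dual core can beat a slower quad in games that never split their work up like this.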
 
Eh, a quad at 1.9 ain't great for gaming.
 
Yeah, but I'm looking for something around $500, and I mainly want to play games like TF2 and other Source games.

Do you think it could run Civ V tolerably?
 
That would depend upon your tolerance :) It would not be good for huge size maps, no, unless you're okay spending minutes between turns.

Most games aren't going to take full advantage of a quad-core, so you would be better off with a faster dual-core processor for the specific task of gaming.
 
Ok, thanks.

I've heard before that I should avoid Intel video cards if I intend to play games. Is that generally true?
 
GHz is a worthless measurement.

A single core of a modern 2GHz CPU is over double the speed of a 3.8GHz Pentium 4.
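As a rough back-of-envelope (the IPC figures here are invented purely for illustration, not measured values), what matters is clock speed multiplied by instructions per clock:

[CODE]
// Back-of-envelope only: useful throughput ~ clock speed x instructions
// per clock (IPC). Both IPC figures below are made up for illustration.
#include <cstdio>

int main() {
    double p4_ghz  = 3.8, p4_ipc  = 0.5;  // hypothetical Pentium 4-class core
    double new_ghz = 2.0, new_ipc = 2.0;  // hypothetical modern core

    std::printf("Pentium 4-class core: ~%.1f billion instructions/s\n", p4_ghz * p4_ipc);
    std::printf("Modern 2 GHz core:    ~%.1f billion instructions/s\n", new_ghz * new_ipc);
    return 0;
}
[/CODE]

With numbers like those, the lower-clocked core still comes out more than twice as fast.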

The GPU in that laptop (and pretty much any laptop) is going to be way more of a limitation than the CPU (i.e. the CPU is totally irrelevant to games in a laptop).

Basically no computer runs Civ 5 tolerably, but it's not fun anyway, so you're not missing anything.
 
It shouldn't have any trouble with Source games, right? That's really what I care about most, and I know Source is built to run on most things.
 
I've heard before that I should avoid Intel video cards if I intend to play games. Is that generally true?
That doesn't make much sense. The graphics chips are made by AMD (ATI) or Nvidia. The graphics cards themselves are pretty much the same, although some are a bit fancier than others with regard to heat dissipation, quality of the other components, and the like.

I would personally recommend Nvidia chips. They seem to have fewer compatibility issues than ATI does.
 
While we're playing Dear Abby, I'd like your opinion on my plans for the big tax return check in February.

My last two computers have been laptops, for mobility. But since I got the Kindle, my current old Gateway hasn't left my desk. Towers seem to have more power and accessories for less cost. I'm thinking of returning to a tower and upgrading to a full tablet for travel.
 
If I get a computer with a quad-core processor, but it says it's 1.9 GHz, will it be good for modern gaming? The video card is an AMD Radeon HD 7640G, and it has 6 GB of RAM.

I am using an i7-860. It claims 2.8 GHz.

I have found that you need to stay on top of extraneous background processes or performance DOES degrade.
 
Intel GPUs are sh*t: they are integrated into the motherboard, they suck up RAM, and they are in general very weak. Be sure to get an Nvidia/ATI card, because otherwise you'd be throwing your money away pointlessly.
 
Yeah, Civ V is going to have slow moments no matter how great a computer you have. I have a beast and it still struggles at times in some situations, like biggish maps.
 
Well then, do you know any of their GPUs that are at least comparable to contemporary mid-to-low-tier ATI/Nvidia cards?

See the previous link I posted; Trinity has about a 20% advantage over the HD 4000.

This doesn't make integrated Intel a bad choice for laptops - laptops are generally a bad choice for gaming.

The only reason I'd still recommend discrete graphics at all at this point is because integrated solutions aren't yet powerful enough to properly drive multiple high-resolution displays for intensive desktop use.
 