Real Gaming Challenge Rematch: Intel vs. AMD

@ 2006/08/02
All games are run at a resolution of 1280x1024, with details turned up high. We wanted to test by running the games the way real gamers do—at a reasonably high resolution with all the eye candy turned on. The vast majority of monitors sold these days are either 17" or 19", and 1280x1024 is almost always the native resolution for these displays. We're using a high-quality, speedy, but affordable graphics card: a GeForce 7900 GT (currently costing less than $300). While we want to play the games the way people expect to be able to—with the graphics options turned up high—we didn't want the graphics card to be the limiting performance factor, so we never enabled anti-aliasing or anisotropic filtering.
Comment from Rutar @ 2006/08/02
except if you've got money to waste

they should have done it like FEAR though, with percentages for several FPS ranges and the minimum FPS
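The FEAR-style reporting the comment refers to can be sketched in a few lines: instead of a single average, record per-frame times and report the minimum FPS plus the share of frames falling into a few FPS bands. The function name, bucket boundaries, and frame times below are all hypothetical, just to illustrate the idea:

```python
def fps_breakdown(frame_times_ms, buckets=((0, 25), (25, 40), (40, float("inf")))):
    """Return (min FPS, % of frames per FPS range) from per-frame times in ms.

    Buckets are half-open FPS ranges [lo, hi); boundaries here are invented.
    """
    fps_values = [1000.0 / t for t in frame_times_ms]
    min_fps = min(fps_values)
    total = len(fps_values)
    shares = {}
    for lo, hi in buckets:
        count = sum(1 for f in fps_values if lo <= f < hi)
        shares[(lo, hi)] = 100.0 * count / total
    return min_fps, shares

# invented frame times for illustration (milliseconds per frame)
times = [10, 12, 40, 18, 22, 50, 16]
min_fps, shares = fps_breakdown(times)
print(f"min FPS: {min_fps:.1f}")
for (lo, hi), pct in shares.items():
    label = f">= {lo}" if hi == float("inf") else f"{lo}-{hi}"
    print(f"{label} FPS: {pct:.0f}% of frames")
```

A breakdown like this exposes stutter (brief dips below playable FPS) that a bare average hides, which is exactly why the commenter prefers it for comparing CPUs.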
Comment from The Senile Doctor @ 2006/08/02
no need to rush into Core 2, even with a 2 1/2 year old FX-53 and a high-end GPU
Comment from Sidney @ 2006/08/02
so, no need to rush into Core 2 if you have an A64 X2 with a high-end graphics card?
Comment from jmke @ 2006/08/02
let's analyse the results below:


Half Life: 113 or 86 FPS... both are playable, you won't notice the difference
Oblivion: 2 FPS difference... not quite worth mentioning
Battlefield 2: both well above playable FPS
Rise of Legends: aha, noticeable difference here, this RTS is CPU dependent
Titan Quest: both well above playable FPS
World of WarCraft: no difference

to sum things up: only in CPU-limited games does the CPU matter. what's new?
Comment from jmke @ 2006/08/02
"real gamers" will enable AA/AF if GPU performance allows it, and it still remains GPU limited with little to no impact from CPU except for the odd game like Oblivion and Rise of Legends