Our test system is based on an Intel Core 2 Duo CPU on an Intel 975X "Bad Axe" motherboard, 2GB of system memory, a fast Western Digital Raptor hard drive and a fresh install of Windows XP SP2. While Windows Vista was released earlier this year and brings DX10, we're waiting for games to actually make good use of the new DX10 features before switching operating systems, as well as for SP1 to work out the kinks. With OEMs asking Microsoft to keep XP around a bit longer, it seems we're not the only ones finding a lack of reasons to "upgrade" to a new OS. Our tests were done with the latest NVIDIA ForceWare drivers available at the time of writing: 162.18 for all game tests except BioShock, which we ran with the 163.44 beta drivers.
The enclosure used for our test is an Ultra Grid.
Intel Test Setup
|CPU ||Intel Core 2 E6400 @ 2.8GHz (from CSMSA)|
|Cooling ||Coolermaster Hyper TX|
|Mainboard ||Intel 975X Bad Axe (modded by Piotke)|
|Memory ||2 * 1GB PC6400 OCZ|
|Other ||Sunbeamtech 3D Storm|
|PSU ||Antec TruePower Trio 650W|
|Storage ||Western Digital 74GB Raptor SATA HDD|
| ||Maxtor 200GB SATA HDD|
We previously did a performance review of the 8500 GT DDR2
and will highlight here only the differences between the DDR2 and DDR3 versions. We compared performance with the Futuremark benchmark suite as well as two games we previously tested on the 8500 GT DDR2: TES: Oblivion and Colin McRae: DIRT. A last-minute addition, the BioShock demo, was run on the DDR3 version only.
In our 8500 GT DDR2 review we found a playable resolution of 800x600 in most games with medium-low detail; only a handful of older games allowed for 1024x768. When the 8500 GT DDR2 was overclocked, slightly higher detail and settings were possible. We will compare both cards at default and overclocked speeds here, which brings us to the next subject.

Overclocking the 8500 GT DDR3
The Leadtek 8500 GT doesn’t only come with DDR3 memory; the GPU clock has been boosted too. The default 450MHz core clock of the DDR2 version is passé: Leadtek has it running at 522MHz, but that’s not the only change. The shader clock has also increased quite a bit; at a 522MHz GPU clock it is already running at 1296MHz. Our previous GeForce 8 overclocking experience showed us that the GPU and shader clocks move in “bumps”, not in 1MHz steps: the shader clock on the Leadtek increases in 54MHz steps, while the GPU was easier this time, allowing smaller 5~10MHz steps. If you plan to push this video card to its maximum speed you’ll have to edit the BIOS to reduce the shader clock, as it was limiting our overclock.
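To illustrate those 54MHz shader “bumps”, here is a small sketch of how a requested shader clock snaps to the granularity we observed. The snapping-down behaviour and the function name are our assumptions; the review only establishes that the shader clock moves in 54MHz steps from the 1296MHz baseline.

```python
BASE_SHADER = 1296  # Leadtek 8500 GT DDR3 default shader clock (MHz)
STEP = 54           # observed granularity of shader clock changes (MHz)

def snap_shader_clock(requested_mhz):
    """Round a requested shader clock down to the nearest 54 MHz bump
    above the 1296 MHz baseline (hypothetical model of the behaviour)."""
    steps = (requested_mhz - BASE_SHADER) // STEP
    return BASE_SHADER + steps * STEP

# Asking for 1850 MHz lands on 1836 MHz, the highest stable bump we reached.
print(snap_shader_clock(1850))
```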
As a reminder, this was our 8500 GT DDR2 overclock:
PNY 8500 GT stock 450/400 (900 Shader) --> 702/470 (1404 Shader) // GPU: +56% / MEM: +16%
At a 702MHz GPU clock the shader runs at 1404MHz, which is already quite high, but the Leadtek 8500 GT DDR3 does better:
Leadtek 8500 GT stock 522/702 (1296 Shader) --> 729/810 (1836 Shader) // GPU: +39% / MEM: +15%
We ran into problems at shader speeds above 1836MHz, which is extremely high for a low-end card.
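For readers who want to check the quoted percentages, the arithmetic is simply the frequency delta over the stock clock; a quick sketch (the helper name is ours):

```python
def oc_gain(stock_mhz, oc_mhz):
    """Percentage frequency gain from stock to overclocked clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# Leadtek 8500 GT DDR3, numbers from this review:
gpu_gain = oc_gain(522, 729)       # ~39.7%, quoted as GPU: +39%
mem_gain = oc_gain(702, 810)       # ~15.4%, quoted as MEM: +15%
shader_gain = oc_gain(1296, 1836)  # ~41.7% on the shader domain
```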
Time to put these cards to the test: the synthetic benchmarks from Futuremark are next ->