Sparkle 8800GTS 512MB Review: G92 Madness Continues

Videocards/VGA Reviews by geoffrey @ 2008-01-28

The G92 based Geforce 8800GT was a very exciting product; unfortunately NVIDIA was short on supply, and the lower-priced ATI HD3850 & HD3870 gained a lot of popularity as a result. The higher-end market still remained untouched; to fill this gap NVIDIA came up with a second wave of 8800GTS video cards. With the GT in mind, could we be dealing with yet another great NVIDIA product? We find out.


Overclocking, scaling & Power consumption

Overclocking

Our Sparkle 8800 GTS 512MB test sample overclocked to 771/1836/1064 MHz, up from the default clock settings of 648/1620/978 MHz. Although each domain could be clocked a bit higher on its own, we had to back off some of the settings because the PC no longer remained stable under heavy 3D load. We then ran 3DMark 2006 again to see what others can expect when overclocking these new GTS cards; the per-domain headroom is summed up below, followed by the results.
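For reference, a minimal sketch of how that headroom works out per clock domain; the clocks are the ones quoted above, the percentages are simple ratios.

```python
# Overclocking headroom per clock domain on our sample (MHz).
stock = {"core": 648, "shader": 1620, "memory": 978}
overclock = {"core": 771, "shader": 1836, "memory": 1064}

for domain in stock:
    gain = (overclock[domain] / stock[domain] - 1) * 100
    print(f"{domain:>6}: {stock[domain]} -> {overclock[domain]} MHz (+{gain:.1f}%)")
# Works out to roughly +19% core, +13% shader and +9% memory.
```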

[Chart: 3DMark 2006 results, stock vs. overclocked (Madshrimps)]


Considering the 19% increase in GPU core clock, the 13% increase in shader clock and the 9% increase in memory clock, we could boost our 3DMark score by less than 5%. My thoughts go out to the impact of the CPU: with a much higher clocked Intel C2D we could possibly see much better results from our overclocked video card. Let's move on to our game test:

[Chart: D.I.R.T. frame rates, stock vs. overclocked (Madshrimps)]


Like I said, D.I.R.T. is one of the better games I've found to test 3D performance, and logically our overclocked 8800GTS 512MB took the performance crown. The downside is that we did not gain all that much: only a 9% higher frame rate, while the difference in minimum frame rate is even smaller. Possibly the memory bandwidth is to blame: while the GPU rendering performance took a big leap forward, we see FPS increments on par with the memory clock increment.
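A rough back-of-the-envelope check of that suspicion, assuming the reference 256-bit GDDR3 bus of the G92-based 8800GTS and double-data-rate signalling (two transfers per clock); the stock and overclocked memory clocks are the ones used in this review.

```python
# Rough peak memory-bandwidth estimate, assuming a 256-bit GDDR3 bus
# and double-data-rate signalling (2 transfers per clock).
BUS_WIDTH_BITS = 256

def bandwidth_gb_s(mem_clock_mhz: float) -> float:
    """Peak theoretical bandwidth in GB/s for a given memory clock."""
    bytes_per_transfer = BUS_WIDTH_BITS / 8      # 32 bytes per transfer
    transfers_per_s = mem_clock_mhz * 1e6 * 2    # DDR: two per clock
    return bytes_per_transfer * transfers_per_s / 1e9

stock, overclocked = 978, 1064
print(f"stock:       {bandwidth_gb_s(stock):.1f} GB/s")
print(f"overclocked: {bandwidth_gb_s(overclocked):.1f} GB/s")
print(f"gain:        {(overclocked / stock - 1) * 100:.1f} %")
# Bandwidth scales linearly with the memory clock, so the ~9% clock bump
# caps any bandwidth-bound FPS gain at roughly that same ~9%.
```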

Performance scaling

To further research the 8800GTS behavior, I overclocked all three VGA clock domains separately and compared the results to the stock numbers we accomplished earlier in this review. I increased each clock by 9% in order to see which domain the D.I.R.T. game engine likes most; the approximate test clocks are sketched below, followed by the results.
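The exact per-domain test clocks are not listed here, so the values in the short sketch below are simply 1.09 times the stock clocks, rounded; treat them as approximations.

```python
# Approximate per-domain clocks for the +9% scaling test (stock x 1.09).
stock = {"core": 648, "shader": 1620, "memory": 978}  # MHz

for domain, clock in stock.items():
    print(f"{domain:>6}: {clock} MHz -> ~{round(clock * 1.09)} MHz")
# core -> ~706 MHz, shader -> ~1766 MHz, memory -> ~1066 MHz
```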

[Chart: D.I.R.T. performance scaling per clock domain (Madshrimps)]


Well, despite my suspicion that the memory bus did not provide enough bandwidth, it seems it's just the opposite: the maximum memory chip overclock did not give us the performance boost seen above. In the end the shader clock increment boosted the frame rate the most, but with results this close the highest combined overclock will most likely give the best performance boost.

Power Consumption

[Image (Madshrimps)]


The system power consumption was measured using a Brennenstuhl PM 230 electrical energy meter. Strictly speaking the power consumption is not measured directly: it is calculated by multiplying the AC voltage measured at the back of the PC's power supply by the current flowing through the mains cable, and by the power factor that occurs when using capacitive or inductive loads on alternating current (AC). In any case, our device is no professional equipment and our results can be off by 5%, if not more, but at least we are not left guessing at the power consumption. Do understand that this is the total system power measured at the back of our PC's power supply; it is not an indication of what to expect from a single computer component. The quick sketch below shows how the meter arrives at its wattage figure; after that, here is what we got for our tested products:
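A minimal sketch of that calculation; the example readings (voltage, current, power factor) are made-up illustration values, and only the formula itself, real power = RMS voltage x RMS current x power factor, follows from the description above.

```python
# How a plug-in energy meter arrives at its wattage figure:
# real power = RMS voltage x RMS current x power factor.
# The readings below are example values, not our actual measurements.

def real_power(v_rms: float, i_rms: float, power_factor: float) -> float:
    """Real (active) power in watts drawn from the mains."""
    return v_rms * i_rms * power_factor

voltage = 230.0   # V, European mains (assumed)
current = 1.2     # A, example reading
pf = 0.95         # example power factor for a PC power supply

watts = real_power(voltage, current, pf)
margin = 0.05     # our meter can be off by roughly 5%, per the text above
print(f"~{watts:.0f} W, i.e. somewhere between "
      f"{watts * (1 - margin):.0f} and {watts * (1 + margin):.0f} W")
```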

[Chart: total system power consumption (Madshrimps)]


While the GT improved on the 8800GTS 320MB not only in performance but also in efficiency, the extra unlocked shaders and increased clock speeds make the new GTS fall slightly behind the other two 8800 video cards. Total system usage is near 270W this way, maybe acceptable for most, but I think it's a bit on the high side keeping in mind the modest performance boost we obtained in-game.

Let's move on, shall we, and have a look at the heating and cooling abilities of the new generation 8800GTS video cards ->
Comment from Massman @ 2008/01/29
Excellent review, Geoffrey
Comment from Subwoofer @ 2008/01/30
niceeeee