Galaxy 9600 GT 512MB OC Video Card Review

Videocards/VGA Reviews by geoffrey @ 2008-02-26

Galaxy is launching a very different 9600 GT: a custom board design, a dual-slot cooler, dual BIOS, and a Windows flash tool make it geared toward the enthusiasts out there. Our sample comes factory overclocked; we compare its performance to a reference 9600 GT video card, as well as an 8800 GT, an 8800 GTS, and AMD's pride, the HD 3870. Read on to find out if this product is the best mainstream card out there!


Introduction


Traditionally, when NVIDIA launches a new family of GeForce cards, they start at the top end, grabbing first place in all the performance benchmarks. With the GeForce 9 series they have deviated from that plan: the first card introduced is the 9600 GT, a mid-range model. Many thought this a questionable move, since the current mainstream market already offers a wide choice of 3D adapters with strong performance/price ratios, but once you try to fit the pieces together everything suddenly seems to make perfect sense. Why? Well, that's what we're here for today, isn't it? I will guide you all the way to the point where, I hope, everybody has a good view of what to expect from this new series of mainstream GeForce cards. Let us first have a look at the state-of-the-art technology NVIDIA added to this new family.



The GeForce 9600 GT belongs to NVIDIA's second generation of DirectX 10 compatible video cards. The card is equipped with a brand new G94 GPU and 512MB of third-generation GDDR memory. Being DX10 compatible, the G94 is designed around the unified shader architecture first seen in the GeForce 8 family. This approach to processor design allows the programmer to run many different kinds of 3D calculations on the same processor; with past-generation video cards the GPU consisted of a few fixed processing groups, each capable of doing only what it was designed to do. Starting from the GeForce 8 series, graphics programmers have more freedom to create their virtual art, but in practice it all boils down to the fact that this new architecture is just great at load balancing.



The downside is that the transistor count increased considerably, and that you need at least half-decent drivers to maximize the capabilities of the video card. Looking back at last year's GeForce 8 series, I'd say NVIDIA did great with their new GPU architecture, so it is logical that they keep building on the road they started with the GeForce 8 family; you don't change a winning design, do you?



What did change is the efficiency of the GPU: the G94 is in fact a tweaked version of the G80, with the anti-aliasing capabilities in particular reworked to increase performance.
Furthermore, compared to its predecessor, the 8600 GT, the GeForce 9600 GT should offer nearly double the performance; it comes with 64 parallel shader processors where the 8600 GT has 'only' 32 enabled. Here is an overview of how the 9600 GT is configured compared with many of the currently available mid-range video cards:



While the new GeForce 9600 GT GPU is clocked the highest, it is not theoretically the fastest due to its 64 stream (shader) processors. The others have either 96 or 112 processors enabled, a 50 to 75 percent increase in shading power. We think the 9600 GT is a good challenger for the 8800 GTS 320MB or the 8800 GS 384MB: even though the G94 has only 64 shaders enabled, its much higher clock rate might let it catch up in pure rendering performance, which leaves the memory and its bus width to decide who's the fastest of the three.
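To put those shader counts in perspective, here is a back-of-the-envelope comparison of theoretical shading throughput (shader count times shader clock). The shader clock figures are assumptions taken from public reference specifications of the era, not measurements from our test bench, and real-world performance depends on memory bandwidth too.

```python
# Rough theoretical shading throughput: shader count x shader clock.
# Clock figures below are assumed reference values, not bench measurements.
cards = {
    "9600 GT":       (64, 1625),   # (shader processors, shader clock in MHz)
    "8800 GS 384MB": (96, 1375),
    "8800 GT":       (112, 1500),
}

base = cards["9600 GT"][0] * cards["9600 GT"][1]
for name, (shaders, clock) in cards.items():
    relative = shaders * clock / base
    print(f"{name}: {shaders} SPs @ {clock} MHz -> {relative:.2f}x the 9600 GT")
```

Even with its clock advantage, the 9600 GT trails on this raw metric; the question the benchmarks will answer is how much of that paper deficit shows up in games.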

The 8800 GT should have no trouble dealing with the 9600 GT; the two cards are a near match on paper, but the higher number of shader processors will certainly have its impact. This also leads us to believe that the 9600 GT is a daunting competitor for the HD 3870 512MB, because ATI's fastest single-core video card somehow failed to be a real threat to NVIDIA's G92-based cards. If you were wondering why ATI lowered their HD 3800 prices last week, well... now you have your answer ;)

Enough technical talking for now, let's head over to the tested sample ->
Comment from geoffrey @ 2008/02/29
During my test period I noticed strange behavior on the 9600 GTs I was testing. Unfortunately, my available time was very limited, so I could not dig too deeply into what was causing the problems on my setup.

The problem was that one of the factory-overclocked cards became unstable when put through heavy 3D rendering tasks; downclocking it solved the problem, so I guessed we were stuck with a bugged card (remember the news regarding the transient voltage fix on 9600 GTs).

Now, nearly one week after launch, there seems to be something sneaky going on with the 9600 series. NVIDIA has changed the way clocks are generated. Previously, a 27MHz crystal was used as reference; that frequency got multiplied and divided until you reached the final clock speed, 650MHz for example. With the GeForce 9600 GT the GPU core frequency is instead derived from the PCIe clock. The downside: my system is always configured with a 110MHz PCIe clock, making the 9600 GT used in our article overclocked by another 10%!

So... let me warn you here: the results in our article were obtained on a system using a 110MHz PCI-Express clock instead of 100MHz. That difference amounts to a 10% overclock on the 9600 GT core clock and will have its impact on total system performance.
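The arithmetic behind that warning is simple. A quick sketch, assuming the core clock scales linearly with the PCIe reference clock (the 650MHz figure is the example clock mentioned above, not necessarily our sample's exact setting):

```python
# If the GPU core clock is derived from the PCIe reference clock,
# raising that reference raises the core clock by the same ratio.
def effective_core_clock(nominal_mhz, pcie_clock_mhz, pcie_nominal_mhz=100):
    """Core clock scaled by the ratio of actual to nominal PCIe clock."""
    return nominal_mhz * pcie_clock_mhz / pcie_nominal_mhz

print(effective_core_clock(650, 100))  # 650.0 MHz, in spec
print(effective_core_clock(650, 110))  # 715.0 MHz, a hidden 10% overclock
```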

This being said, we will fix our testing methodology in upcoming articles. Those who are still puzzled, don't hesitate to ask about what I tried to explain above, but I insist you have a read of our source. Thank you, techPowerUp!

 
