Overclocking results

The information provided on the previous page is needed to understand why the overclocking results are so poor. As we couldn't risk taking the memory temperature below -50°C, the temperature of the core was kept at a safe -25°C. This, however, meant that a core frequency of 950MHz was already pushing it, so to complete the entire series of benchmarks we clocked the graphics card at 930/2023/975 MHz. Despite the relatively low core clock frequency, which didn't matter much anyway, we did break all the records in the (it has to be said, not (yet?) very popular) HWBOT GT 220 hardware category.
Screenshots of the overclocking results. From left to right: 3DMark2001SE, 3DMark03, 3DMark05, 3DMark06, 3DMark Vantage and Aquamark3.
(Click to enlarge)

As I only ran each benchmark once and the LOD wasn't altered (Rivatuner doesn't support the GT220 yet), I'm absolutely confident that these scores will end up outside the top 20 within a couple of months.
Interesting bits: shader clock world record?!

Fairly disappointed by the memory failure below -50°C, I decided to give the card a whirl anyway and, knowing the performance would be poor, tried to find the maximum stable core frequency. What's important about the picture below are the frequencies.
(Click to see more information)

The core frequency increased almost 1-to-1 with the temperature. As indicated on the previous page, due to the memory failing below -100°C this card wasn't stable at 1300MHz. I'm sure that if we could take this card to -160°C or lower, it could run at a 1.4GHz+ core frequency! Not that it would matter much, though, as the card is heavily bottlenecked by the memory frequency.
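The "almost 1-to-1" frequency-versus-temperature claim above can be sketched as a simple linear extrapolation. Note that the reference point and the MHz-per-degree slope below are illustrative assumptions for the sake of the sketch, not measured values from this card:

```python
def extrapolate_core_mhz(temp_c, ref_temp_c=-100.0, ref_mhz=1300.0, mhz_per_deg=1.0):
    """Estimate the maximum stable core frequency at a given temperature.

    Assumes roughly `mhz_per_deg` MHz of extra headroom per degree of
    additional cooling below the reference point (a hedged assumption;
    the real scaling curve flattens out eventually).
    """
    return ref_mhz + (ref_temp_c - temp_c) * mhz_per_deg

# Illustrative: going from -100°C to -160°C at 1 MHz/°C adds 60 MHz.
print(extrapolate_core_mhz(-160.0))  # 1360.0
```

With these assumed numbers, reaching 1.4GHz+ at -160°C would require a slope somewhat steeper than 1 MHz/°C, which is why the estimate above should be read as a rough lower bound rather than a prediction.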
Another very important thing to note is the shader frequency of 3.4GHz, which is the highest I've EVER seen on any Nvidia graphics card, so until proven otherwise, I guess this is a world record. I know this frequency seems quite unbelievable and I'm sure some critics will call this a bugged run. However, the following arguments can be used to defend the legitimacy of this overclock:
1) Three different applications report the working frequency as 3.4GHz.
2) The frequency goes up to 3.4GHz under 3D load and drops back to 2D clocks when idling at the desktop.
3) This type of graphics card hasn't been tested this thoroughly in the past; Nvidia's 40nm is unknown terrain for most of us, thus the lack of precedents is normal.
4) There is definitely an end to this overclock: 3.45GHz was no longer 3D-stable at -90°C and 1.6V.
5) The performance scaling when increasing the shader frequency is in line with expectations based on the scaling graphs on the third page of this article, as shown in the chart below.
6) Last but not least, we have to return to the topic of the temperature-induced memory failure. I assume that the low temperature causes the memory to underclock itself to its 2D-mode clock frequency: the card still works fluently (produces a 3D image), but performance drops significantly. Given that previous observations show the shader frequency's overclockability is linked to the memory frequency, it's very possible that the 3.4GHz shader frequency is also due to the non-performance of the memory. In other words, the high shader frequency is partially caused by the memory coldbug.
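The scaling argument in point 5 can be sketched as a quick sanity check: if the card is partly memory-bottlenecked, the benchmark score should grow more slowly than the shader clock. All numbers and the efficiency factor below are hypothetical placeholders, not figures from this article:

```python
def expected_score(base_score, base_clk_mhz, new_clk_mhz, efficiency=0.5):
    """Project a benchmark score at a higher shader clock.

    `efficiency` (an assumed value between 0 and 1) models how much of the
    clock increase translates into performance; well below 1 when another
    component, such as the memory, is the bottleneck.
    """
    return base_score * (1 + efficiency * (new_clk_mhz / base_clk_mhz - 1))

# Hypothetical example: a 70% shader overclock at 50% scaling efficiency
# yields a 35% higher score, not 70%.
print(expected_score(10000, 2000, 3400))  # 13500.0
```

A measured score falling near such a projection, rather than near the full linear-scaling figure, is consistent with a genuine clock increase on a memory-limited card.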