Nvidia G80 innards exposed

@ 2006/11/02
THIS ARTICLE REVEALS all of the important information regarding the GeForce 8800 series, which is set to be released to the world on November 8th, 2006 in San Jose. We have learned that during its traditional Editor's Day in San Francisco, Nvidia kept to its rules, so "no porn surfing" and "no leaks to the Inquirer" banners were shown. But we have no hard feelings about that. It is up to the companies to either respect millions of our readers, including employees of Nvidia, or... not.
Comment from Kougar @ 2006/11/03
Hm, no more shimmering effect problems... That'll be very nice.

As for being CPU bound... The benchmark should be run on a C2D at 3.6 GHz. If it remains more than marginally CPU bound, I'll be amazed.

For having about 100 million more transistors than a Kentsfield, 140 watts doesn't sound too bad to me?
Comment from jmke @ 2006/11/02
Quote:
G80 is a 681 million transistor chip manufactured by TSMC. Since Graphzilla opted for the traditional approach, it eats up around 140 Watts of power. The rest gets eaten by Nvidia's I/O chip, video memory and the losses in power conversion on the PCB itself.
140W... that's more than that Quad Core Intel CPU released today :/

Quote:
Yes, you've read it correctly. Both GTS and GTX are maxing out the CPUs of today, and even Kentsfield and upcoming 4x4 will not have enough CPU to max out the graphics card – G80 chip just eats up all the processing power that a CPU can provide to them.
first see... then believe... at 1600x1200+ resolutions with HDR+AA/AF thrown in the mix, I can think of a few games which will stress the GPU more than the CPU. But we'll see.

That write-up on TheINQ is truly interesting.
