Nvidia GF100 pulls 280W and is unmanufacturable?

@ 2010/01/18
What is Nvidia going to deliver? As we said earlier at tape-out, the GF100 Fermi is a 23.x * 23.x mm chip; we hear it is within a hair of 550mm^2. This compares quite unfavorably to its main competitor, ATI's Cypress (HD5870), at 334mm^2. ATI gets over 160 chip candidates from a wafer, while Nvidia gets only 104. To make matters worse, the number of defective chips goes up roughly with the square of the die area, meaning Nvidia loses almost three times as many dies to defects as ATI because of the size differential.
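For readers who want to play with the numbers, here is a minimal Python sketch: it pairs the common edge-loss approximation for gross dies per 300mm wafer with a simple Poisson defect model. The defect density D0 is an assumed, purely illustrative value (TSMC does not publish one), so treat the output as a sanity check on the article's figures, not real yield data.

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Gross die candidates per wafer, via the common edge-loss
    approximation (a rough estimate, not TSMC's actual reticle map)."""
    radius = wafer_diameter_mm / 2
    wafer_area = math.pi * radius ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return int(wafer_area / die_area_mm2 - edge_loss)

def poisson_yield(die_area_mm2, defects_per_mm2):
    """Fraction of defect-free dies under a simple Poisson defect model."""
    return math.exp(-defects_per_mm2 * die_area_mm2)

D0 = 0.003  # assumed defect density in defects/mm^2; illustrative only

for name, area in [("GF100", 550), ("Cypress", 334)]:
    gross = dies_per_wafer(area)
    y = poisson_yield(area, D0)
    print(f"{name}: {gross} candidates, {y:.0%} yield, ~{gross * y:.0f} good dies")
```

With these assumptions the model lands near the article's 104 and 160-plus candidate counts, and it shows how quickly yield falls away as die area grows.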

The raw manufacturing cost of each GF100 to Nvidia is more than double that of ATI's Cypress. If the target product with 512 shaders is real, the recently reported 40 percent yield rate does not look attainable. Based on Nvidia's current 40nm product yields, it won't hit half of that, and likely far less.
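To make the cost claim concrete, here is a tiny follow-up sketch. The $5,000 wafer price is a hypothetical placeholder (TSMC's real 40nm pricing is not public), and the yields are the illustrative ones from the model above.

```python
WAFER_COST = 5000.0  # USD per 300mm wafer; assumed placeholder, not TSMC's price

# Gross candidates from the article, illustrative yields from the model above.
chips = {"GF100": (104, 0.19), "Cypress": (160, 0.37)}

for name, (gross, die_yield) in chips.items():
    good_dies = gross * die_yield
    print(f"{name}: ~{good_dies:.0f} good dies, ${WAFER_COST / good_dies:,.0f} per good die")
```

Under these assumed numbers the per-good-die cost gap comes out around 3x, broadly in line with the article's "more than double" claim once yield is folded in.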

Cost aside, the next problem is power. The demo cards at CES were pulling 280W for a single GPU, which is perilously close to the 300W maximum for PCIe cards. Nvidia can choose to break that cap, but then it could not call the cards PCIe compliant, and OEMs really frown on such things. Knowingly selling out-of-spec parts puts a huge liability burden on their shoulders, and OEMs avoid that at all costs.
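The 300W ceiling itself comes from adding up the PCIe power sources: 75W through the x16 slot, 75W per 6-pin connector, and 150W per 8-pin connector. A trivial sketch of that budget arithmetic (connector limits are from the PCI-SIG spec; the 280W draw is the article's CES figure):

```python
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150  # PCI-SIG per-source limits

def pcie_budget(six_pins=0, eight_pins=0):
    """Maximum board power for a card with the given aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

cap = pcie_budget(six_pins=1, eight_pins=1)  # 300 W single-card ceiling
draw = 280  # the article's CES measurement
print(f"{draw} W of {cap} W leaves {cap - draw} W ({(cap - draw) / cap:.0%}) headroom")
```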

Comment from Kougar @ 2010/01/19
Why is this such a worry? G80 also had a huge die area and ran hot, but NVIDIA was able to run with it. The die area on GF100 is huge, but proportionally speaking G80 was bigger!

I can't find the article, but either Anandtech or TechReport extrapolated the die size from the transistor count plus die sizes from past 65nm and 55nm process shrinks. Even BSN says it is smaller than G80. If they could do it with G80, then they can do it with GF100.
Comment from jmke @ 2010/01/19
Whether you like his approach or not is not the real "issue" here; the real issue will be for NVIDIA to produce the GF100 in large enough quantities. If they fail to do so, they will either have to sell their chips at a loss or price them out of reach of most people;

meanwhile ATI can just rest on its laurels, and they might not even have to do that: http://www.madshrimps.be/vbulletin/f...-2010-a-69240/
Comment from leeghoofd @ 2010/01/18
Isn't SemiAccurate run by "Mr. I have to say something very negative about Nvidia today"? Usually he's headed in the right direction, but the specs are miles off... I'm not interested in the how and what, I want to see it perform...
Comment from thorgal @ 2010/01/18
Two possibilities:

a) These guys are the ultimate ATi fanboys and they like to bash nVidia

b) They have some very strong sources and nVidia is in very deep sh*t...