Quote Originally Posted by Husar
It's a 65nm design, unlike the old ones which are 90nm. I also read it got 10,769 points in 3DMark06, which sounds rather weird considering that Tom's Hardware shows the 8800GTX gets less than 6000.
65nm design or not, it appears the single-slot stock cooler design Nvidia chose for the 8800GT might not be terribly effective in systems where ventilation is lacking. This is all rumor, so it means nothing right now.

http://www.techreport.com/discussions.x/13321

Spoiler Alert, click show to read: 
Early GeForce 8800 GTs suffer from overheating?
by Cyril Kowaliski — 9:24 AM on October 4, 2007

Reports regarding the most minute attributes of Nvidia's upcoming GeForce 8800 GT graphics cards have been appearing on rumor sites by the truckload lately. Based on those reports, we know 8800 GTs have a single-slot cooler, a 65nm graphics processor dubbed G92, and that they're expected to launch on October 29 in the $199-249 range.

The latest news is that early 8800 GT cards may be suffering from overheating problems. A new report by The Inquirer says Nvidia sent an "urgent letter" to PC vendors asking them to send in systems for "thermal analysis." The Inq says it heard the same story from several PC vendors, who added that Nvidia gave them roughly a week to comply without providing any further explanations. Since the G92 GPU is expected to be smaller than the G80 found in current GeForce 8800-series cards, The Inq speculates that the culprit is the 8800 GT's cooler, which may not be performing adequately on production hardware.

In related news, Fudzilla has word that Nvidia will retire the GeForce 8800 GTS 320MB once the 8800 GTs launch. The 8800 GTS 320MB should be replaced by an 8800 GT with 512MB of memory, the site says.


Ja, 3DMark scores are always an oddity. You have to pay attention to the system specs, otherwise those numbers could mean anything. Anyway, 3DMark is a synthetic benchmark that has been subject to all kinds of shenanigans by 3D chip makers throughout its history; the most notorious being clever driver workarounds by Nvidia and ATI to make the benchmark score higher on their cards. It's much wiser to pay attention to actual game benchmarks than something like 3DMark. Look at enough actual game benchmarks and you can get a much better idea of what a 3D card is capable of. Remember, you can't 'play' 3DMark, so it's nothing more than a glorified demo.