Page 26 - Conclusion
So there you have it. I could likely rant on for a dozen more pages covering features and technology; the fact is that at this moment (the second update to this article) we are still not finished with this preview... However, I bet you were convinced a few pages ago.
The GeForce 6800 Ultra is a powerhouse for sure, but I have a feeling NVIDIA has something up its sleeve. The fact that this product is clocked at only 400 MHz is surprising, to say the least. I'm pretty confident that in a few months we'll see a 6900 Ultra model at 500 MHz or higher. I know this is speculation, but it's likely that NVIDIA is waiting to see how ATI's next killer product performs so it can react quickly. At the current clock frequency the card already kicks ass, but 400 MHz for a .13 micron product simply is a tad low.
NVIDIA's GeForce 6 series of graphics cards is simply looking fantastic from what I can judge. The Pixel Shader 2.0 versus 3.0 debate is a long one. Do we really need 3.0? Is it indeed a big advantage over 2.0? All valid questions; yet always keep one thing in mind: never slow down technological evolution. Newer is often better, no doubt there.
The performance is really breathtaking, its features are astounding, and remember: this was only an engineering sample with beta drivers, not the final product. This is the first and, to date, only video card that can handle Shader Model 3.0, and with 16 pixel pipelines it surely has raw power along with exactly what it needed: fantastic shading performance, MPEG encoding and decoding at the GPU level, Rotated Grid Antialiasing, 16x Anisotropic Filtering, and so much more that I could ramble on for a long time. The new ForceWare drivers are really awesome; there is one setting in them I simply do not like, though: Trilinear Optimizations is enabled by default, so you get no full Trilinear filtering out of the box. NVIDIA, please change this; this is a high-end card, and only the best for your users is the keyword here. Err on the side of image quality, not speed.
The board's photos show a clean design, the cooling solution is single slot (not on our sample, though) and hey, DUAL DVI output. I'm still a bit bothered by the two Molex power connectors. True, this card consumes a lot of power; how much? Well, expect 15-20 Watts more than the 5950 Ultra. NVIDIA recommended using a 480 Watt PSU. We deliberately used a 350 Watt PSU to see how the card would cope, and we experienced no issues at all; in fact, we ran all our benchmark runs on that PSU.
Heat & active fan - neither is an issue. At idle the GPU core sat at roughly 45 degrees C, while at 100% utilization we noticed a maximum of 62 degrees C at a room temperature of 23 degrees C. The cooling solution fitted right now is not the final one, as retail boards should get a single-slot design; of course, board partners are free to choose their own cooling. We are quite happy with it, though, and it's not at all too noisy. So you won't be reliving the 5800 Ultra...
To sum things up: if you have the money to spend on the best of the best, then hey, this might be the one for sure. And also, don't count ATI out. As always in the high-end game, you'll pay the price for it alright. If you decide to buy this product, then please bear this in mind: PCI-Express is coming very soon, so decide now whether you want to wait before upgrading. No matter what your choice is going to be, the 6800 Ultra will smoke your system, as this card is an FPS monster!
If you would like to chat a little about the new product(s), then by all means do so in our NVIDIA forums.
** This article was updated on the 29th of April with a product photo shoot and benchmarks done in our test lab.
** Another update on the 4th of July - Page 12 - new photos of the 6800 Ultra stripped down.
Update - We posted additional benchmarks done on an Athlon 64 3800+ system with the Radeon X800 Pro and XT and the GeForce 6800 GT and Ultra: http://www.guru3d.com/article/article/136/