Saturday, November 04, 2006

NVIDIA's G80

I was reading up on nVidia's newest release, the "G80" or GeForce 8800GTX/8800GTS, and let me say this baby looks pretty intense. It takes up TWO expansion slots on a motherboard, has a massive fan (always a plus), requires TWO PCI-E power connectors and also has TWO SLI connectors. These dual connectors mean that you can have THREE, yes THREE G80s in your computer, which will, may I say, own really really hard.

The G80 has 768MB of GDDR3 on-board memory on a 384-bit memory bus, with the core clocked at 575MHz and the memory at 900MHz.

Now for some tests. . .
Daily Tech tested the GeForce 8800GTX with
* Intel Core 2 Extreme QX6700
* NVIDIA nForce 650i SLI based motherboard
* 2x1GB PC2-6400
* PowerColor ATI Radeon X1950 XTX
* Western Digital Raptor 150

The 3DMark06 test gave the X1950 XTX a score of 7026, whereas the G80 received an 11200.
-That's 59% higher than AMD/ATI's best card on the market.

On Half-Life 2, set at 4xAA/16xAF and a resolution of 1600x1200, the Radeon X1950 XTX averaged 60.74 FPS whereas the GeForce 8800GTX hit 116.93 FPS. NVIDIA's card runs 92% faster than ATI's, which is pretty hardcore.

But wait, it gets better:
On Quake 4 at 4xAA and a resolution of 1600x1200, the G80 once again outperformed the X1950 XTX by roughly 93%, posting 65.93 FPS as opposed to the X1950 XTX's scrawny 34.23. Roflsauce!

But wait, there's more? (no, not the Nickelodeon magazine commercial which I WILL find and link to. . .)

On Prey at 4xAA/16xAF and at 1600x1200, the G80 was 60% faster than the X1950 XTX, turning in 88.87 FPS as opposed to the 55.53 FPS ATI's card managed.
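For the curious, all of those percentages fall out of one simple ratio. Here's a quick Python sketch (using the exact scores and FPS figures quoted above) that reproduces them:

```python
def pct_uplift(new, old):
    """Percent improvement of `new` over `old`: (new/old - 1) * 100."""
    return (new / old - 1) * 100

# (G80 result, X1950 XTX result) pairs from the benchmarks above
benchmarks = {
    "3DMark06 score":  (11200, 7026),
    "Half-Life 2 FPS": (116.93, 60.74),
    "Quake 4 FPS":     (65.93, 34.23),
    "Prey FPS":        (88.87, 55.53),
}

for name, (g80, x1950) in benchmarks.items():
    print(f"{name}: G80 is {pct_uplift(g80, x1950):.1f}% faster")
```

Note the Half-Life 2 and Quake 4 gaps both actually round to about 93%, so nVidia's lead is, if anything, slightly understated.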

The great thing is that the G80 doesn't just perform amazingly well, it also looks very sleek and stylish (not that it really matters, but nonetheless it is a plus) and would look great in any case.

However, I find it interesting that NVIDIA would release this card now instead of waiting for DirectX 10, because the smart consumer (which very few Americans are) would wait and buy the best card on the market once it is released (which will no doubt be the R600).

But nonetheless, NVIDIA does need to catch up with the competition (which is only ATI), so it needed this release and is hoping, fingers crossed, that many people will buy it.

Considering the performance I've seen, I think nVidia's in for a few very profitable quarters.

1 Comment:

Anonymous said...

Hell, the thing does have a lot of power. The only thing that sucks about it is the fact that it requires an extremely powerful CPU to get the most out of it. Tarnation.

12:08 AM  
