GeForce 8800 Roundup: The Best of the Best

by Josh Venning on 11/13/2006 11:04 AM EST

34 Comments


  • peternelson - Saturday, November 25, 2006 - link


    I'm hearing rumours of an even newer "dual" type card called the 8850gx2.

    AnandTech, can you reveal any news on this?
  • at80eighty - Wednesday, November 22, 2006 - link

    Annual Computernerd Wankfest of 2006 just rolled into town (I'm a wanking nerd btw :-p)

    What I saw in the papers was an ad for an XFX 8800 GTX, but this article doesn't mention its existence (or I missed it).

    Or did you pick the GTS because it was a better deal than the GTX?
  • Modular - Saturday, November 18, 2006 - link

    I was just wondering why there are no charts showing the core temps when the cards were overclocked. I'd be interested to see how much more heat these things crank out at faster speeds. I also heard that they no longer throttle the GPU core when in 2D mode. That seems silly to me, as it's probably a huge reason for the high idle power draw as well as the high idle core temps...
  • dpante1s - Wednesday, November 22, 2006 - link

    It would be very interesting to see a roundup just for the 8800 GTS cards, as I think many users can only afford that one but would like to know which of them is the best for overclocking...
  • crystal clear - Tuesday, November 14, 2006 - link

    More GeForce G80 Series Revealed
    Published on November 13th, 2006

    http://www.ngohq.com/home.php?page=Articles&go...

    G80-200, G80-400, G80-600, G80-850, G80-875

  • AnnonymousCoward - Monday, November 13, 2006 - link

    I believe your overclocking results are horribly flawed and misleading. The max core clock varies with each semiconductor part, so you can't just take 8 cards from different companies and determine which company overclocks the best! They all got different G80 dies.

    Now, cooling could affect the overclock amount. But based on the cooling results, there's no correlation. Look at Sparkle's poor overclock versus its great cooling, as well as EVGA's and MSI's great overclocking versus heat. No correlation.

    At least Page 5 said "Whether the overclocks we reached are due to variability in cores or..." But Page 8 showed more misunderstanding with "The temperature levels of this card under load are even lower than the XFX 8800 GTS by over ten degrees. This is somewhat perplexing considering that our Sparkle Calibre 8800 GTX sample didn't overclock very well compared to the other 8800 GTXs."

    The conclusion should have been "8800 GTXs overclock between 627-659MHz", and don't bold the one in the table from the company that happened to get the best die.
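
    One way to sanity-check the "no correlation" claim would be to run the roundup's load temperatures and maximum stable core clocks through a quick Pearson correlation. A minimal sketch in Python; the numbers below are hypothetical placeholders, not the article's figures:

        # Pearson correlation between load temperature and max stable core clock.
        # Placeholder values only -- substitute the per-card numbers from the roundup.
        from statistics import mean

        def pearson(xs, ys):
            mx, my = mean(xs), mean(ys)
            cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            var_x = sum((x - mx) ** 2 for x in xs)
            var_y = sum((y - my) ** 2 for y in ys)
            return cov / (var_x * var_y) ** 0.5

        load_temps_c = [71, 84, 82, 80, 83]       # hypothetical load temps (C)
        max_core_mhz = [627, 659, 654, 640, 648]  # hypothetical max core clocks (MHz)
        print(f"r = {pearson(load_temps_c, max_core_mhz):.2f}")

    An r near zero would support the point that cooling isn't what separates these overclocks.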
  • shamgar03 - Wednesday, November 15, 2006 - link

    I concur, unless the author can present more evidence?
  • cryptonomicon - Monday, November 13, 2006 - link

    They're valuable to me :(
    Would love it if they ever got included in video card roundups like these...
  • shank15217 - Monday, November 13, 2006 - link

    The new NVIDIA cards are doing great, but just take a look at the older 7 series and compare it to the ATI offering. The ATI X1950 XTX hands the GeForce 7 series its butt. I have a strange feeling the R600 will give NVIDIA a run for its money.
  • xsilver - Monday, November 13, 2006 - link

    I found it interesting that in one of your graphs the overclocked GTS is able to noticeably beat the GTX.
    Would that also be possible at a more sane 1600x1200 resolution?
  • JarredWalton - Monday, November 13, 2006 - link

    It appears Oblivion isn't fully able to use all the SPs at present. The stock 8800 GTX should still have about 17% more potential core performance, although maybe not? If the SPs run at 1.35 GHz, what runs at 575 MHz? Or in the case of the OC'ed GTS, at 654 MHz? It could be they have a similar number of ROPs or some other logic that somehow makes the core clock more important in some cases. Or it could just be that the drivers need more optimizations to make the GTX outperform the GTS in all games. Obviously Oblivion isn't GPU bandwidth limited; beyond that, more testing will need to be done.
  • dcalfine - Monday, November 13, 2006 - link

    What about the Liquid-cooled BFG 8800GTX?
    Any news on that? I'd be interested in seeing how it compares in speed, overclockability, temperature and power consumption.

    Keep up the good work though!
  • shamgar03 - Monday, November 13, 2006 - link

    I ordered one; hopefully it will do well in the overclocking section. I am a bit concerned with the differences in overclocking the cards from different manufacturers. Does anyone know the cause of that? I mean, if two cards are exactly the same as the reference design except for the sticker, you have to wonder if there is a bit of variance in the quality of semiconductor production. Maybe favored distributors get the better cores? Any thoughts on what causes these differences?
  • yyrkoon - Monday, November 13, 2006 - link

    quote:

    If you can't find the EVGA card, then pretty much any of the reference designs will work, and although Sparkle gets extra points for trying something different with its peltier cooler the implementation just didn't work out.


    I assume this text about the Sparkle card is in reference to its inability to overclock? In my opinion, I would rather use this card, or another card that ran equally well (or better) and remained as cool (or cooler). I don't know about you guys, or anyone else, but the thought of a graphics card approaching 90C at load (barring the Sparkle) scares the crap out of me, and if this is a sign of things to come, then I'm not sure what my future options are. Let's not forget about 300+ watts under load...

    Just as the heat and power consumption are an issue (once again, in my opinion), equally disturbing is the brass it takes to charge $650 USD for a first-generation card that obviously needs a lot of work. Yes, it would be nice to own such a card for pumping out graphics better than anything previous; however, I personally would rather pay $650 for something that ran a lot cooler and offered just as much performance, or better.

    Now, to the guy talking about Vista RC2 drivers from NVIDIA... do you really expect someone to keep up on drivers for a "product" that is basically doomed to die a quiet death? "RC2"... release candidate... As far as I'm aware, the last I checked, a lot of the graphics features of Vista in these betas were not even implemented. This means that the drivers between RC2 and release could quite possibly be a good bit different. Personally, I'd rather have NVIDIA work on the finished-product drivers rather than the release candidate drivers any day of the week. Aside from yourself, I hardly think anyone cares if you want to run RC2 until May 2007 (legally).
  • Griswold - Thursday, November 23, 2006 - link

    I fail to see your issue with temperatures. These cards were designed to run safely at these temperatures. Just because the figures are higher than what you've become used to over the years doesn't mean it's bad.
  • RMSistight - Monday, November 13, 2006 - link

    How come the Quad SLI setup was not included on the tests? Quad SLI owners want to know.
  • DigitalFreak - Monday, November 13, 2006 - link

    quote:

    How come the Quad SLI setup was not included on the tests? Quad SLI owners want to know.


    LOL. You really want to see how bad a $1200 setup will get spanked by a single card that costs half as much? You must be a masochist.
  • penga - Monday, November 13, 2006 - link

    Hey, I'm always interested in the most exact wattage number a card uses, and I find it hard to do the math from the given total system power consumption and work out how much the card alone eats. So my idea was: why not use a motherboard with integrated graphics and compare the numbers? Hope you get the idea. What do you think, wouldn't that work?
  • DerekWilson - Monday, November 13, 2006 - link

    The only way to do this would be to place extremely low resistance (but high current) shunt resistors in the power lines AND build a PCIe riser card to measure the power supplied by the motherboard while the system is running at load.

    There isn't a really good way to report the power of just the card any other way -- using an onboard graphics card wouldn't do it because the rest of the system would be using a different amount of power as well (different cards require the system to do different types of work -- a higher-powered graphics card will cause the CPU, memory, and chipset to all work harder and draw more power than a lower-performance card).
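
    For illustration, the math behind the shunt approach is just Ohm's law per supply line: the current is the voltage drop across the shunt divided by its resistance, and card power is that current times the rail voltage, summed over the PCIe slot lines and the external power connectors. A minimal sketch; the shunt value and readings below are assumptions, not measurements:

        # Minimal sketch of the shunt-resistor power math described above.
        # All readings are hypothetical placeholders, not measured values.
        SHUNT_OHMS = 0.005  # assumed 5 milliohm shunt in each supply line

        def rail_power(rail_volts, shunt_drop_volts, shunt_ohms=SHUNT_OHMS):
            """Power on one rail: current through the shunt times rail voltage."""
            current_amps = shunt_drop_volts / shunt_ohms
            return rail_volts * current_amps

        # One reading per supply path feeding the card (slot lines + external connector).
        readings = [
            (12.0, 0.0150),  # 12V through the riser's slot lines
            (3.3, 0.0040),   # 3.3V through the slot
            (12.0, 0.0450),  # 12V through the 6-pin PCIe power connector
        ]

        card_watts = sum(rail_power(v, drop) for v, drop in readings)
        print(f"Estimated card power: {card_watts:.1f} W")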
  • yyrkoon - Monday, November 13, 2006 - link

    Derek, I think he was asking: "why not use an integrated graphics motherboard as a reference system for power consumption tests?"

    However, it should be obvious that this wouldn't be a good idea from a game benchmark perspective, in that it's been my experience that integrated graphics motherboards don't normally perform as well and often use dated technology/components. Although I haven't really paid that much attention to detail, I would assume you guys use the "best" motherboard for gaming benchmarks, and probably use the same board for the rest of your tests.
  • JarredWalton - Monday, November 13, 2006 - link

    Derek already addressed the major problem with measuring GPU power draw on its own. However, given similar performance we can say that the cards are the primary difference in the power testing, so you get GTX cards using 1-6W more power at idle, and the Calibre uses up to 15W more. At load, the power differences cover a 10W spread, with the Calibre using up to 24W more.

    If we were to compare idle power with IGP and an 8800 card, we could reasonably compare how much power the card requires at idle. However, doing so at full load is impossible without some customized hardware, and such a measurement isn't really all that meaningful anyway if the card is going to make the rest of the system draw more power. To that end, we feel the system power draw numbers are about the most useful representation of power requirements. If all other components are kept constant on a testbed, the power differences we show should stay consistent as well. How much less power would an E6400 with one of these cards require? Probably somewhere in the range of 10-15W at most.
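
    As a rough illustration of that idle comparison (with made-up numbers, since an IGP baseline wasn't measured for this article):

        # Rough card idle-power estimate from whole-system wall measurements.
        # Both readings are hypothetical placeholders, not measured values.
        igp_system_idle_w = 115.0  # assumed idle draw with integrated graphics
        gtx_system_idle_w = 180.0  # assumed idle draw with an 8800 GTX installed

        # At idle the rest of the system does roughly the same work either way,
        # so the difference approximates the card's own idle draw. This does not
        # work at load, where the card changes how hard the CPU/memory/chipset work.
        print(f"Approximate card idle power: {gtx_system_idle_w - igp_system_idle_w:.0f} W")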
  • IKeelU - Monday, November 13, 2006 - link

    Nice roundup. One comment about the first page, last paragraph:

    "HDMI outputs are still not very common on PC graphics cards and thus HDCP is supported on each card."

    Maybe I'm misinterpreting, but it sounds like you are saying that HDCP is present *instead* of HDMI. The two are independent of each other. HDMI is the electrical/physical interface, whereas HDCP is the type of DRM with which the information will be encrypted.
  • Josh Venning - Monday, November 13, 2006 - link

    The sentence has been reworked. We meant to say HDCP is supported through DVI on each card. Thanks.
  • TigerFlash - Monday, November 13, 2006 - link

    Does anyone know if the EVGA with ACS3 is what is at retail right now? EVGA's website seems to be the only place that makes the distinction. Everyone else is just selling an "8800 GTX."

    Thanks.
  • Josh Venning - Monday, November 13, 2006 - link

    The ACS3 version of the EVGA 8800 GTX we had for this review is apparently not available yet anywhere, and we couldn't find any info on their website about it. Right now we are only seeing the reference design 8800 GTX for sale from EVGA, but the ACS3 should be out soon. The price for this part may be a bit higher, but our sample has the same clock speeds as the reference part.
  • SithSolo1 - Monday, November 13, 2006 - link

    They have two different heat sinks, so I assume one could tell by looking at the product picture. I know a lot of sites use the product picture from the manufacturer's site, but I think they would use the picture of the ACS3 if that's the one they had. I also assume they would charge a little more for it.
  • imaheadcase - Monday, November 13, 2006 - link

    Would love to see how these cards perform in Vista; even RC2 would be great.

    I know the graphics drivers for NVIDIA are terrible, I mean terrible, in Vista atm, but once they at least get a final driver out for the 8800, an RC2 or Vista final roundup vs. WinXP SP2 would be great :D
  • DerekWilson - Monday, November 13, 2006 - link

    Vista won't be that interesting until we see a DX10 driver from NVIDIA -- which we haven't yet and don't expect for a while. We'll certainly test it and see what happens though.
  • imaheadcase - Monday, November 13, 2006 - link

    Oh, the current beta drivers for the 8800 don't support DX10? Is that what the new DetX drivers I read about NVIDIA working on are for?
  • peternelson - Saturday, November 25, 2006 - link


    I'd be interested to know if the 8800 drivers even support SLI yet? The initial ones I heard of did not.
  • yacoub - Monday, November 13, 2006 - link

    I was surprised the EVGA card took the lead since the MSI had a much higher memory clock. I guess these cards are nowhere near being memory bandwidth limited, so the core clock boost is more important? I'm not sure if that's the right conclusion to make.

    Also lol @ the X1950 XTX's heinous noise and heat levels. ;)

    The power consumption reduction from its die shrink (right?) over the X1900 XTX is nice though.

    Very helpful article. Obvious conclusion: stay away from the Calibre, lol.
  • kalrith - Monday, November 13, 2006 - link

    On page 5, it says that the Calibre "did get a core boost of 31MHz on the core clock". If the stock speed is 575 and the overclock is 631, shouldn't it be a boost of 56MHz?

    Also, on page 5 the Calibre's memory overclock is listed as 1021MHz, and on page 6 it's listed as 914MHz.
  • Josh Venning - Monday, November 13, 2006 - link

    Thanks for pointing out the errors on page 5, they've been fixed.

  • Kyteland - Monday, November 13, 2006 - link

    On page 2: "First we have the (insert exact card name here) from BFG."
