44 Comments
nvdm24 - Sunday, December 19, 2004 - link
Many readers of these tech sites want to know the full capabilities of the cards, yet, sadly, reviewers at AnandTech and every other tech site ignore the video capabilities of video cards. Even in the reviews of the new 6600 AGP, the video aspect has not been tested by any reviewer, despite the problems with the 6800. Never mind the fact that EVERY review of these cards is about the 3D aspect and is nearly identical - run Halo, Doom 3, HL2, etc. and list the performance - yet no tests of DVD movies or the video aspect are conducted, doing a HUGE disservice to readers.
nserra - Thursday, December 16, 2004 - link
I don't understand why, in your previous 6200 review, the X300 wins, loses (Doom 3), or keeps up, but now a much worse 6200 wins over the X300. How the hell did that happen - new nVidia drivers?
nserra - Thursday, December 16, 2004 - link
I don't understand why, in your previous 6200 review, the X300 wins, loses (Doom 3), or keeps up, but now a much worse 6200 wins over the X300. How the hell did that happen - new nVidia drivers?
IntelUser2000 - Thursday, October 14, 2004 - link
Surprisingly, my 865G with Intel Extreme Graphics 2 can run the Doom 3 beta at default settings. It still crashes, but when it runs I get barely playable frame rates - I'd say around 20 at the highest and less than 10 at the lowest. I think the GMA 900 should be much better, but maybe its DX9 support really sucks.
nserra - Wednesday, October 13, 2004 - link
#39 Thanks for the answer, but... don't two cards cost more than one?
And what's the difference between having two 6600GTs vs. one 6800GT, in price and performance?
I think this kind of "edge" could come in the future, like it did with the Voodoo2: the card was getting old and people were getting rid of it, so "some" picked them up cheap just to keep their PCs going as long as they could.
Confusednewbie1552 - Tuesday, October 12, 2004 - link
#30: Everyone wants the 6600GT because they are cheap, and two of them can be put into SLI mode (once nForce4 comes out), which could mean better performance than the X700, and maybe even the X800.
PrinceGaz - Tuesday, October 12, 2004 - link
I'm sure the core of the 6600 will overclock very well, but the memory all depends on the particular chips used and might not have any real headroom. That could be its main problem: it's an 8-pipe, 300MHz core, so there's plenty of power there, but only 128-bit, 500MHz (effective) memory, which is probably what is holding it back. If that's the case, then overclocking the core may not help very much.

It's a pity no attempt to overclock was performed in the review, but then again the results from overclocking cards sent out by the manufacturer are always suspect, as they could have been hand-picked.
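As a rough illustration of that bandwidth argument, here is a back-of-envelope sketch. The plain 6600 figures (8 pipes, 300MHz core, 128-bit 500MHz effective memory) come from the comment above; the 6600 GT figures used for contrast (500MHz core, 1000MHz effective memory) are assumed here and may not be exact.

```python
# Back-of-envelope: peak fillrate vs. memory bandwidth for the plain 6600.
# The 6600 numbers come from the comment above; the 6600 GT numbers are
# assumed purely for contrast and may not be exact.

def fillrate_mpix_s(pipes, core_mhz):
    """Peak pixel fillrate in Mpixels/s."""
    return pipes * core_mhz

def bandwidth_gb_s(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s."""
    return (bus_bits / 8) * effective_mhz / 1000

for name, (pipes, core, bus, mem) in {
    "GeForce 6600":    (8, 300, 128, 500),
    "GeForce 6600 GT": (8, 500, 128, 1000),  # assumed reference specs
}.items():
    print(f"{name}: {fillrate_mpix_s(pipes, core)} Mpix/s, "
          f"{bandwidth_gb_s(bus, mem):.1f} GB/s")

# GeForce 6600: 2400 Mpix/s, 8.0 GB/s
# GeForce 6600 GT: 4000 Mpix/s, 16.0 GB/s
```

On those numbers the plain 6600 has only half the GT's bandwidth feeding the same eight pipes, which is consistent with the suspicion that the memory, rather than the core, is the limiter.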
thebluesgnr - Tuesday, October 12, 2004 - link
" I can't see how the 6200 could have a street-price of $149 (128-bit) and $129 (64-bit). "It's actually $129 for the 128MB 128-bit version and $149 for the 256MB 128-bit version. The 64-bit version (only 128MB) should have an MSRP of $100, according to the Inquirer.
So nVidia has:
$100 6200 128MB 64-bit
$130 6200 128MB 128-bit
$150 6200 256MB 128-bit
$150 6600 128MB 128-bit
$200 6600GT 128MB 128-bit
In my opinion ATI beats all the nVidia cards except at the $200 price point, where the 6600GT wins. But we can't forget the 6600 has great overclocking potential, and street prices should be lower than the X700's because of the slower memory.
As already mentioned, you can find the 6600 for $135 already.
mkruer - Tuesday, October 12, 2004 - link
To X700 XT or to 9800 Pro, that is the question.
neo229 - Tuesday, October 12, 2004 - link
I also wish to thank you for keeping up the fight to unravel the mystery behind the elusive video processor. The notion of that feature really got me excited when I first heard about it, yet site after site after site reviewed these cards without even touching on the subject.
PrinceGaz - Tuesday, October 12, 2004 - link
I'm assuming the 6200 you tested was a 128-bit version? You don't seem to mention it at all in the review, but I doubt nVidia would send you a 64-bit model unless they wanted to do badly in the benchmarks :)

I don't think the X700 has appeared in an AT review before, only the X700 XT. Did you underclock your XT, or have you got hold of a standard X700? I trust those X700 results aren't from the X700 XT at full speed! :)
As #11 and #12 mentioned, with the exception of Doom 3, the X600 Pro is faster than the 6200 (figures are X600 Pro fps then 6200 fps, with the X600 Pro's relative advantage in brackets):
Doom 3 - 39.3 60.1 (-35%)
HL2 Stress Test - 91 76 (+20%)
SW Battlefront - 45 33 (+36%)
Sims 2 - 33.9 32.2 (+5%)
UT2004 (1024x768) - 46.3 37 (+25%) [they were CPU limited at lower resolutions]
BF Vietnam - 81 77 (+5%)
Halo - 45.2 44 (+3%)
Far Cry - 74.7 60.6 (+23%)
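Those bracketed deltas can be reproduced directly from the paired frame rates; a minimal sketch of the arithmetic, with the figures simply re-typed from the list above (first number X600 Pro, second the 6200):

```python
# Reproduce the bracketed deltas: (X600 Pro fps / 6200 fps) - 1, as a percentage.
# Figures are re-typed from the list above.
results = {
    "Doom 3":          (39.3, 60.1),
    "HL2 Stress Test": (91,   76),
    "SW Battlefront":  (45,   33),
    "Sims 2":          (33.9, 32.2),
    "UT2004":          (46.3, 37),
    "BF Vietnam":      (81,   77),
    "Halo":            (45.2, 44),
    "Far Cry":         (74.7, 60.6),
}

for game, (x600_pro, gf6200) in results.items():
    print(f"{game}: {(x600_pro / gf6200 - 1) * 100:+.0f}%")
```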
So the X600 Pro is slower than the 6200 (128-bit) in Doom 3 by a significant amount, but it's marginally faster in three games, and it's significantly faster than the 6200 in the other three games as well as the HL2 Stress Test. That makes the X600 Pro the better card.
The X700 absolutely thrashed even the 6600, let alone the 6200, in every game except, of course, Doom 3, where the 6600 was faster, and Halo, where the X700 was a bit faster than the 6600 but not by such a large margin.
Given the prices of the ATI cards - X300SE ($75), X300 ($100), X600 Pro ($130), X700 (MSRP $149) - the 6600 is going to have to be priced under its MSRP of $149 because of the far superior X700 at the same price point. Let's say a maximum of $130 for the 6600.
If that's the case, I can't see how the 6200 could have a street-price of $149 (128-bit) and $129 (64-bit). How can the 6200 (128-bit) even have the same price as the faster 6600 anyway? It's also outperformed by the $130 X600 Pro, which makes a $149 price ridiculous. I think the 6200 will have to be priced more like the X300 and X300SE - $100 and $75 for the 128-bit and 64-bit versions respectively - if it is to be successful.
Maybe most 6200s will end up being cheap 64-bit cards that are sold to people who aren't really bothered about gaming, or who mistakenly believe the amount of memory is the most important factor. You just have to look at how many 64-bit FX5200s are sold.
Shinei - Tuesday, October 12, 2004 - link
The P.T. Barnum theory, wilburpan. There's a sucker born every minute, and if they're willing to drop $60 for a 64-bit version of a card when they could have had a 128-bit version, so much the better for profits. The FX5200 continues to be one of the best-selling AGP cards on the market, despite the fact that it's worse than a Ti4200 at playing games, let alone DX9 games.
wilburpan - Tuesday, October 12, 2004 - link
"The first thing to notice here is that the 6200 supports either a 64-bit or 128-bit memory bus, and as far as NVIDIA is concerned, they are not going to be distinguishing cards equipped with either a 64-bit or 128-bit memory configuration."This really bothers me a lot. If I knew there were two versions of this card, I definitely would want to know which version I was buying.
What would be the rationale for such a policy?
wilburpan - Tuesday, October 12, 2004 - link
nserra - Tuesday, October 12, 2004 - link
Why do you all keep talking about buying the GeForce 6600 cards when the X700 was the clear winner?

You all want to buy the worse-performing card? I don't understand.
Why doesn't AnandTech use 3DMark05?
No doubt my 9700 was a magnificent buy almost 2 years ago. What a cheat the GeForce FX line of cards was...
Why didn't they use one (a 5600/5700) just to see...
Even 4-pipeline ATI cards can keep up with 8-pipe nVidia ones - gee, what a mess... old tech, yeah right.
coldpower27 - Tuesday, October 12, 2004 - link
I am very happy you included The Sims 2 in your benchmark suite :)

I think this game likes the number of vertex processors on the X700, plus its advantage in fillrate and memory bandwidth. Could you please test The Sims 2 on the high-end cards from both vendors when you can? :P
jediknight - Tuesday, October 12, 2004 - link
What I'm wondering is... how do previous-generation top-of-the-line cards stack up against current-gen mainstream cards?
AnonymouseUser - Tuesday, October 12, 2004 - link
Saist, you are an idiot.

"OpenGL was never really big on ATi's list of supported APIs... However, adding in Doom 3 and the requirement of OGL on non-Windows-based systems, OGL is at least as important to ATi now as DirectX."
Quake 3, RtCW, HL, CS, CoD, SW:KotOR, Serious Sam (1&2), Painkiller, etc, etc, etc, etc, are OpenGL games. Why would they ONLY NOW want to optimize for OpenGL?
Avalon - Monday, October 11, 2004 - link
Nice review on the budget sector. It's good to see a review from you again, Anand :)
Bonesdad - Monday, October 11, 2004 - link
Affordable gaming??? Not until the 6600GT AGPs come out... affordable does not mean replacing your mobo, CPU and video card...
Saist - Monday, October 11, 2004 - link
xsilver: I think it's because ATi has generally cared more about optimizing for DirectX, and more recently just optimizing for API. OpenGL was never really big on ATi's list of supported APIs... However, adding in Doom 3 and the requirement of OGL on non-Windows-based systems, OGL is at least as important to ATi now as DirectX. How long it will take to convert that priority into performance is unknown.

Also, keep this in mind: nVidia specifically built the GeForce mark-itecture from the ground up to power John Carmack's 3D dream. nVidia has specifically stated they create their cards based on what Carmack says. Whether that is right or wrong I will leave up to you to decide, but it does go a long way toward explaining the disparity between id games and other games, even under OGL.
xsilver - Monday, October 11, 2004 - link
Just a conspiracy theory -- do the NV cards only perform well on the most popular/publicised games, whereas the ATI cards excel due to better-written drivers / better hardware?

Or is the FRAPS testing biased toward ATI for some reason?
Cygni - Monday, October 11, 2004 - link
What do you mean "who has the right games"? If you want to play Doom 3, look at the Doom 3 graphs. If you want to play Far Cry, look at the Far Cry graphs. If you want to play CoH, Madden, or Thunder 04, look at HardOCP's graphs. Every game is going to handle the cards differently. I really don't see anything wrong with AnandTech's current group of testing programs.

And Newegg now has 5 6600 non-GTs in stock, ranging in price from $148 to $175. But remember that it takes time to test and review these cards. When Anand went to get a 6600, it's very likely that was the only card he could find. I know I couldn't find one at all a week ago.
T8000 - Monday, October 11, 2004 - link
Check this, an XFX 6600 in stock for just $143: http://www.gameve.com/gve/Store/ProductDetails.asp...
Furthermore, the games you pick for a review make a large difference to the conclusion. Because of that, HardOCP has the 6200 outperforming the X600 by a small margin. So, I would like to know who has the right games.
And #2:
The X700/X800 is similar enough to the 9800 to compare them on pipelines and clock speeds. Based on that, the X700 should perform about the same.
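For what it's worth, here is a rough sketch of that pipelines-and-clocks comparison. The 9800 Pro figures are its standard specs, while the plain X700's clocks (roughly 400MHz core and 700MHz effective memory on a 128-bit bus) are assumed here and may vary by board:

```python
# Rough pipelines/clocks comparison of the 9800 Pro and the plain X700.
# 9800 Pro figures are standard specs; the X700 clocks are assumed
# (~400MHz core, ~700MHz effective memory) and may differ by board.

def peak(pipes, core_mhz, bus_bits, mem_eff_mhz):
    fill = pipes * core_mhz                           # Mpixels/s
    bandwidth = (bus_bits / 8) * mem_eff_mhz / 1000   # GB/s
    return fill, bandwidth

for name, specs in {
    "Radeon 9800 Pro": (8, 380, 256, 680),
    "Radeon X700":     (8, 400, 128, 700),  # assumed clocks
}.items():
    fill, bw = peak(*specs)
    print(f"{name}: {fill} Mpix/s, {bw:.1f} GB/s")

# Radeon 9800 Pro: 3040 Mpix/s, 21.8 GB/s
# Radeon X700: 3200 Mpix/s, 11.2 GB/s
```

On those figures the fillrates do come out about the same, though the 9800 Pro's 256-bit bus gives it roughly twice the bandwidth, so "about the same" would only hold where bandwidth isn't the limit.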
Anand Lal Shimpi - Monday, October 11, 2004 - link
Thanks for the responses, here are some answers in no specific order:

1) The X300 was omitted from the Video Stress Test benchmark because CS: Source was released before we could finish testing the X300, no longer giving us access to the beta. We will run the cards on the final version of CS: Source in future reviews.
2) I apologize for the confusing conclusion; that statement was meant to follow the line before it about the X300. I've made the appropriate changes.
3) No prob in regard to the Video Processor - I've literally been asking about this thing every week since May. I will get the full story one way or another.
4) I am working on answering some of your questions about comparing other cards to what we've seen here. Don't worry, the comparisons are coming...
Take care,
Anand
friedrice - Monday, October 11, 2004 - link
Here's my question: what is better, a GeForce 6800 or a GeForce 6600 GT? I wish there was a GeForce round-up somewhere. And I saw some benchmarks that showed SLI does indeed work, but those only used 3DMark - does anyone know if there are any actual game tests of SLI out yet?

Also, to address another issue some of you have brought up: this new line of cards beats the 9800 Pro by a huge amount, but it's not worth the upgrade. Stick with what you have until it no longer works, and right now a 9800 Pro works just fine. Of course, if you do need a new graphics card, the 6600 GT seems the way to go - if you can find someone that sells them.
Oh, and to address the pricing: nVidia only offers suggested retail prices. Vendors can up the price on new parts so that they can still sell the inventory they have of older cards. In the next couple of months we should see these new graphics cards drop to MSRP.
ViRGE - Monday, October 11, 2004 - link
#10, because it's still an MP game at the core. The AI is as dumb as rocks, and is there for the console users. Most PC users will be playing this online, not alone in SP mode.
rbV5 - Monday, October 11, 2004 - link
Thanks for the tidbit on the 6800's PVP. I'd like to see AnandTech take on a video card round-up aimed at video processing and what these cards are actually capable of. It would fit in nicely with the media software/hardware Andrew's been looking at, and let users know what to actually expect from their hardware.
Buyxtremegear has the GeForce 6600 from Leadtek for $135. Gameve has 3 different cards (Sparkle, XFX, Leadtek), all under $150 for the 128MB version.

#1,
they're probably talking about the power consumption under full load.
Sunbird - Monday, October 11, 2004 - link
All I hope is that the 128-bit and 64-bit versions have some easy way of being distinguished from each other.
Sunbird - Monday, October 11, 2004 - link
bpt8056 - Monday, October 11, 2004 - link
Anand, thanks so much for updating us on the PVP feature in the NV40. I think it's high time somebody held nVidia accountable for a "broken" feature. Do you know if the PVP is working in the PCI Express version (NV45)? Any information you can get would be great. Thanks Anand!
mczak - Monday, October 11, 2004 - link
That's an odd conclusion... "In most cases, the GeForce 6200 does significantly outperform the X300 and X600 Pro, its target competitors from ATI."

But looking at the results, the X600 Pro is _faster_ in 5 of 8 benchmarks (sometimes significantly), 2 are a draw, and it is only slower in 1 (Doom III, by a significant margin). Not to disregard Doom III, but if you base your conclusion entirely on that one game, why do you even bother with the other titles?
I just can't see why that alone justifies "...overall, the 6200 takes the crown".
There are some other odd comments as well, for instance on the Star Wars Battlefront performance: "The X300SE is basically too slow to play this game. There's nothing more to it. The X300 doesn't make it much better either." Compared to the 6200, which gets "An OK performer;..." but is actually (very slightly) slower than the X300?
gordon151 - Monday, October 11, 2004 - link
"In most cases, the GeForce 6200 does significantly outperform the X300 and X600 Pro, its target competitors from ATI."Eh, am I missing something or wasnt it the X600 Pro the card that significantly outperformed the 6200 in almost all areas with the exception of Doom3.
dragonic - Monday, October 11, 2004 - link
#6 Why would they drop it because the multiplayer framerate is locked? They benchmark using single player, not multiplayer.
DAPUNISHER - Monday, October 11, 2004 - link
Thanks Anand! I've been on about the PVP problems with the NV40 for months now, and have become increasingly frustrated with the lack of information and/or progress from nV. Now that a major site is pursuing this with vigor, I can at least take comfort in the knowledge that answers will be forthcoming one way or another!

Again, thanks for making this issue a priority and emphatically stating you will get more information for us. It's nV vs Anand, so "Rumble young man! Rumble!" :-)
AlphaFox - Monday, October 11, 2004 - link
If you ask me, all these low-end cards are stupid if you have a PCIe motherboard... who the heck would get one of these crappy cards after spending all that money on a brand new PCIe computer??? These cards would be perfect for AGP, as that is now going to start to be the lower end...
ROcHE - Monday, October 11, 2004 - link
How would a 9800 Pro do against these cards?
ViRGE - Monday, October 11, 2004 - link
Unless LucasArts changes something, Anand, you may want to drop the Battlefront test. In multiplayer, the framerate is locked to the tick rate (usually 20 FPS), so its performance is nearly irrelevant.

PS: #1, he's talking about the full load graph, not the idle graph.
teng029 - Monday, October 11, 2004 - link
"For example, the GeForce 6600 is supposed to have a street price of $149, but currently, it's selling for closer to $170. So, as the pricing changes, so does our recommendation."i have yet to see the 6600 anywhere. pricewatch only lists two aopen cards (both well over 200.00) it and newegg doesn't carry it. i'm curious as to where he got the 170.00 street price.
MemberSince97 - Monday, October 11, 2004 - link
OT, but I wonder about the outcome for us 6800 owners and the VP... nVidia screamed about this new feature to us, and I bought it. Will this end in a class action, or perhaps some kind of voucher for people who bought the 6800 specifically for this highly touted feature?
Lonyo - Monday, October 11, 2004 - link
Why is there no X300 in the CS: Source stress test? It seems oddly missing, and with no comment as to why...
projecteda - Monday, October 11, 2004 - link
x700 > 9800 Pro?NesuD - Monday, October 11, 2004 - link
There is some kind of error concerning your max power graph and this statement: "other than the integrated graphics solution, the 6200 is the lowest power card here - drawing even less power than the X300,"
The graph clearly shows the 6200 drawing 117 watts, while the X300 is shown drawing 110 watts. Just thought I would point that out.