50 Comments
Ballistics - Sunday, September 5, 2004 - link
I guess I was just expecting more from AnandTech, that's all.
Tired of buyers' guides that are biased toward certain manufacturers. For example, the FX5900XT was released in December 2003 and has proven itself to be an awesome price/performance card. It was a sub-$200 card that ran with the $300+ cards of the time. Yet ATI, ATI, ATI was all that was touted. How come nobody picked up on that little powerhouse of a card that finally gave nVidia fans something to get excited about? There weren't even any good articles on that card until February!
I used to come to AnandTech with confidence that what I was reading was the newest "unbiased" hardware analysis. I can't do that anymore. That's all.
AtaStrumf - Saturday, August 28, 2004 - link
False alarm :(
suryad - Saturday, August 28, 2004 - link
Hey guys, forgive my lack of knowledge, but does anyone know if the final product, Half-Life 2 when released that is, will have the capability to use SSE/3DNow! instructions on the CPU? Would it also take advantage of Hyper-Threading? That would probably lead to higher framerates, wouldn't it? I raise this question because I read at the S.T.A.L.K.E.R. game website that the X-Ray engine is supposedly capable of taking advantage of not only the GPU (duh) but the type of CPU the computer has as well. Any comments? Thanks guys. As for the article, I am kind of leaning towards the people who say that the games should be benched on high-end, middle and lower-end spec machines. I agree completely; it would give a better idea to most people. But since this is just a beta, and I am sure that most people are interested in knowing about HL2 instead of CS, IMHO I can't really blame AT for not making the article the way people want it. I am sure when they get the final released copy of HL2 all our questions will be answered. Thanks everyone.
AtaStrumf - Saturday, August 28, 2004 - link
It appears that Gabe said that HL2 will be going gold on Monday, August 30th.
http://www.hl2fallout.com/forums/index.php?showtop...
robbase29a - Saturday, August 28, 2004 - link
One more thing, if I may... Some of you guys have it all wrong. Yes, AT is a hardware site... that's a given. But something that (some of) you people aren't getting is that hardware just doesn't stand alone. People don't just buy the newest nvidia card because of its awesome architecture. Nor do people buy $800 CPUs because of their sweet pipelines, right? Hardware is used to run software... duh.
So, what I'm getting at: AT is using this game to glean information about all the available hardware there is. CPUs, graphics cards, and maybe RAM too. You need to know how your current card measures up before you upgrade, right? That is why AT is going to write a comprehensive review of ALL (or it may be safer to say most) of the graphics cards out there (CPUs too), not just the new ones. So let's stop this silly griping and wait for them to do their thing. Go AT! - Message posted with good intentions, not to hurt anyone's feelings.
Tobyus - Friday, August 27, 2004 - link
Ok, thanks Derek. That probably explains the difference. I am just amazed that my $170 processor can outperform an overclocked $800 processor. I guess the optimizations really make a big difference.
Phiro - Friday, August 27, 2004 - link
Jalf said it all :)
Jalf - Friday, August 27, 2004 - link
Well, as someone said earlier, AT happens to be a hardware site, not a gaming site. It's a lot more relevant for them to benchmark new cards in an interesting game than in three-year-old ones.
Presumably they're more interested in which of the new cards actually performs best than in how old hardware handles HL2.
Makes sense to me, and doesn't bother me in the least. I read AT to learn about new hardware and to know what I should upgrade to, not to find out whether my current system can run games. I use actual game reviews for that.
Gugax - Friday, August 27, 2004 - link
We all need to chill out a little bit. I am sure AT will do a complete review as soon as they have a final HL2 copy to benchmark.
My suggestion is to have a bang-for-the-buck report, better yet one that includes both Doom 3 and HL2 results combined.
Most people prefer one of the games, but sooner or later everybody will play games based on both engines. And if they are using these benchmarks to help them decide the best way to go, this should help them.
And yes, not everybody is lucky enough to have an $800 CPU... (although you used it to take the CPU out of the equation, I know). :)
robbase29a - Friday, August 27, 2004 - link
I think everybody needs to chill about AT not including the midrange video cards. I would also have liked to see them, but we have to keep in mind that this is NOT a real game. This is just a preliminary test of a test world. I'm sure AT will come out with a full-blown (midrange included) review when the real thing comes out. If everyone just exercised a little bit of patience, we wouldn't have such hot heads floating around.
DerekWilson - Friday, August 27, 2004 - link
we run with default configuration -- trilin opt on, aniso opt off ... this probably accounts for the issues.
there isn't a config that you can set to make nv and ati do the exact same thing, unfortunately. also, most people run default settings when it comes to opts (AFAIK).
Tobyus - Thursday, August 26, 2004 - link
Derek, I may have missed it in the article, but did you say whether or not you enabled the Trilinear and Anisotropic Optimizations? Also, I didn't see whether you ran with vsync off, but I am guessing you did, since that causes about a 10 fps performance loss on my system.
I ran the benchmarks with this system (yes, they are beta drivers, and I had the Anisotropic and Trilinear optimizations enabled; but I also ran the test with the 61.77 drivers at 1600x1200 with 4xAA, 8xAF and the highest detail settings including Water: Reflect all, and I was getting 52 fps):
Athlon 64 3000+
MSI K8T800
1GB OCZ PC3200
Geforce 6800 GT
Windows XP Pro SP1
DX9.0c
Forceware 65.62
My tests were all run with the highest settings in the advanced options, except reflect world/reflect all which I will specify in each individual benchmark. These tests were also run with 4xAA and 8xAF.
800x600
Reflect World: 126.88
Reflect All: 114.45
1024x768
Reflect World: 113.25
Reflect All: 102.98
1280x960
Reflect World: 88.95
Reflect All: 83.25
1600x1200
Reflect World: 55.61
Reflect All: 53.21
2048x1536
Reflect World: 31.30
Reflect All: 29.98
I don't understand why I had better performance than your system, Derek. I have nothing overclocked, and the only settings I can think of that I have enabled that you may not have are the optimizations. Is it true that ATI cards default to running with optimizations and they cannot be disabled? If that is true, I would think that it would be fair to enable optimizations on the nvidia cards, and that may show a nice improvement and a closer race between the two brands of cards.
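One rough way to read the resolution numbers above: multiply fps by pixels per frame to get pixel throughput. Where that product stops climbing, the card has gone fill-bound rather than CPU-bound. A minimal sketch of the idea (the fps figures are the Reflect World numbers above; the throughput interpretation is only an approximation, since AA and memory bandwidth muddy it):

    # Rough sanity check: fps * pixels-per-frame ~= pixels drawn per second.
    # Once this product flattens out, the card is fill-bound, not CPU-bound.
    results = {
        (800, 600): 126.88,
        (1024, 768): 113.25,
        (1280, 960): 88.95,
        (1600, 1200): 55.61,
        (2048, 1536): 31.30,
    }
    for (w, h), fps in sorted(results.items()):
        print(f"{w}x{h}: {fps:6.2f} fps -> {fps * w * h / 1e6:5.1f} Mpix/s")

On those figures the product climbs to roughly 109 Mpix/s at 1280x960 and stays near 100 Mpix/s beyond it, which suggests the lower resolutions are CPU-limited while the two highest are GPU-limited.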
SirDude - Thursday, August 26, 2004 - link
"As a student at North Carolina State University, Derek Wilson [B]double majored[/B] in both [B]Electrical and Computer Engineering[/B]. After graduating, Derek brought his extensive Engineering background to work with the AnandTech team. Derek's specializations include [B]compiler theory and design[/B], giving him [B]a unique understanding of microprocessor architecture and optimization[/B]. He has also done [B]extensive work[/B] in the [B]3D field[/B], having [B]designed and implemented[/B] his own [B]3D rendering engine[/B] as well as having done much [B]programming for modern console platforms[/B]. Derek's hands-on experience in the realm of 3D graphics gives him a unique eye in his coverage of the PC graphics industry."#35 you know better then this guy I suppose? Here's an idea for ya', why don't you shut up and go away you [B][U][I]Troll[/I][/U][/B]
thelanx - Thursday, August 26, 2004 - link
Just read the rest of the article; reading between the typos and through to the conclusion, it appears that even with the console commands, the FX series is still running the DX 8.0/8.1 path, even if you try to force the DX 9 path. Thus AT is justified in not including the FX 5950 in their review.
#35, looks like you could benefit from some homework too, perhaps reading that article you posted more carefully. ;) Next time, give constructive criticism but try not to be so harsh. You aren't the only one guilty of harshness; intellectual discussion and debate is great, but many of the discussions on the net would be better with more cool heads. :)
thelanx - Thursday, August 26, 2004 - link
#35 As I recall, and as the article you posted confirms, the 6800 series does not automatically run the benchmark in DX8; it is only the FX series and below.
Ballistics - Thursday, August 26, 2004 - link
If you guys had done your homework before posting this article, you would have been informed. Having done that, you could have accurately informed us.
Don't know a good way to benchmark CS Source? Don't know how to force the hardware to use DX 9.0? Didn't mention that all nVidia cards are forced to use DX 8.1 while ATI trudges away at DX 9.0 and coincidentally falls behind?
Here's a link to the article: http://www.firingsquad.com/hardware/half_life_2_fx...
Educate yourselves.
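For reference, the switches usually cited for pinning Source to a DirectX level are the -dxlevel launch option and the mat_dxlevel console variable (binary name below is illustrative). Whether the VST beta actually honors them on FX hardware is exactly what is in question here, so treat these as the thing to test, not a guaranteed fix:

    hl2.exe -dxlevel 90    (launch option: request the full DX9 path)
    hl2.exe -dxlevel 81    (launch option: force the DX8.1 fallback path)
    mat_dxlevel            (console variable: prints/sets the active DX level)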
yanon - Thursday, August 26, 2004 - link
In the future, Anandtech should do at least two benchmarks -- one for the extreme gamer and one for the average gamer.
Right now, the extreme gamers probably have an AMD FX-53, the Raptor drive, 2GB worth of elite super-overclocked RAM, and a GeForce 6800 GT/Ultra.
The average gamers probably have something close to an AMD XP 2500+, any 7200rpm hard drive with 8MB of cache, 512MB to 1GB worth of value DDR RAM, and a GeForce 5700 / ATI 9600 / ATI 9800.
yanon - Thursday, August 26, 2004 - link
The sentiment is clear. People want to see a benchmark score for a setup that includes an AMD XP 2500+, an ATI 9800 Pro, and 512MB of DDR3200 RAM.
flashbacck - Thursday, August 26, 2004 - link
Can you guys post results for more midrange hardware? Not everyone has a Geforce 6800 XT, Radeon X800 SuperMegaUltraProPlatinumSpecialLimitedEdition or Athlon 64 50000+.
Cygni - Thursday, August 26, 2004 - link
I like the way people are bitching about typos on a site that's offering FREE articles to the public. Jeez.
And oh, I don't really care what OTHER sites are getting on these tests. If you have been around the net, you know the likelihood of AnandTech being wrong is pretty close to nil. This ain't Tom's.
T8000 - Thursday, August 26, 2004 - link
When you look at the CPU scaling, I think you have no need to worry about that unless your CPU runs below 2GHz, like the Athlon 2200+ mentioned above.
Besides, GeForce 6800 cards scale pretty predictably, so a $200 GeForce 6800LE should still get above 50 fps where a 6800GT has 80 (1280x960/4xAA/8xAF).
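A crude way to sanity-check scaling claims like that is to scale a measured result by the ratio of theoretical pixel fillrates (pipelines x core clock). The sketch below uses pipe counts and clocks from memory, so treat the specs as assumptions; real results usually land above this floor, because a run is rarely 100% fill-bound:

    # Scale a measured fps by the ratio of theoretical pixel fillrates.
    # Card specs are assumptions from memory, not verified figures.
    def scaled_fps(ref_fps, ref_pipes, ref_mhz, pipes, mhz):
        return ref_fps * (pipes * mhz) / (ref_pipes * ref_mhz)

    # 6800GT: 16 pipes @ 350MHz, ~80 fps above; 6800LE: 8 pipes @ 300MHz.
    print(round(scaled_fps(80, 16, 350, 8, 300)))  # ~34 fps, worst-case floor

The pure fillrate floor comes out around 34 fps; every millisecond the engine spends CPU- or vertex-limited narrows the gap between the two cards, which is how the LE can land well above that floor.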
saiku - Thursday, August 26, 2004 - link
ummm, can we have a few more mid-range cards in there? How about always benchmarking the cards in AnandTech's mid-range and value buyer's guides? How about hot sellers such as the 9800 Pro?
Most people who read these, IMHO, don't have $500 cards.
FuryVII - Thursday, August 26, 2004 - link
#9, Running at that resolution definitely puts you in the minority. Just as #25 mentioned that an $800 CPU isn't too common, the same goes for people running CS at that resolution.
KristopherKubicki - Thursday, August 26, 2004 - link
bigpow: Again, same as #15. If you want to see the Source engine limited by the CPU, then we could stick an Athlon XP 2200+ in there and all of the cards would get 23 fps. That would not accurately show which card performs better. We are in the business of benchmarking hardware, not video games.
Kristopher
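The underlying model is simple: the frame rate you see is roughly the minimum of what the CPU can feed and what the GPU can draw, so a slow enough CPU flattens every card to the same number. A toy illustration with hypothetical figures (only the 23 fps ceiling comes from the comment above):

    # Toy bottleneck model: reported fps is capped by the slower of the
    # CPU preparing frames and the GPU drawing them.
    cpu_fps_cap = 23.0  # Athlon XP 2200+ ceiling cited above
    gpu_fps = {"6800 Ultra": 95.0, "6800 GT": 80.0, "X800 Pro": 78.0}  # hypothetical
    for card, fps in gpu_fps.items():
        print(f"{card}: {min(cpu_fps_cap, fps):.0f} fps")  # every card prints 23

Swap in a fast enough CPU and the min() is decided by the GPU, so the cards finally separate -- which is why these tests run on an FX-53.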
KristopherKubicki - Thursday, August 26, 2004 - link
#15 this is true of any video benchmarking, whether from AnandTech or not. The idea is to eliminate bottlenecks to show video card performance.
Kristopher
bigpow - Thursday, August 26, 2004 - link
I'm getting sick of reading reviews that rely on an $800 CPU (FX-53).
What percentage of the readers do you think have an Athlon 64 FX-53?
IMO, I'd read, remember & enjoy these reviews more if they were more realistic.
saechaka - Thursday, August 26, 2004 - link
hey araczynski maybe you didn't like cs cause u sucked. hehe
klah - Thursday, August 26, 2004 - link
Any chance we can see some preliminary benchmarks using the beta 66.00 Forceware?
http://www.3dchipset.com/drivers/beta/nvidia/nt5/i...
Early results show a 10-15% bump in VST scores.
araczynski - Thursday, August 26, 2004 - link
...hmmm, am I the only one in the world who doesn't give (and never did) a sh*t about CS?
I love HL and probably will love HL2, but I think I had more fun playing 3D Pac-Man than CS.
BF1942 (Forgotten Hope) is a thing of beauty, however.
DerekWilson - Thursday, August 26, 2004 - link
/me notes the irony of having a typo in a comment about correcting typos ...
We should work on getting editable comments :-)
DerekWilson - Thursday, August 26, 2004 - link
#18 If there are any typos, please point them out and they will be corrected. We have already fixed the problem our first commenter pointed out.
#19 We have Extreme cards from a couple of different manufacturers. We also have Platinum cards from a couple of different manufacturers. We wouldn't still be testing these cards if all we had were NV and ATI reference samples.
Drayvn - Thursday, August 26, 2004 - link
Umm, I don't know why you have that card in there: there are no official Ultra Extremes, only overclocked ones. nVidia has stated that they told the add-on manufacturers they can indeed overclock their cards, but they cannot call it the Ultra Extreme. If you had called it an Ultra OC, that would have been fine, because it seems there will never be an Ultra Extreme.
esun - Thursday, August 26, 2004 - link
Regardless of the quality of the article and benchmarks and whatnot, it seems like there are a lot of typos in this article (just takes away from its credibility and professionalism IMO).
DerekWilson - Thursday, August 26, 2004 - link
#15 Sorry, the video stress test does not run with any sound. It actually does (as much as possible) what it says -- it focuses on video performance.
Jalf - Thursday, August 26, 2004 - link
Shame about the X800 Pro. It would be interesting to be able to compare it to the GT at high res -- to see how much the GT benefits from having all 16 pipelines in the high-res scenario (or how much it loses).
In either case, I disagree with #5.
It doesn't clearly show that NVidia isn't the performance leader.
On the other hand, it shows that NVidia isn't clearly the performance leader. :P
Performance-wise, I'd call it a tie for now. They're both damn fast as far as I'm concerned. ATI are working on improving their Doom 3 performance, and I have a hunch NVidia are going to put some more effort into their HL2 performance now.
Anyway, to those wanting to see a mid-range card, you've got the 4400. You should be able to extrapolate from that.
ir0nw0lf - Thursday, August 26, 2004 - link
Was the sound turned on or off during these tests? There is no mention of it that I could find; perhaps I missed it?
thelanx - Thursday, August 26, 2004 - link
Granted, these aren't real game benchmarks, but we can extrapolate and estimate like the article said. HL2 will probably be more CPU-intensive and less graphically intensive. These benchmarks will cheer up many people, I think. Those with high-end cards will be happy that whatever they chose to buy, it will run HL2 great, and mid-range card owners will be happy that their cards should run HL2 very well. The real game will probably be less graphically demanding but more CPU-intensive, so my 9700 Pro with my 2.5GHz A64 will probably run the game better than the graphics stress test, especially at 10x7, my LCD's native resolution.
Zephyr106 - Thursday, August 26, 2004 - link
I agree completely with #2. Benchmark it on some of the midrange cards -- and a $400 6800GT isn't midrange. Specifically because Valve has said they hope the game will be scalable for slower hardware, and a lot of those 9600 Pro/XT owners have HL2 vouchers, and I'm sure not all have upgraded.
Avalon - Thursday, August 26, 2004 - link
Can't exactly call ATI the performance winner here. The X800 XT PE is often only a few frames better than the 6800 UE, and the GT is often a few frames better than the X800 Pro. Seems closer to a tie than one side actually performing better. Regardless, by using a little observation, it seems like my 9700 Pro will be able to run the game just fine at 10x7, and I might even have room for a bit of eye candy :)
PsharkJF - Thursday, August 26, 2004 - link
Why would you even need to run HL2 at 20x15? lol.
10x7 or 12x10 is fine for me, and it looks like my old GF4 Ti4200 can run it well enough.
Connoisseur - Thursday, August 26, 2004 - link
I'm glad to see that they used the 4.8 Catalyst drivers. I was wondering whether you guys can run Doom 3 benchies with the 4.8s as well. With my laptop (M6805, R9600), I saw an incredible performance gain (between 15-30%) at resolutions up to 1024x768 going from 4.7 to 4.8. I was wondering if this was typical.
blckgrffn - Thursday, August 26, 2004 - link
Have you ever played a game at 20x15? If you have, you know it is awesome. Why would I plop $600 on my Sony 21" if I didn't want to use the higher resolutions? Battlefield looks really good :-) I fully support seeing these resolutions in the future; I was really happy when we finally saw the shift away from 1024x768 on review sites.
deathwalker - Thursday, August 26, 2004 - link
#7... I can't necessarily disagree with you. I own a 6800GT (replaced my 9700 Pro). My point was merely that you can't crown a graphics card line king based solely on its performance in one game. They could do these comparisons all day long and the results would flip-flop back and forth depending on the tool you are using to measure performance.
Bumrush99 - Thursday, August 26, 2004 - link
These numbers are screwed up. The HL2 video stress test IS NOT ACCURATE; nearly every review site has very different results. Don't use this review as your only source of information. On my 6800GT overclocked to Ultra speeds and my AMD 64 @ 2310MHz, I'm getting way lower results. Funny thing is, the first few times I ran the benchmark my results were in line with Anandtech's review...
FuryVII - Thursday, August 26, 2004 - link
#5, I don't think it clearly shows that. For what it's worth, I have owned both nVidia and ATI (ATI currently) and I have no problem buying the best card in price/performance ratio. I'd have to say that nVidia seems to be my choice for a new card. Also, the gap in the benchmarks doesn't show ATI having that great of a lead. It's damned close.
Also, this was rather disappointing. Those resolutions are just ridiculous.
deathwalker - Thursday, August 26, 2004 - link
This test clearly demonstrates that Nvidia is "not" the performance leader when it comes to gaming. Merely being on top of the heap in one game, Doom 3, doesn't crown you the king -- especially when that game uses a graphics engine (OpenGL) that is clearly not the engine of choice over the long haul for future game development. That being said, the new line of 6800 cards from Nvidia are products they should be proud of.
AtaStrumf - Thursday, August 26, 2004 - link
I wish you would include the R9800 Pro (a very good buy right now), since it is not the same as the 9800XT. Just look at the xbit labs article and you will see a surprisingly big difference. Not all 9800 Pros take kindly to OC-ing to 9800XT levels, and many people just don't bother!
And those of you who want to see how other, lower-end cards perform under Source may want to check it out as well.
http://www.xbitlabs.com/articles/video/display/cou...
mlittl3 - Thursday, August 26, 2004 - link
#2
The GF4 4400 is not last-generation hardware; it is two generations old. We now have the GF 6xxx, and before that was the GF FX 5xxx. If anyone is still using a GF4 to play current games, I think they know they will be running at 800x600 with no AF or AA. No one needs a benchmark test to prove that. Besides, anyone who can't afford a last-generation or current-generation video card probably doesn't have an Athlon 64 FX in their rig, so looking at a GF4 4400 with this processor will tell you nothing about how your GF4, probably paired with a first-generation P4 or an Athlon XP, will run this game.
I'm sure when the actual game comes out we will see exactly what we saw with Doom 3 at AnandTech: a huge CPU and GPU roundup with exhaustive tests. So you will get your chance to see the 9600 in action.
DefconFunk - Thursday, August 26, 2004 - link
This review was kind of disappointing.
I would have found it much more useful if they'd included at least some ATI cards other than the top range. A 9600 would have been very much appreciated; same with an FX5700. That way we could have some idea of how the less financially able gamer (read: those of us with financial obligations outside our computer) will be able to play CS: Source / HL2.
I did, however, appreciate the inclusion of the GF4 4400. Knowing how last-gen products run current games is important when thinking about purchasing either a new card or the new game in question.
mikecel79 - Thursday, August 26, 2004 - link
Great article. Found one little mistake:
"Kicking a box or rolling an oil down a hill are fun enough to distract players from the game at hand"
Shouldn't that be "oil drum"?