It's more than just a difference in visuals. By removing some of the visual work, the card runs faster. The Nvidia drivers, for example, do not do true trilinear filtering in DX; they do some faked bilinear instead. That makes the card look better than it really is.
The whining is about how the reviewers missed all this stuff. People are not getting the true story here.
Okay, now I'm new here and all, but DAMN, do some of you whine! You act like any visual differences between the Nvidia cards and the ATI ones (which I can't see at all) are astronomically huge! They're not. This is the first and last time I post here; it looks like half of the people here are fanboys!
I don't understand why the obvious differences in IQ in the Aquamark 3 4xAA/8xAF shots, for example, are totally ignored by the reviewer. Just look at the fuzziness in the plants surrounding the explosion in the nvidia shot.
Here is part of an addendum to the 3DCenter article directly addressing this comparison:
"AnandTech made an extremely extensive article about the performance and image quality of the current high-end graphic cards like Radeon 9800XT and GeForceFX 5950 Ultra (NV38). Beside the game benchmarks with 18 games, the image quality tests made with each of those games are strongly worth to be mentioned. AnandTech uses the Catalyst 3.7 on ATi side and the Detonator 52.14 on the nVidia side to compare the image quality. In contrast to the statements of our youngest driver comparison, AnandTech didn’t notice any general differences of the image quality between the Detonator 52.14 and 45.23 and therefore AnandTech praises the new driver a little into the sky.
This however not even absolutely contradicts itself with our realizations. The nVidia-"optimizations" of the anisotropic filter with texture stages 1 till 7 in Control panel mode (only a 2x anisotropic filter is uses, regardless if there were made higher settings) are only to find with proper searching for it, besides most image quality comparisons by AnandTech were concerned without the anisotropic filter and therefore it’s impossible to find any differences on those pictures. The generally forced "optimization" of the trilinear filter into a pseudo trilinear filter by the Detonator 52.14 is besides nearly not possible to see on fixed images of real games, because the trilinear filter was created in order to prevent nearly only the MIP-Banding which can be seen in motion.
Thus it can be stated that the determined "optimizations" of the Detonator 52.14 won’t be recognized with the view of screenshots, if you do not look for them explicitly (why however AnandTech awards the driver 52.14 a finer filter quality than the driver 51.75 is a mystery for us, then the only difference between them is a correctly working Application mode of the Detonator 52.14). Thus the "optimizations" of nVidia are not to be really seen, whereby there is also a clear exception as for example Tron 2.0 (screenshots will follow). Whether this is now a reason to excuse the "optimizations" of nVidia about it, one can surely argue."
All on-line computer journalists should strive to inform their viewing public like these folks do.
Once again: the 51/52.xx nvidia drivers do *not* apply trilinear filtering in D3D when AF is on. The 51.75, at least, applies trilinear to the first stage (0), though not *AT ALL* to any other stage; the 52 series does not apply trilinear filtering to any stage in D3D, regardless.
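For what it's worth, the stage-by-stage claim above can be written down as a little table-in-code. This is purely a toy model of the behavior being alleged in these comments (driver names and modes taken from the thread, nothing verified against the actual drivers):

```python
# Toy model of the stage-wise trilinear claims made in this thread.
# NOT driver code -- just the alleged policy made explicit, so the
# difference between the 45.23, 51.75 and 52.xx Detonators is clear.

def d3d_filter_mode(driver: str, stage: int, app_requests: str = "trilinear") -> str:
    """Return the filtering allegedly applied for a D3D texture stage
    (an assumption from the forum claims, not verified fact)."""
    if app_requests != "trilinear":
        return app_requests
    if driver == "45.23":
        return "trilinear"               # honest trilinear on all stages
    if driver == "51.75":
        # trilinear only on the base texture (stage 0)
        return "trilinear" if stage == 0 else "bilinear"
    if driver == "52.14":
        return "pseudo-trilinear"        # reduced blend on every stage
    raise ValueError(f"unknown driver {driver}")

for drv in ("45.23", "51.75", "52.14"):
    print(drv, [d3d_filter_mode(drv, s) for s in range(8)])
```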
Bing! Bing! Try again!
May I suggest the filtering tester used by 3DCenter, perhaps a mipmap shading program (as used by everyone in the known universe), and rthdribl to discern *ACTUAL* image quality via high-dynamic-range light source rendering.
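For anyone unfamiliar with the "mipmap shading" trick mentioned here: the idea is to replace each mip level with a distinct flat color, so any blend (or missing blend) between levels becomes obvious on screen. A minimal sketch of generating such a chain, with arbitrary colors of my own choosing (the real testers do the rendering part):

```python
# Sketch of the "mipmap shading" idea: one flat color per mip level.
# Upload the chain as a texture and MIP banding / pseudo-trilinear
# blending becomes visible as hard vs. smooth color transitions.

COLORS = [(255, 0, 0), (0, 255, 0), (0, 0, 255), (255, 255, 0),
          (255, 0, 255), (0, 255, 255), (255, 255, 255), (128, 128, 128)]

def shaded_mip_chain(base_size: int):
    """Return [(size, color), ...] from base_size down to 1x1."""
    chain, size, level = [], base_size, 0
    while size >= 1:
        chain.append((size, COLORS[level % len(COLORS)]))
        size //= 2
        level += 1
    return chain

print(shaded_mip_chain(256))
```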
The only thing I give Anand credit for is allowing us to write freely about his review. I mean, he did not have to let us reply in an open forum.
After reading it, I am not at all surprised at the heat he is taking; I hope he was not either.
The review had potential but was squandered.
Today's cards are all fast enough to run DX8 games. The question is: can they do it with all the goodies turned on?
The main reason to buy an ATI 9600-9800/5900U is to clean up the graphics, but not at the expense of speed. If you don't care about image quality, stick with your GF4 or 8500; they are both horrid vs. the new-gen cards.
An old GeForce 4 kicks butt in many games, so long as you don't have FSAA turned on.
Most people know that the 5x.xx Detonator drivers do reduce image quality in many areas. This is not a driver bug; it's what Nvidia chose to do to keep pace. Image quality is much more subjective than FPS. People are not buying $400 video cards for speed alone.
Anand glossed over/hid the quality issues, the one area where subtle reductions here and there add up to large FPS gains.
People will say: so what, the XT gets recommended in the end, why bitch?
Well, it's the principle. The review made the 5900 seem much closer to the XT than it actually is.
When a driver (a beta one at that) improves speed that much, it deserves a much closer inspection than what Anand gave it.
Someone threw Anand a pass but he dropped the ball :(
I didn't care for this review for the following reasons:
Many comments on IQ in part 1, but no follow-up in part 2. There were so many such comments that they needed to be addressed, even if only to say it turned out to be some wrong setting and they fixed it.
Small, cropped, compressed images were used for IQ comparison. If the image is compressed, how can we judge it? The only way to present IQ comparisons to the reader is to show them the exact images the reviewer saw, without compression or cropping.
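To make the point concrete: judging IQ really means diffing lossless captures. Here's a tiny sketch of the kind of per-pixel comparison that only makes sense on uncompressed images (with JPEGs, the codec's own noise drowns the signal). Pixels are plain (r, g, b) tuples rather than a real image format:

```python
# Per-pixel comparison of two same-sized screenshots. On lossless
# captures, any nonzero difference is a real rendering difference;
# on JPEGs the compression itself introduces differences, so the
# metric becomes meaningless.

def mean_abs_diff(img_a, img_b):
    """Mean per-channel absolute difference between two images given
    as row-major lists of (r, g, b) tuples."""
    if len(img_a) != len(img_b):
        raise ValueError("images must have the same dimensions")
    total = sum(abs(ca - cb)
                for pa, pb in zip(img_a, img_b)
                for ca, cb in zip(pa, pb))
    return total / (len(img_a) * 3)

identical = [(10, 20, 30)] * 4
shifted   = [(13, 20, 30)] * 4   # e.g. codec noise or a filtering change
print(mean_abs_diff(identical, identical))  # 0.0
print(mean_abs_diff(identical, shifted))    # 1.0
```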
Apples to apples. All of the benchmarks for all games should have been done in the same format unless it was impossible to achieve certain settings on a given card at a given resolution. Changing the metric for TR:AOD was a bad idea. Both parts should also have been done on the same system. For all we know the ignored IQ issues from part 1 could have been due to the AGP implementation on the first board. We just don't know.
Gunmetal is also a very poor DX9 benchmark, since it relies on VS 2.0 and PS 1.1 only. Since most of the benefits of DX9, and the controversies for that matter, revolve around PS 2.0 this benchmark is not a good exemplar of DX9 performance. I also find the fact that Gunmetal was co-developed by Nvidia something that needs examination. IHVs have no place in developing benchmarks, they should stick to technology demos.
Now, I don't know if the 52.14 drivers do what the article says they do or not. I know Digit-Life said they gave up to a 20% improvement in some cases, and some improvement is certainly credible. However, this article as written does not support the conclusion that the 52.14 drivers provide significant performance boosts with no IQ loss. I am not commenting on whether they perform as advertised or not, only that you cannot draw that conclusion from the article.
Great job with the review. I'm so happy to finally see benchmarks in more than just FPS's. I'm always curious to see what kind of benefits can be had by upping my video card in RTS games for example. Take a rest, your brain must be fried from all that benching.
One question. I have an LCD monitor and I can't get Generals:Zero Hour to run at 1280x1024. How did you manage to get that resolution for your benches since it is not offered in the game menu?
Anandtech is a total liar. The 52.14 picture quality sucks, and so does my 5900 card. It's garbage like it was before, and the 52.14 drivers are certainly not helping it get better. As for the pictures, I don't know how he even dares to say there is no difference in quality; the quality is definitely a big problem in all the games and programs I tried.
B) D3D filtering suffers from the following "optimizations":
Application mode: 1) True trilinear is never used at all, on any texture stage. It's now all "pseudo-trilinear."
Control Panel mode: 1) The same pseudo-trilinear as above. 2) The proper aniso level is only applied to texture stage 0. No matter what aniso level is selected (2x-8x), only 2x is applied to stages 1-7.
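The Control Panel mode behavior described in 2) can be modeled in a few lines. This is a sketch of the claim as stated above, not real driver code:

```python
# Model of the alleged Control Panel mode "optimization": the selected
# anisotropy level is honored only on texture stage 0, while stages
# 1-7 are silently clamped to 2x.

def effective_aniso(selected: int, stage: int) -> int:
    """Anisotropy actually applied per the claimed 52.14 Control Panel
    behavior (selected is 2, 4, or 8; stages are 0-7)."""
    if stage == 0:
        return selected
    return min(selected, 2)   # stages 1-7 never exceed 2x

print([effective_aniso(8, s) for s in range(8)])  # [8, 2, 2, 2, 2, 2, 2, 2]
```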
Good to see you are reworking the Flight Sim 9 benchmark. Your achieved frame rates in part 1 were so much higher than what real simmers are getting. Many people would be happy to get a reliable 25 FPS out of the game; once there, we can worry about IQ. So you need to push the texture sliders etc. to put a real load on the system in this game. A continuing argument with this game is the importance of CPU/memory relative to the graphics card in achieving acceptable frame rates. Anything in your testing that could shed light on this would be invaluable.
As long as I'm bothering you, I'd like to request Halo numbers with AF for your next review/roundup. AF really spruces up IQ, IMO, and it's a shame (almost pointless) to buy $500 cards and not run at the highest IQ possible. I'd also appreciate comparison pics with AF, as well. Thanks!
Might I suggest you remove the "Tech" part of your site name? There's not much tech anymore. Anyway, this is not about NV vs. ATI; it is NV vs. DX9. NVidia's site still states that the FX series are DX9 cards, which they are not!! Inform the people as you should, not as your NV paymaster is telling you!!
"We were told by NVIDIA that these new drivers would not only improve performance, but that they would be made publicly available the very same week we tested with them. Obviously, that didn't happen, and it ended up taking another month before the drivers were released. The performance gains were tangible, but the drivers weren't fit for release when NVIDIA provided them to the press and honestly shouldn't have been used.
Hindsight being 20/20, we made a promise to ourselves that we would *not allow any further performance enhancing drivers to be used in our video card reviews unless we could make the drivers publicly available to our readers immediately.*"
"If you think that's wrong, then you have a problem."
If Anand had not stated that he would not do benchmarks with unreleased drivers, then you might have a point. Especially if an in-depth examination of the filtering and antialiasing quality had been undertaken. Too bad that didn't happen, despite what was claimed in the introduction.
Making oneself a hypocrite to appease the public's whims by using unavailable software (for one manufacturer, but not another) is a direct shot to your own credibility.
But credibility is no longer important to this site, and many of the other premiere hardware review sites.
Page hits and advertisements pay the bills; do whatever it takes.
This is the most interesting sentence I found in this review: "It just so happens that the default Microsoft compiler generates code that runs faster on ATI's hardware than on NVIDIA's."
So it is M$'s fault that nVidia's card runs slower? How do you know? Add the numerous problems mentioned by other people, and I can't believe this is from a famous hardware review site. I don't think this review will earn nVidia any credit; instead, Anandtech loses credibility for it.
BTW, I don't think a driver update can improve performance by 50%, as the 52.* drivers did, unless the old 45.* drivers were written by a new graduate. There MUST be some trick in the 52.* drivers.
Hey Mr. NostraDUMASS (#93), you want some cheese to go with that whine? NV40 will be 3D king?! Only in a 4D world, dreamer. Looking at the graphs and the numbers pretty much sums it all up - a crushing defeat for you know who. Anand and his crew did a decent job. They even tried to candy-coat it to appease the big N.
nvidia is coming back slowly. Due to the stupid policy and strategy of nvidia's CEO, they have lost their supremacy and reputation in the 3D graphics world, but I'm sure the next gen, NV40, will be the 3D king. nvidia is smarter than ATI. ATI's architecture is built on the brute force of 8 pipelines and 256-bit memory; nvidia's CineFX is much more sophisticated.
You know why he used them? Simple, actually. Because people wanted him to test them, and the public at large wanted to see how they perform. If you think that's wrong, then you have a problem.
I'm highly amused that anyone would flame someone for taking Anand to task for using UNRELEASED DRIVERS WHICH ARE IN NO SENSE AVAILABLE TO THE PUBLIC THROUGH THE MANUFACTURER.
Anand has said on several different occasions in the past that he would NEVER BENCHMARK WITH UNRELEASED DRIVERS.
He lied.
He did not bother to investigate the veracity of the claims nvidia made about image quality, including the total lack of true trilinear filtering. Why not? Nobody knows, and Anand won't say.
There are several very well established methods for determining the true quality of a card's filtering and antialiasing scheme. Anand used none of them. Why? Coincidentally, everyone who has actually bothered to check the quality of the nvidia AF on any driver in the 50/51 or 52 series has found trilinear filtering simply does not work as advertised.
Indeed, far from predicting future DX9 title performance, the new test suite appears to be heavily biased toward legacy DX8/8.1 pixel shading, something the FX architecture is basically built for, and excels at.
The game (TR:AOD) that probably uses the largest number of DX9-class shaders (PS2.0) was not even listed in terms of absolute numbers. Instead, we have a percent decrease!!! Why was this single game treated so differently when it would likely be the best predictor (along with Doom3 in ARB2 generic mode, and HL2) of future shader performance?
Coincidentally, the R3XX architecture dominates this test; absolute frame rates would have been heavily embarrassing for nvidia. Lucky Anand didn't bother to list them! Instead, he spent several paragraphs discussing water that didn't render quite right.
The GeForceFX family are decent cards, with very good performance indeed in DX8-class games. DX9 games with real DX9-class shaders are a different story altogether.
But you won't be getting that story here, and anyone who bothers to bring it up will be labeled an "ATI fanboy".
How sad. How childish. How utterly self-serving.
This article proves quite handily just how much of anand's credibility has been lost. Anand has jumped on the Thomas Pabst bandwagon. Next stop, [H]ardOCP. Toot, Toot!
What happened to your glasses, Anand? Just by looking at it you can see clear differences in the way NV and ATI render Aquamark. And what about the missing FPS in TR:AOD? Instead of writing something like "NVIDIA really gets crushed by ATI's PS2.0 performance," you're saying "It is very clear that the way ATI handles rendering TRAOD's PS2.0 code is more efficient," and you only show some absolutely irrelevant PS2.0 ratios that don't correlate with each card in ANY WAY?!? Okay, so you're practicing for your politician career, aren't you? Honestly, that article feels so biased towards NV it's just not decent anymore. Anandtech was my favourite site when it comes to reviews, but after this I'd rather stick to another one for graphics cards. I really wondered how Anand could be the first site benching a GeForce 5950, and here we have the answer, it seems. Good night.
To 90: perhaps you should be a writer for Anandtech instead, since you seem to be much more critical, although it would appear you don't run a world-renowned hardware site.
I think Anandtech should consider that quantity doesn't equal quality.
If it was only fps, then lots of games is perfect. But we were supposed to be talking IQ in this part. Well... I saw very little actually written about IQ in the article.
On the very first page, I immediately notice big differences in IQ. On the tank I see lots of places where the NVidia drivers don't seem to do any AA. And the details around the explosion are far more blurry. Those things are obvious although I'm looking at a reduced-size JPG! And I see no remark about IQ at all!?
Moving on to F1 Challenge, at 4xAA/8xAF. Or is it? I see absolutely no AA *anywhere* with the NVidia drivers. And it's really, really obvious in these shots!! Again, no comment about IQ at all?
Do I need to go on? Gunmetal also shows AA differences.
Finally, with Homeworld, Anandtech notices too that there's no AA with the NVidia drivers. Luckily, it's a known issue...
Jedi Knight. Well, here NVidia does do some AA, but it's really little compared to ATI's. No IQ difference???
Neverwinter. Look at the pillar to the right and the plants on the left. Again we see clear AA differences; it's like NVidia seems to forget those objects.
But it gets worse...
How on earth can anyone publish these utterly ridiculous Tomb Raider and Wolfenstein screenshots? Come on! I've seen beautiful screenshots of streets in Paris in Tomb Raider, perfect for testing IQ. And all we get is a 90% black screen? Same goes for Wolfenstein...
What were you guys thinking?
This is so incredibly far below the high standards that I'm used to from Anandtech... Please look again at this article, and do a proper job of assessing IQ.
#41, maybe you and your wife should start a website, you could benchmark ATI cards exclusively. That way ATI would always wind up on top. Admittedly, I'm an ATI junkie (I own a Radeon 8500 and plan to buy a 9600XT ASAP), but enough is enough. (By the way, what's up with the bread/butter analogy? You seem very fond of it.) Seriously, though, either of these cards are really fast and aside from IQ differences, you couldn't tell a difference. A little question for anyone who would know, though: How much does IQ drop going from PS2.0 to PS1.4? I have Halo and I'm wondering how much better it would look on a DX9 card instead of DX8.1.
If you look at the Gunmetal screenshots, that is my only beef with ATI: the scenes are not rendering completely or properly. It has happened to me in a lot of games, too; black areas.
The article does seem somewhat comprehensive, that is true, but: a) other sites reviewing the software did not come to the same conclusions, mainly problems with trilinear and AF again... b) I have yet to see a review that claims to be unbiased have this much opinion sprinkled all over, mainly pro-nVidia, resting on the IQ comparison I referred to in a). c) The drivers are beta and not WHQL, so who knows what we'll get as consumers. d) The hardware has not yet been formally announced by nVidia. e) The choice of what to show on the graphs seems very subjective: TR:AOD shows percentage drops with PS2.0, but what are the frame rates? I do hope this review is correct, because it would mean nvidia is back, but due to the above qualms I can't trust it.
The article is extremely comprehensive, as one would expect from Anandtech. Some issues of note:
1. It was pointed out that the 5900 and the 5950 performed almost identically in many areas. This doesn't bode well for nVidia.
2. I'm bothered by the tremendous frame rate difference between ATi and nVidia in some of the titles. It leads me to believe there's something deeper going on, and it's not just a simple card/driver issue.
3. It's nice to see the IQ back to where it should be, as visual quality should never be compromised for performance unless the user makes the adjustments to do so.
4. I will admit it sort of seems there is some bias towards ATi, but it's not flamingly apparent. Again, it is just my perception, and doesn't necessarily mean there is.
5. The most accurate remark made in this review is simply that we are not in the world of DX9 games... yet. To that end, DX9 performance is not nearly as important as it will be. When it is, I think things will step up a few notches.
Nicely detailed article, and I appreciate the additional games for benchmarking. Any chance we could see the use of a flight/combat sim like IL-2 or Mechwarrior?
I don't know why everyone is believing the IQ results (or even trying to use Photoshop to check the differences). These pics are JPGs! They're already altered by the compression logic, so who's to say these pics are true?
Excellent work Anand and Co. I found the article very informative, and although certain folks don't enjoy reading your "opinions" on some of the benchmarks, I thought they were very appropriate. It will be interesting to see how the official driver releases function under the latest and greatest DX9 and OpenGL games...
I seriously suggest that you upgrade everything else in your machine, reinstall your drivers and the game, and defrag.
Mine runs perfectly at 1280x1024 with max AF and displays between 40 and 60 fps all the way using the cg_draw command, and those are GAMEPLAY frame rates... with sound, AI and all the whistles. I see no need for AA at that resolution, though (not a nice IQ/performance trade there)... at 1024 it does wonders, though.
#67, I think the lightsaber glow is horrible on the Nvidia cards. The glow shines THROUGH the player's head. Looks to me like a bug. I like the ATI saber much better. (Most people's heads aren't empty, so light does not shine through. Maybe your experience is different? ;-)))
#76, couldn't agree more. The blurry AA in Aquamark is crystal clear even in those tiny images. So how could the authors possibly miss that and proclaim that there are no IQ issues? Especially since they looked at the fullscreen images and spent days on the article?
Also you can immediately see in all the small images that in general AA is better on the ATI card. This is nothing new, and not considered cheating by Nvidia. It's just that most know that there is a quality difference. But shouldn't that at least be mentioned in an article that is focused on image quality?
Why no screenshots on splinter cell? We should just believe the authors on that? With the aquamark pictures they have shown that we can't take their word for it. So I'd really like to see those screenshots too. Same for EVE.
And I was really surprised that they didn't know the water issue in NWN was NOT ATi's fault. They claim to have surfed forums for NWN issues; in that case they should have known that. (One look at Rage3D would have been enough.)
And on top of this, the TR:AOD part. It seems they typed more text on TR:AOD than on the entire rest of the article. No wonder people frown at the TR:AOD part.
All in all, I can see that much work went into the article, but I feel it could have been much better. As it is now, it is left to the reader to find the image issues in the small pictures, whereas I would expect the author to point them out.
#74, conclusions are one thing, objective journalism is another. There are clear differences in even the small and relatively badly chosen images posted with the article, yet all we get to read is "there are no IQ issues".
Thus, either the authors of the article are not competent enough (maybe they were simply too tired after the testing...) , or they are intentionally ignoring the differences.
I just can't stand aside and not thank the authors. The job they've done in this article is amazing, and the site was and will be my all-time favourite! Thank you! :)
I am extremely confused with the posts here. Many ATI guys seem to think AT unfairly favored the nVidia cards. Did we read the same article? In the end I came away with the opinion that while the new Det 52.xx help, things may get better for nVidia, the ATI is still a better choice today. Did I miss something?
Additionally, for all the guys claiming TR:AOD is a great game: yeah, we all know only the truly *great* games pull a 51% rating over on www.gamerankings.com (based on 21 media reviews).
Just what kind of world do we live in where a guy has to explain why he's not a fanboy before he can express his opinion, anyway? The worst part is, you people who do this are completely justified in your actions, because if you don't explain why you're not an ATi/nVidia fanboy, then people call you one.
God.. can't we argue without calling others fanboys for once?
What kind of biased, crappy, unprofessional review shows percentage drops for enabling PS2.0 without showing frame rates? If fps are around 30 to begin with, the percentage drop makes no difference, because the game is rendered unplayable! And who benchmarks beta drivers not available to the public on hardware not yet announced? This reeks of a payoff, and it seems Anandtech has thrown its integrity to waste. I hope that on the 10th, when nVidia announces the NV38, they also release these drivers to the public, so some serious review site can actually test the hardware (and software; forgive my skepticism, but nVidia sure earned it this past year) and show us what nVidia is bringing to the graphics field. Disappointed by nVidia and now by Anandtech.
Not everyone talking about IQ differences here is a fanboy.
Look at the images at the bottom of the Aquamark 3 IQ page (highest quality AA, 8xAF). The nVidia 52.14 image is blurred, much detail is lost especially around the explosion. The Catalyst 3.7 image is way sharper, yet its AA is smoother (look at the car body above the wheels), and it loses much less detail around the explosion. The differences are much more than "barely noticeable".
The tiny images don't give much credit to the article, though.
(Before anyone calls me an ATI fanboy: I have a GeForce FX 5600 dual DVI.)
Didn't anyone notice that ATI doesn't do dynamic glows in Jedi Academy with the 3.7 Cats!? Look at the lightsaber; it's clearly visible. They only work with the 3.6 Cats, and then they REALLY kill performance (it's barely playable at 800x600 here on my Radeon 9700 PRO).
Funny to see that ATI fanboys can't believe nvidia can release drivers without cheats. And nobody talks about the issues in TR:AOD with ATI cards; really very nice...
WTH did you benchmark one card with unreleased drivers (something you said you would never, ever do in the past) and use micro-sized pictures for IQ comparisons? You might as well have used 256 colors.
The Catalyst 3.8s came out today; the 51.75 drivers will not be available for an indeterminate amount of time. Yet you bench with the Cat 3.7s and use a set of unreleased and unavailable drivers for the competition.
I suggest you peruse this article: http://www.3dcenter.org/artikel/detonator_52.14/ from 3DCenter (german) to learn just how one goes about determining how IQ differs at different settings with the Nvidia 45's, 51's, and 52's. Needless to say, everyone else who has compared full-sized frames in a variety of games and applications has found the 5X.XX nvidia drivers (all of them) do selective rendering, and especially lighting.
And why claim the lack of shiny water in NWN is ATi's fault?
Bioware programmed the game using an nvidia exclusive instruction and did not bother to program for the generic case until enough ATI and other brand users complained. This is the developer's fault, not a problem with the hardware or drivers.
Nice article. I like that you benched so many games.
Unfortunately you missed that the Det52.14 driver does no real Trilinear Filtering in *any* DirectX game, regardless of whether you're using anisotropic filtering or not. This often can't be seen in screenshots but in motion only. Please have a look here:
TR: AOD is a fine game, you just have to play it...
Sure, there are some graphical issues in the later levels, but there's nothing wrong with the game as such, and considering that it has made its way into a lot of bundles (Sapphire and the Creative Audigy 2 ZS, to name two), I believe it will receive a fair share of gameplay.
You guys need to stop talking about Gabe Newell... for such a supposedly good programmer, he sure needs to learn about network security... We all know he's got his head up ATI's rear end. The funny part is that they are bundling HL2 with the 9800XT (a coupon) when it isn't coming out until April now. Who's to say who will have the better hardware then? Doom 3 will likely be out by then. In 4 months, when the new cards are out, you guys won't care who makes the better card; the 12-year-old fanboys will be up in arms in support of their company. I owned the 5900U and sold it on eBay after seeing the HL2 numbers. I then bought a 9800 Pro from Newegg, having first tried ordering the 9800XT from ATI, whose site said it was in stock; 2 days later they told me my order was on backorder and hassled me when I wanted to cancel. One thing I'd point out is that WarCraft III looks much better on the 5900U than the 9800. It looks really dull on the 9800, where it's bright and cartoony (like it should be) on the GeForce. Either way, who knows what the future will hold for both companies, but let's hope they both succeed, to keep our prices low...
The IQ part was crappy at best: small screenshots of open, not-so-detailed areas, and sometimes there was no option for a big one to check.
You can call me what you want, but there are quite a few reviews there that will disagree BIG time with what has been posted about IQ here. And it is impossible all of them are wrong on this at the same time.
Homeworld has shadow issues on ATI cards with Cat 3.7, yet that isn't shown there either... this goes both ways.
If you ask me, NVidia got his DX9 wrapper to work fine this time.
Um what happened to post #41 where the guy detailed all the inconsistencies of the IQ comparisons? Please don't tell me you guys actually modded that post....
I haven't had the chance to go through everything yet, but in the few I did, I definitely saw differences even in these minuscule caps (how about putting up some full-size links next time, guys?), particularly in the AA+AF ones. It's obvious there's still quite a difference in their implementations.
I was also surprised at the number of shots that weren't even of the same frame. Honestly, how can you do an IQ test if you aren't even going to use the same frames? A split-second difference is enough to change the output because of data/buffer/angle differences, etc.
Personally, I wonder what happened to the old-school 400% zoom IQ tests that Anand was promising, and I'm fairly disappointed despite the number of games in this article.
That said, I am glad that Nvidia didn't botch up everything entirely and hopefully they'll have learned their lesson for NV4x.
I'm impressed. I've never seen a review that actually has the games I play most frequently in it. I've been uninterested in FPS games since Quake II.
In particular, I like Neverwinter Nights, C&C Generals, SimCity 4, and to some extent WarCraft III (and, by extension, their expansions). I was under the impression that SimCity 4 was CPU-bound under almost all circumstances; it's useful to have that shot down.
I also like AA and AF. You can imagine the slideshows I play with my Athlon 2100+, 1GB DDR, and Radeon 64MB DDR (a.k.a. 7200)
Now I just need to see the ATI AIW 9600 Pro reach general availability.
Thank you so much for this review... the detail is spectacular. After reading and looking at all 60 pages... I am really tired. Thanks again for your dedication!
Um, why are there no comparisons using two monitors with different cards running? Gabe of Valve said there is a set of drivers that detects when a screenshot is being taken. Or did Anand just get duped by nvidia?
1. Why was fps left out of TR:AOD?
2. Why the weird, never-before-seen TR:AOD PS2.0 percent-loss graph? How about giving us good old fps, which is what we have been seeing for years and what we are used to? At least have both if you are going to introduce new graphs.
3. How does the reviewer seem to know "Nvidia is aware of it" yet never seem to know whether ATI is aware of problems? I mean, he would have had to talk to Nvidia to know this. Did Nvidia pre-read the review and then tell him they are aware of a problem and will fix it?
4. What motivation do the reviewers have for helping Nvidia, or at least seeming optimistic? What has Nvidia done to earn this tip-toeing-around type of review? If anything, they have dug themselves a well-deserved hole. I'm talking about Nvidia's horrid behaviour as a company over the past 6 months. Why would they reward a company that pulls the stunts Nvidia has lately? Do they feel sorry for them?
All I can say is the tone of this review leads me to think there is more to this than meets the eye.
#52, yeah, I'm sure people play games in windowed mode. How can you see the differences from such small screenshots? It's well known that Nvidia hacks, or shall I say "optimises", for benchmarks, giving no thought to IQ. This article displays blatant Nvidia @ss kissing. There was good reason Gabe didn't want his game benched with the Det 50.xx; take a guess: more hackery from nVidia. Also, Anand mentions certain anomalies with the GeForce FX in certain games but does not try to explore what those errors are and assumes nothing's wrong. In Homeworld the FX isn't even doing FSAA. Geez, I wish the Nvidia fanboys would get a clue and crawl out from under that rock they've been hiding under.
This is the most interesting article I have read in some time. First of all, I agree with #41: I think including this many games in the benchmark prevents Anand/Derek from making detailed analyses of the games. But there is something more interesting...
It seems that Anand and Derek tried to put out an article that hides the problems with both cards. They also deliberately try to avoid favoring one company. In one sentence they claim ATI is best; in the next line they state otherwise. As for the IQ comparison, many of the screen captures are either dark or cannot reflect what AF+AA is intended to do. If I just check the small pictures, I would say the IQ is really similar. However, more detailed analysis reveals other problems. Besides, the review of TRAOD is the worst I have ever seen. If they posted the frame rates, I am pretty sure everybody would be shocked by the results. Why wouldn't they be? Think about it: the performance percentage loss of the FX5950 is 77.5% at 1024x768 noAA/AF. Even if the game runs at 50 fps with PS1.1, the frame rate would drop to about 11 fps when you switch to PS2.0 in this case. Referring to Beyond3D is interesting, because that site has very detailed benchmarks of both the 5900 and 9800 with this game (I strongly recommend those articles to anyone who really wants to learn the actual performance of the NV 5900 and R9800 in PS2.0 scenarios).
But I totally disagree with Anand on one thing: TRAOD performance is a real indicator for future games that will use PS2.0 by default. The game's v49 patch also uses HLSL to compile directly to ps_2_x, which actually targets Nvidia's NV30 architecture, and the compiled code runs faster than Cg-compiled code. Even in this case, the 9800 Pro still runs much faster than the 5900 (I am talking about 70 fps vs. 35 fps).
I guess nobody wants to see his/her $500 graphics card crawl in new games that use PS2.0 by default just one year after purchasing the card. And no! I am not an ATI fanboy, just a tech fan who does not tolerate seeing how some sites misdirect their readers because of their connections to the IHVs.
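For what it's worth, the percent-loss arithmetic in the comment above checks out; a quick sketch (the 77.5% figure is from the comment, while the 50 fps PS1.1 baseline is the commenter's hypothetical, not a measured number):

```python
def fps_after_loss(baseline_fps, percent_loss):
    """Convert a 'percent performance loss' figure back into a frame rate."""
    return baseline_fps * (100 - percent_loss) / 100

# Commenter's example: 50 fps under PS1.1, a 77.5% loss switching to PS2.0
print(fps_after_loss(50, 77.5))  # 11.25
```

This is why readers wanted the raw fps alongside the percent-loss graph: the percentage alone hides how low the absolute frame rate lands.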
Oh, come on, fanboys, stop yelling at Anand for not making nVidia look bad enough. His job is to benchmark, not to rant. Jesus Christ, you people annoy me. Try printing out the three images from any given test WITHOUT looking at which one's the Radeon.
And no, I'm no nVidia fanboy, nor am I defending nVidia. I use a softmodded Radeon 9500 and I absolutely love it. I have never, ever put a GeForce FX in my system, and I'm happy to say this. But can't you people just let go?
That's an interesting question. I suspect he does, though my question is "who wants to know?" ; )
In regard to your other question, "Why can't we have a true winner now?": as for myself, I'm going to give Derek and Anand the benefit of the doubt.
It seems to me that they realize NVIDIA attempted to do something unique with its 5000 series, in that it does not exactly hold to the DirectX 9 spec. For instance, it has 16-bit and 32-bit rendering modes while DX9 requires 24-bit - which ATI does (refer to the Half-Life 2 and DOOM III reviews). In the shader area NVIDIA holds FAR more code (micro-ops) than ATI. Also, if you check back to Anand's original posts on the ATI and NVIDIA shootout(s) where there is a comparison between AA and AF, NVIDIA was a CLEAR winner. I seem to recall a while ago that NVIDIA claimed ATI didn't do TRUE AF so they were therefore CHEATING. Boy did that one come back around with teeth, huh?
What I'm saying is NVIDIA tried to one-up ATI by trying to do more; unfortunately it seems they tried to do TOO much and ended up pulling SHADY maneuvers like the whole FutureMark mess. They should have instead focused on the DX9 spec and the Microsoft shader/pixel code path and not tried to pull a GLIDE like 3DFX (excuse the parsed English).
So, hopefully NVIDIA learns from its mistakes, modifies its silicon to the spec, and gives us all BETTER cards to choose from come March/April.
As far as the authors are concerned, Anand and Derek seem to be attempting JUSTICE (helping the party who needs the most help, and treating all parties equally) - which in this case seems to be NVIDIA. The authors are helping NVIDIA by dropping HEAVY hints like what you quoted: "Next year will be the year of DX9 titles, and it will be under the next generation of games that we will finally be able to crown a true DX9 winner. Until then, anyone's guess is fair game." and "If NVIDIA can continue to extract the kinds of performance gains from unoptimized DX9 code as they have done with the 52.14 drivers (without sacrificing image quality), they will be well on their way to taking the performance crown back from ATI by the time NV40 and R400 drop." If NVIDIA takes heed of these CONSTRUCTIVE statements then the entire gaming community could benefit - in better prices and higher quality, from which the customer usually benefits (AMD vs INTEL sound familiar?).
So, let us be easy and enjoy the night. Time will tell.
Cheers, aka #37
PS: Derek, please excuse me for leaving out your name before. The article was well written.
Why didn't you guys wait for Catalyst 3.8? It's out tomorrow and is reported to fix many IQ problems in games like NWN. What would a couple of days have hurt, especially since this article is going to be irrelevant after the Cat drivers are released tomorrow?
Note: the AA/AF and noAA/AF images of Warcraft3 have been mixed up for the NV52.14.
It says a lot about the value of the screenshots that it takes careful inspection to find this error. I have played a lot of War3 recently and the difference is very noticeable in game, even on this GF4.
#18, it's not a problem figuring out the graphs; it's just weird that he would choose that type of graph, excluding FPS.
BTW I own a 5900U and a 9700pro.
I don't like people avoiding PS2.0 tests. My 5900 sucks at it. I paid too much for what I got in the 5900. I try to get good bang for the buck; the 5900 is not it.
Secondly, Anand and I both put a great deal of work into this article, and I am very glad to see the responses it has generated.
Many of the image quality issues from part 1 were due to rendering problems that couldn't be captured in a screenshot (like jerkiness in X2 and F1), or a lack of AA. For some of the tests, we just didn't do AA performance benchmarks if one driver or the other didn't do what it was supposed to. There were no apples-to-anything-other-than-apples tests in this review. The largest stretch was X2, where the screen was jerky and the AA was subpar. But we definitely noted that.
TRAOD isn't a very high quality game, and certainly isn't the only DX9 (with PS2.0) test on the list. Yes, ATI beat NV in that bench. But it's also true that ATI won most of the other benchmarks as well.
Anyway, thanks again for the feedback, sorry BF1942 couldn't make it in, and we'll bring back a flight sim game as soon as we tweak it out.
Didn't Gabe Newell complain about screen capture "issues" with the Nvidia 50.xx drivers that show better image quality in screenshots than actually shows up in game?
Anand spoke about image quality problems in almost every test in part 1, but i see almost nothing wrong with the screencaps in part 2.
It's funny how Anand and Derek did not comment on this. Maybe they missed it because they based their comparison on those tiny images. Ah, so that's what full-sized images are needed for?!
#33, that's what came to my mind as soon as I read this article. I think that Anand may have just provided some input, done testing, or just edited it slightly...
Some of them are cropped out so that you can't see a lot of details: UT2003, Aquamark3, Wolfenstein.
Some of them are set up so that you wouldn't get any possible artifacts with texture filtering, because of the high camera angle: Warcraft3, C&C Generals.
The Tomb Raider, Aquamark and Wolf screenshots are also too dark to notice anything. And I don't see any sign of a DX9 shader in either the Halo or the TR shots, so we have no idea of DX9 image quality.
But kudos for all the testing you've done, must have been a lot of hard work.
#30, ATI has not released performance drivers for a long time now, and they already said don't hold your breath on performance increases coming in the 3.8s either. The main focus since the 3.1s seems to have been bug fixes with slight performance improvements in various games. 3.8 = more features and bug fixes, with probably slight performance improvements here and there in specific games.
Would all the fanboys please take a deep breath or troll elsewhere? I swear to god some of you people will go out of your way to look for bias where there isn't any.
I own a 9800 Pro and I for one am glad that it seems like Nvidia has closed the gap considerably, their customers deserve it.
To those of you who mentioned Anand a few times: you should also note this was written by two authors, or at least worked on together by two authors, so you should understand that you may get different "types" of responses and analyses (sp?) of similar results if they're done by different people. I think we should wait for the 3.8 Cat article before we jump to too many conclusions.
This is the way I take screen shots in final fantasy XI benchmark 2.
- Use HyperSnap-DX
- Enable DirectX capture in HyperSnap
- Change the HyperSnap "Quick Save" settings to repeat capture every 5 seconds
- Launch the Final Fantasy XI benchmark 2 menu
- When you click the "START" button, press "Print Screen" once the resolution changes.
Wow this is the biggest video card review I have ever read: Awesome!!
> Right now NVIDIA is at a disadvantage; ATI's hardware is much easier to code for and the performance on Microsoft's HLSL compiler clearly favors the R3x0 over the NV3x

Ever heard of the ps2_a compiler target?
Why is Anand bashing Tomb Raider, and what's up with the PS2.0 graph? Why not just post the fps? It makes it seem like Nvidia is beating ATI. Also, why are beta drivers being tested with Nvidia? Should have used Cat 3.8 for the Radeon.
I'm not biased towards either card; I myself own a 9800 Pro. What concerns me is the immaturity shown by other ATI card owners. You guys act like Nvidia can never measure up to ATI (which is so untrue). There was little to no difference in the IQ and benchmark results (with few exceptions, and explanations were given for the most part). Also keep in mind that the 9800XT's specs are higher than the 5900/5950's and it still managed to get beat in some of the tests. Anyway, good job Nvidia, you guys are certainly headed in the right direction. I was a bit sad to see my card excluded though :( ... they said they'll benchmark the value cards soon... I hope to see mine there ;)
PS: I could be wrong about the specs, but I do remember Anand saying the XT had higher memory bandwidth (which could've accounted for some performance differences).
All in all, a good review. I'll be waiting for more updates.
#18, he says he saw it; he doesn't know why it was there. There is no reason to exclude regular fps graphs, especially since people want to know the fps of this game, since it is the ONLY truly DX9 game in the entire suite.
Haha, yet again, we see fanATIcs (#10, #14) coming out of the woodwork to claim that Anandtech's review is either biased or NVIDIA is still cheating. lmao, losers!
And by the way #14, you're plain dumb if you couldn't figure out that the TR:AOD graphs were showing a percentage difference. Christ, read the review.
You need to look at the FSAA each card employs... go back and look again at the screenies, this time looking at all the jaggies on each card, especially in F1. It doesn't even look like nVidia is using FSAA, while on the ATI it's smooth. I don't think it's a driver comparison, just the fact that ATI's FSAA is far better at doing the job. At least I think that's what he's talking about. It's hard to tell any IQ differences when the full-size screenies are not working, but poor FSAA kind of jumps out at you (if you're used to smooth FSAA).
Also worth noting, nVidia made great jumps in performance in DX9, but nothing that actually used PS2.0 shaders : (
I like the way he discounts Tomb Raider, saying it is just not a good game. That's a matter of opinion. It almost seems like he tries to undermine the game before revealing any benches.
And the benches for that game are not done in FPS but in percentage lost on PS2.0.
On first inspection of the graphs it appears that Nvidia is leading in Tomb Raider. But if you look at the blurred print on the graph, it does say "lower is better". Very clever!
Why no FPS in that game?
Nice information in this review, but it almost seems that he is going out of his way to excuse Nvidia.
#3, #7: If you take the screens into Photoshop and observe the result of their 'difference', you'll see that there's a fairly significant difference between the 45s and 3.7s, but almost no difference whatsoever between the 52s and 3.7s. In most of those screenshots it's impossible to do this, since the shots aren't necessarily from the exact same position each time. Try the UT2K3 ones, for example. Also, these are JPEGs, so there'll be a little fuzz due to differences in compression.
Also, if I need to take two screenshots into Photoshop to be able to discern any difference between them, that's really saying a lot. And since we can't refer to a reference software shot, it could be ATI's driver that's off for all we know.
In any event I'm pleasantly surprised with nvidia. Their IQ has definitely caught up, and their performance is quickly improving. Hopefully the cat3.8's will pull a similar stunt.
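The Photoshop 'difference' technique described above is just a per-pixel absolute subtraction; a minimal sketch on toy grayscale "screenshots" (real comparisons would load the actual captures, and as the comment notes, JPEG compression adds a small noise floor, so a nonzero difference is not automatically a rendering change):

```python
def image_difference(img_a, img_b):
    """Per-pixel absolute difference of two equal-sized grayscale images,
    which is what Photoshop's 'difference' blend mode computes per channel."""
    return [[abs(a - b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(img_a, img_b)]

def max_difference(img_a, img_b):
    """Largest per-pixel deviation; 0 means the shots are identical."""
    return max(max(row) for row in image_difference(img_a, img_b))

# Toy 2x2 "screenshots": identical except for one pixel
shot_a = [[10, 20], [30, 40]]
shot_b = [[10, 20], [30, 45]]
print(max_difference(shot_a, shot_b))  # 5
```

In practice you would also want the two shots taken from the exact same camera position, which, as the commenter points out, most of the review's screenshots don't guarantee.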
They must have a reason for choosing those drivers. Anandtech has been around long enough for that :)
The reason is probably that when they started this benchmarking, they did soooo many games, resolutions, AA and AF levels, times the number of different cards, etc. That takes quite some time. Had they waited for the newer ATI drivers, it might have delayed this article one or even two weeks. Also, they did mention they will do follow-up articles with the new drivers, so patience is the key here.
LOL, the ATI fanboys are already coming out of the woodwork. Listen #3 and #7, it's a fact, there is no IQ difference at all between the 50 Dets and the 3.7 CATs. And if you honestly believe you're going to see much of a difference with the CAT 3.8's....you're just stupid.
Yeah, I agree with what #3 said... hardly any commentary on the IQ. And if I'm not mistaken, aren't the new Cats going to be out tomorrow? If so, you might as well do the whole thing over again...
I saw more opinions in this article leaning towards Nvidia, especially around the Tomb Raider benchmarks - more specifically, starting with the page regarding compilers. I liked this article, but there could have been fewer opinionated viewpoints. Also, it might be better if we got the names of the authors with what they wrote, so we know who wrote what, because some of this sure didn't sound like something Mr. Shimpi has written in the past...
117 Comments
Anonymous User - Tuesday, October 21, 2003 - link
Why are they using Flash to do this? I can't see the performance charts (or whatever they are).
Anonymous User - Tuesday, October 14, 2003 - link
What a crap article this was. More games - sure. More data. But no brains to interpret it right, obviously.
Anonymous User - Monday, October 13, 2003 - link
#114: It's more than just a difference in visuals. By removing some of the visuals the card will run faster. The Nvidia drivers, for example, do not do trilinear filtering in DX; they do some fake bilinear. That makes the card look better than it really is.
The whining is about how the reviewers missed all this stuff. People are not getting the true story here.
Anonymous User - Monday, October 13, 2003 - link
Okay, now I'm new here and all, but DAMN do some of you whine! You act like any visual differences between the Nvidia cards and the ATI (of which I can't see any at all) are astronomically huge! They're not. This is the first and last time I post here; looks like half of the people here are fanboys!
Anonymous User - Monday, October 13, 2003 - link
I don't understand why the obvious differences in IQ in the Aquamark 3 4xAA/8xAF shots, for example, are totally ignored by the reviewer. Just look at the fuzziness in the plants surrounding the explosion in the nvidia shot.
Anonymous User - Sunday, October 12, 2003 - link
Well, I hope it was worth it. You spend all that time on a review and you end up getting caught being in someone's back pocket.
Guess you can't have your cake and eat it too :(
Anonymous User - Saturday, October 11, 2003 - link
Here is part of an addendum to the 3DCenter article directly addressing this comparison:

"AnandTech made an extremely extensive article about the performance and image quality of the current high-end graphic cards like Radeon 9800XT and GeForceFX 5950 Ultra (NV38). Beside the game benchmarks with 18 games, the image quality tests made with each of those games are strongly worth to be mentioned. AnandTech uses the Catalyst 3.7 on ATi side and the Detonator 52.14 on the nVidia side to compare the image quality. In contrast to the statements of our youngest driver comparison, AnandTech didn’t notice any general differences of the image quality between the Detonator 52.14 and 45.23 and therefore AnandTech praises the new driver a little into the sky.
This however not even absolutely contradicts itself with our realizations. The nVidia-"optimizations" of the anisotropic filter with texture stages 1 till 7 in Control panel mode (only a 2x anisotropic filter is uses, regardless if there were made higher settings) are only to find with proper searching for it, besides most image quality comparisons by AnandTech were concerned without the anisotropic filter and therefore it’s impossible to find any differences on those pictures. The generally forced "optimization" of the trilinear filter into a pseudo trilinear filter by the Detonator 52.14 is besides nearly not possible to see on fixed images of real games, because the trilinear filter was created in order to prevent nearly only the MIP-Banding which can be seen in motion.
Thus it can be stated that the determined "optimizations" of the Detonator 52.14 won’t be recognized with the view of screenshots, if you do not look for them explicitly (why however AnandTech awards the driver 52.14 a finer filter quality than the driver 51.75 is a mystery for us, then the only difference between them is a correctly working Application mode of the Detonator 52.14). Thus the "optimizations" of nVidia are not to be really seen, whereby there is also a clear exception as for example Tron 2.0 (screenshots will follow). Whether this is now a reason to excuse the "optimizations" of nVidia about it, one can surely argue."
All on-line computer journalists should strive to inform their viewing public like these folks do.
Once again: the 51/52.xx nvidia drivers do *not* apply trilinear filtering in D3D when AF is on. The 51.75, at least, applies trilinear to the first (0) stage, though not *AT ALL* to any other stage; the 52 series does not apply trilinear filtering to any stage in D3D, regardless.
Bing! Bing! Try again!
May I suggest the filtering tester used by 3DCenter, and perhaps a mipmap shading program (as used by everyone in the known universe), and rthdribl to discern *ACTUAL* image quality via high dynamic range light source rendering.
http://www.daionet.gr.jp/~masa/rthdribl/index.html
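The Control Panel mode behavior alleged in these comments (the requested aniso level applied only to texture stage 0, with stages 1-7 capped at 2x) can be modeled in a couple of lines. This is a sketch of the *claim* as stated here, not of the actual driver internals:

```python
def effective_aniso(requested, stage, control_panel_mode=True):
    """Aniso level the 52.14 Detonator is alleged to actually apply in D3D:
    full level on stage 0, but at most 2x on stages 1-7 in Control Panel mode."""
    if control_panel_mode and stage > 0:
        return min(requested, 2)
    return requested

# Request 8x aniso across the first four texture stages
print([effective_aniso(8, s) for s in range(4)])  # [8, 2, 2, 2]
```

This also explains why the optimization is hard to catch in casual screenshot comparisons: the base texture stage, which dominates most scenes, still gets the full setting.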
Anonymous User - Saturday, October 11, 2003 - link
http://www.3dcenter.de/artikel/detonator_52.14/ind...

Take a look at this if you think the 5x.xx drivers have the same image quality as the 45s.
Anonymous User - Saturday, October 11, 2003 - link
The only thing I give Anand credit for is allowing us to freely write about his review. I mean, he did not have to allow us to reply in an open forum. After reading it I am not at all surprised at the heat he is taking; I hope he was not either.
The review had potential but was squandered.
Today's cards are all fast enough for DX8 games.
The question is: can they do it with all the goodies turned on?
The main reason to buy an ATI 9600-9800/5900U is to clean up the graphics, but not at the expense of speed. If you don't care about image quality, stick with your GF4 or 8500, as both are horrid vs the new-gen cards.
An old GeForce 4 kicks butt in many games so long as you don't have FSAA turned on.
Most people know that the 5x.xx Detonator drivers do reduce image quality in many areas. This is not a driver bug; it's what Nvidia chose to do to keep pace. Image quality is much more subjective than FPS. People are not buying $400 video cards for speed alone.
Anand glossed over/hid quality issues - the one area where subtle reductions here and there add up to large FPS gains.
People will say: so what, the XT gets recommended in the end, why bitch?
Well, it's the principle. The review made the 5900 seem much closer to the XT than it actually is.
When a driver (a beta one at that) improves speed that much, it deserves a much closer inspection than what Anand gave it.
Someone threw Anand a pass but he dropped the ball :(
Anonymous User - Saturday, October 11, 2003 - link
I didn't care for this review for the following reasons:

Many comments on IQ in part 1, but no follow-up in part 2. There were so many comments that they needed to be mentioned, even if only to say that they discovered it was some wrong setting and they fixed it.
Small cropped compressed images used for IQ comparison. If the image is compressed how can we judge it? The only way to present IQ comparisons to the reader is to show them the exact images the reviewer saw, without compression or cropping.
Apples to apples: all of the benchmarks for all games should have been done in the same format unless it was impossible to achieve certain settings on a given card at a given resolution. Changing the metric for TR:AOD was a bad idea. Both parts should also have been done on the same system; for all we know, the ignored IQ issues from part 1 could have been due to the AGP implementation on the first board. We just don't know.
Gunmetal is also a very poor DX9 benchmark, since it relies on VS 2.0 and PS 1.1 only. Since most of the benefits of DX9, and the controversies for that matter, revolve around PS 2.0 this benchmark is not a good exemplar of DX9 performance. I also find the fact that Gunmetal was co-developed by Nvidia something that needs examination. IHVs have no place in developing benchmarks, they should stick to technology demos.
Now, I don't know whether the 52.14 drivers do what the article says they do. I know Digit-Life said they gave up to a 20% improvement in some cases, and some improvement is certainly credible. However, this article as written does not support the conclusion that the 52.14s provide significant performance boosts with no IQ loss. I am not commenting on whether they perform as advertised, only that you cannot draw that conclusion from the article.
Anonymous User - Saturday, October 11, 2003 - link
Great job with the review. I'm so happy to finally see benchmarks in more than just FPS. I'm always curious to see what kind of benefits can be had by upgrading my video card in RTS games, for example. Take a rest; your brain must be fried from all that benching.

One question: I have an LCD monitor and I can't get Generals: Zero Hour to run at 1280x1024. How did you manage to get that resolution for your benches, since it is not offered in the game menu?
Anonymous User - Friday, October 10, 2003 - link
Anandtech is a total liar. The 52.14's picture quality sucks, and so does my 5900 card. It's garbage like it was before, and the 52.14 drivers are sure not making it better. As for the pictures, I don't know how he even dares to say there is no difference in quality; the quality is for sure a big problem with all the games and programs I tried.
Anonymous User - Friday, October 10, 2003 - link
The TRUTH about the Det 52.14 driver:

A) OpenGL filtering appears to be just fine.
B) D3D filtering suffers from the following "optimizations"
Application Mode:
1) True trilinear is never utilized at all...on any texture stages. It's now all "pseudotrilinear"
Control Panel Mode
1) Same pseudotrilinear as above
2) Proper Aniso level selection is only applied to texture stage 0. No matter what the aniso level selected (2x-8x), only 2x is applied to stages 1-7.
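The difference between true trilinear and the "pseudo-trilinear" filter described in these comments can be sketched as a mip blend function: true trilinear blends between the two nearest mip levels across the whole fractional LOD range, while the pseudo ("brilinear") variant only blends inside a narrow band around each mip transition and otherwise snaps to the nearest level - which is why it is hard to spot in a still screenshot but shows up as mip banding in motion. The 0.25 band width below is an illustrative assumption, not a measured driver value:

```python
def trilinear_weight(lod_frac):
    """Blend weight toward the next mip level under true trilinear filtering:
    a straight linear ramp over the fractional LOD."""
    return lod_frac

def pseudo_trilinear_weight(lod_frac, band=0.25):
    """'Brilinear': behaves like bilinear (weight pinned to 0 or 1) outside a
    narrow blend band centered on the mip transition at lod_frac = 0.5."""
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0
    if lod_frac >= hi:
        return 1.0
    return (lod_frac - lo) / (hi - lo)

# Compare the two curves across the mip transition
for f in (0.1, 0.4, 0.5, 0.6, 0.9):
    print(f, trilinear_weight(f), pseudo_trilinear_weight(f))
```

Over most of the LOD range the pseudo filter samples only one mip level (the cheap bilinear case), which is where the frame rate gain comes from.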
Anonymous User - Thursday, October 9, 2003 - link
I'd like to know what happened to all the CPU scaling analysis? I hope there's a part 3 which includes these same cards.
Anonymous User - Thursday, October 9, 2003 - link
Good to see you are reworking the Flight Sim 9 benchmark. Your achieved frame rates in part 1 were so much higher than what real simmers are getting; many people would be happy to get a reliable 25 FPS out of this game. Once there, we can worry about IQ. So you need to push the texture sliders etc. to put a real load on the system in this game. A continuing argument with this game is the importance of CPU/memory relative to graphics card in achieving acceptable frame rates. Anything in your testing that could shed light on this would be invaluable.

scott s.
Pete - Thursday, October 9, 2003 - link
As long as I'm bothering you, I'd like to request Halo numbers with AF for your next review/roundup. AF really spruces up IQ, IMO, and it's a shame (almost pointless) to buy $500 cards and not run at the highest IQ possible. I'd also appreciate comparison pics with AF as well. Thanks!
Pete - Thursday, October 9, 2003 - link
Whoops, that was the URL of the inline pic, which works. This is the (broken) link to the large pic: http://images.anandtech.com/reviews/video/roundups...
Pete - Thursday, October 9, 2003 - link
Derek, the Halo Det 45.23 large pic link doesn't work: http://images.anandtech.com/reviews/video/roundups...
Anonymous User - Thursday, October 9, 2003 - link
Might I suggest you remove the Tech part of your site name? There's not much tech anymore.
Anyway, this is not about NV vs ATI; it is NV vs DX9. It still states on NVidia's site that the FX series are DX9 cards, which they are not!!
Inform the people as you should & not as your NV paymaster is telling you!!
Anonymous User - Thursday, October 9, 2003 - link
"We still urge our readers not to buy a card until the game they want to play shows up on the street."

What if we want to play DNF? Should I just wait till it's done?
Sounds like you're in nVidia's pocket more like it. Most people like games, not just one... so upgrading has immediate effects for most.
Anonymous User - Thursday, October 9, 2003 - link
"We were told by NVIDIA that these new drivers would not only improve performance, but that they would be made publicly available the very same week we tested with them. Obviously, that didn't happen, and it ended up taking another month before the drivers were released. The performance gains were tangible, but the drivers weren't fit for release when NVIDIA provided them to the press and honestly shouldn't have been used.

Hindsight being 20/20, we made a promise to ourselves that we would *not allow any further performance enhancing drivers to be used in our video card reviews unless we could make the drivers publicly available to our readers immediately.*"
-Anand Lal Shimpi
October 17, 2001
Ah how the world turns.
Anonymous User - Wednesday, October 8, 2003 - link
"If you think that's wrong, then you have a problem."

If Anand had not stated that he would not do benchmarks with unreleased drivers, then you might have a point.
Especially if an in-depth examination of the filtering and antialiasing quality had been undertaken. Too bad that didn't happen, despite what was claimed in the introduction.
Making oneself a hypocrite to appease the public's whims by using unavailable software (for one manufacturer, but not another) is a direct shot to your own credibility.
But credibility is no longer important to this site, and many of the other premiere hardware review sites.
Page hits and advertisements pay the bills; do whatever it takes.
Anonymous User - Wednesday, October 8, 2003 - link
This is the most interesting sentence I found in this review:

"It just so happens that the default Microsoft compiler generates code that runs faster on ATI's hardware than on NVIDIA's."
So it is M$'s fault that nVidia's card runs slower? How do you know? Plus the numerous problems mentioned by other people - I can't believe this is from a famous hardware review site. I don't think this review will earn nVidia any credit; instead Anandtech loses credibility for it.
BTW, I don't think a driver update can improve performance by 50%, as the 52.* drivers did, unless the old 45.* drivers were written by a new graduate. There MUST be some trick in the 52.* drivers.
Anonymous User - Wednesday, October 8, 2003 - link
Hey Mr. NostraDUMASS (#93), you want some cheese to go with that whine?

NV40 will be 3D king?! Only in a 4D world, dreamer.
Looking at the graphs and the numbers pretty much sums it all up - a crushing defeat for you know who. Anand and his crew did a decent job. They even tried to candy-coat it to appease the big N.
Anonymous User - Wednesday, October 8, 2003 - link
Nvidia is coming back slowly. Due to the stupid policy and strategy of Nvidia's CEO, they have lost their supremacy and reputation in the 3D graphics world, but I'm sure their next gen, the NV40, will be 3D king. Nvidia is smarter than ATI. ATI's architecture is built on the brute force of 8 pipelines and 256-bit memory; Nvidia's CineFX is much more sophisticated.
RoninCS - Wednesday, October 8, 2003 - link
You know why he used them? Simple, actually. Because people wanted him to test them, and the public at large wanted to see how they perform. If you think that's wrong, then you have a problem.
Anonymous User - Wednesday, October 8, 2003 - link
I'm highly amused that anyone would flame someone for taking Anand to task for using UNRELEASED DRIVERS WHICH ARE IN NO SENSE AVAILABLE TO THE PUBLIC THROUGH THE MANUFACTURER.

Anand has said on several different occasions in the past that he would NEVER BENCHMARK WITH UNRELEASED DRIVERS.
He lied.
He did not bother to investigate the veracity of the claims nvidia made about image quality, including the total lack of true trilinear filtering.
Why not? Nobody knows, and anand won't say.
There are several very well established methods for determining the true quality of a cards' filtering and antialiasing scheme.
Anand used none of them.
Why?
Coincidentally, everyone who has actually bothered to check the quality of the nvidia AF on any driver in the 50/51 or 52 series has found trilinear filtering simply does not work as advertised.
Indeed, far from predicting future DX9 title performance, the new test suite appears to be heavily biased toward legacy DX8/8.1 pixel shading, something the FX architecture is basically built for, and excels at.
The game (TR:AOD) that probably uses the largest number of DX9-class shaders (PS2.0) was not even listed in terms of absolute numbers. Instead, we have a percent decrease!!!
Why was this single game treated so differently when it would likely be the best predictor (along with Doom3 in ARB2 generic mode, and HL2) of future shader performance?
Coincidentally, the R3XX architecture dominates this test - absolute frame rates would have been heavily embarrassing for nVidia.
Lucky Anand didn't bother to list them! Instead, he spent several paragraphs discussing water that didn't render quite right.
The GeForceFX family are decent cards, with very good performance indeed in DX8-class games. DX9 games with real DX9-class shaders are a different story altogether.
But you won't be getting that story here, and anyone who bothers to bring it up will be labeled an "ATI fanboy".
How sad.
How childish.
How utterly self-serving.
This article proves quite handily just how much of anand's credibility has been lost. Anand has jumped on the Thomas Pabst bandwagon.
Next stop, [H]ardOCP.
Toot, Toot!
Anonymous User - Wednesday, October 8, 2003 - link
What happened to your glasses, Anand?? Just by looking at it you can see clear differences in the way NV and ATI render Aquamark. And what about the missing FPS in TRAOD?? Instead of writing something like "NVIDIA really gets crushed by ATI's PS2 performance," you're saying "It is very clear that the way ATI handles rendering TRAOD's PS2.0 code is more efficient" and only show some _absolutely irrelevant_ PS2 ratios not correlating with each card in ANY WAY?!? Ooookay, so you're practicing for your politician career, aren't you? Honestly, that article feels so biased towards NV it's just not decent anymore... AnandTech was my favourite site when it comes to reviews, but after that I'd rather stick to another one when it comes to gfx cards, it seems. I really wondered how Anand could be the first site benching a GeForce 5950, and here we have the answer, it seems... gn8
RoninCS - Wednesday, October 8, 2003 - link
To #90, perhaps you should be a writer for AnandTech instead, since you seem to be much more critical, although it would appear you don't run a world-renowned hardware site. Talk about critical... sheesh.
Anonymous User - Wednesday, October 8, 2003 - link
I think AnandTech should consider that quantity doesn't equal quality. If it were only about fps, then lots of games would be perfect. But we were supposed to be talking IQ in this part. Well... I saw very little actually written about IQ in the article.
On the very first page, I immediately notice big differences in IQ. On the tank I see lots of places where the NVidia drivers don't seem to do any AA. And the details around the explosion are far more blurry. Those things are obvious although I'm looking at a reduced-size JPG! And I see no remark about IQ at all!?
Going on to F1 Challenge, to the 4xAA/8xAF shots. Or is it? I see absolutely no AA *anywhere* with the NVidia drivers. And it's really, really obvious in these shots!! Again, no comment about IQ at all?
Do I need to go on? Gunmetal also shows AA differences.
Finally, with Homeworld AnandTech notices too that there's no AA with the NVidia drivers.
Luckily, it's a known issue...
Jedi Knight. Well, here NVidia does do some AA. But it's really little compared to ATI's. No IQ difference???
Neverwinter. Look at the pillar to the right, and the plants on the left. Don't we again see clear AA differences, as if NVidia seems to forget those objects?
But it gets worse...
How on earth can anyone post these utterly ridiculous Tomb Raider and Wolfenstein screenshots??????
Come on! I've seen beautiful screenshots of streets in Paris in Tomb Raider. Perfect for testing IQ. And all we get is a 90% black screen? The same goes for Wolfenstein...
What were you guys thinking?
This is so incredibly far below the high standards that I'm used to from AnandTech...
Please look again at this article, and do a proper job of assessing IQ.
Anonymous User - Wednesday, October 8, 2003 - link
#41, maybe you and your wife should start a website; you could benchmark ATI cards exclusively. That way ATI would always wind up on top. Admittedly, I'm an ATI junkie (I own a Radeon 8500 and plan to buy a 9600XT ASAP), but enough is enough. (By the way, what's up with the bread/butter analogy? You seem very fond of it.) Seriously, though, either of these cards is really fast, and aside from IQ differences, you couldn't tell them apart. A little question for anyone who would know, though: how much does IQ drop going from PS2.0 to PS1.4? I have Halo and I'm wondering how much better it would look on a DX9 card instead of DX8.1.
Anonymous User - Wednesday, October 8, 2003 - link
If you look at the Gunmetal screenshots, that is my only beef with ATI: the scenes are not rendering completely or properly. It has happened to me in a lot of games, black areas.
Anonymous User - Wednesday, October 8, 2003 - link
The article does seem somewhat comprehensive, that is true, but:
a) other sites reviewing the software did not come to the same conclusions, mainly problems with trilinear and AF again...
b) I have yet to see a review that claims to be unbiased have this much opinion sprinkled all over, mainly pro-nVidia, and it relies on the IQ comparison I refer to in a)
c) the drivers are beta and not WHQL, so who knows what we'll get as consumers
d) the hardware has not yet been announced formally by nVidia
e) it seems the choice of what to show on graphs is very subjective; TRAOD shows percentage drops with PS2.0, but what are the framerates?
I do hope this review is correct, because it would mean nVidia is back, but due to the above qualms I can't trust this review.
Anonymous User - Wednesday, October 8, 2003 - link
The article is extremely comprehensive, as one would expect from AnandTech. Some issues of note:
1. It was pointed out that the 5900 and the 5950, in many areas, performed almost identically. This doesn't bode well for nVidia.
2. I'm bothered by the tremendous frame rate difference between ATi and nVidia in some of the titles. It leads me to believe there's something deeper going on, and it's not just a simple card/driver issue.
3. It's nice to see the IQ back to where it should be, as visual quality should never be compromised for performance, unless the user makes the adjustments to do so.
4. I will admit it sort of seems that there is some bias towards ATi, but it's not flamingly apparent. Again, it is just my perception, and doesn't necessarily mean that there is.
5. The most accurate remark made in this review is simply that we are not in the world of DX9 games...yet. To that end, DX9 performance is not nearly as important as it will be. When it is, I think things will step up a few notches.
Anonymous User - Wednesday, October 8, 2003 - link
Nicely detailed article, and I appreciate the additional games for benchmarking. Any chance we could see the use of a flight/combat sim program like IL-2 or MechWarrior?
Anonymous User - Wednesday, October 8, 2003 - link
I don't know why everyone is believing the IQ results (or even trying to use Photoshop to check the differences). These pics are JPGs! They're already manipulated by the compression logic, and who's to say these pics are true?
Anonymous User - Wednesday, October 8, 2003 - link
Loooooong time reader, new poster. Excellent work, Anand and Co. I found the article very informative, and although certain folks don't enjoy reading your "opinions" on some of the benchmarks, I thought they were very appropriate. It will be interesting to see how the official driver releases function under the latest and greatest DX9 and OpenGL games...
Thanks for all your hard work and effort!
Mike
Anonymous User - Wednesday, October 8, 2003 - link
#78: The shots were not taken in the same frame.
Gunmetal, contrary to Aquamark, doesn't have such an option... that's why so many screenshots are taken at the beginning of a scene or at a dead spot.
Anonymous User - Wednesday, October 8, 2003 - link
#67: I seriously suggest that you upgrade everything else in your machine, reinstall your drivers and the game, and defrag.
Mine runs perfectly at 1280*1024 with max AF and displays between 40-60 fps all the way using the cg_draw command, and those are GAMEPLAY framerates... with sound, AI and all the whistles. I see no need for AA at that resolution though (not a nice IQ/performance trade there)... at 1024 it does wonders though.
capodeloscapos - Wednesday, October 8, 2003 - link
Why has nobody said anything about IQ in GUN METAL??? Only nVidia's 52.14 driver shows the fire in the Mech's gun.
What happened there???
Anonymous User - Wednesday, October 8, 2003 - link
#67 I think the lightsaber glow is horrible on the nVidia cards. The glow shines THROUGH the player's head. Looks to me like a bug. I like the ATI saber much better. (Most people's heads aren't empty, so light does not shine through. Maybe your experience is different? ;-)))
#76 Couldn't agree more. The blurry AA in Aquamark is crystal clear even in those tiny images. So how could the authors possibly miss that and proclaim that there are no IQ issues? Especially since they have looked at the fullscreen images and spent days on the article?
Also you can immediately see in all the small images that in general AA is better on the ATI card. This is nothing new, and not considered cheating by Nvidia. It's just that most know that there is a quality difference.
But shouldn't that at least be mentioned in an article that is focused on image quality?
Why no screenshots of Splinter Cell? We should just believe the authors on that? With the Aquamark pictures they have shown that we can't take their word for it. So I'd really like to see those screenshots too. Same for EVE.
And I was really surprised that they didn't know that the water issue in NWN was NOT ATI's fault. They claim that they have surfed forums on NWN issues. In that case they should have known that. (One look at Rage3D would have been enough.)
And on top of this the TRAOD part. It seems they typed more text on TRAOD than they did on the entire rest of the article. No wonder people frown at the TRAOD part.
All in all, I can see that much work went into the article, but I feel that it could have been much better.
As it is now it is left to the reader to find the image issues in the small pictures. But I would expect the author to point me to the image issues.
Anonymous User - Wednesday, October 8, 2003 - link
#74, conclusions are one thing, objective journalism is another. There are clear differences even in the small and relatively badly chosen images posted with the article, yet all we get to read is "there are no IQ issues".
Thus, either the authors of the article are not competent enough (maybe they were simply too tired after the testing...) , or they are intentionally ignoring the differences.
Iger - Wednesday, October 8, 2003 - link
I just can't stand aside and not thank the authors. The job they've done in this article is amazing, and the site was and will be my all-time favourite! Thank you! :)
Malichite - Wednesday, October 8, 2003 - link
I am extremely confused by the posts here. Many ATI guys seem to think AT unfairly favored the nVidia cards. Did we read the same article? In the end I came away with the opinion that while the new Det 52.xx drivers help, and things may get better for nVidia, ATI is still the better choice today. Did I miss something? Additionally, for all the guys claiming TR:AOD is a great game: yeah, we all know only the truly *great* games pull a 51% rating over on www.gamerankings.com (based on 21 media reviews).
Anonymous User - Wednesday, October 8, 2003 - link
Just what kind of world do we live in where a guy has to explain why he's not a fanboy before he can express his opinion, anyway? The worst part is, you people who do this are completely justified in your actions, because if you don't explain why you're not an ATi/nVidia fanboy then people call you one. God... can't we argue without calling others fanboys for once?
Anonymous User - Wednesday, October 8, 2003 - link
I forgot to add the j/k part... I don't want you taking my poor attempt at humor the wrong way... ;) Anyways, I don't know what all the commotion is about... shouldn't you (ATI folk) be happy that nVidia is making vast improvements?
I would feel sympathetic for people who THOUGHT they wasted $400+ on a card that didn't seem to deliver the performance it promised...
Anonymous User - Wednesday, October 8, 2003 - link
What kind of biased, crappy, unprofessional review shows percentage drops for enabling PS2.0 without showing framerates? If fps are around 30 to begin with, the percentage drop makes no difference, because the game is rendered unplayable! And who benchmarks beta drivers not available to the public on hardware not yet announced? This reeks of a $ payoff, and it seems like AnandTech has thrown its integrity to waste. I wish that on the 10th, when nVidia announces the NV38, they would also release these drivers to the public; then some serious review site could actually test the hardware (and software; forgive my skepticism, but nVidia sure earned it this past year) and show us what nVidia is bringing to the graphics field.
Disappointed by nVidia and now by AnandTech
Anonymous User - Wednesday, October 8, 2003 - link
Sure you do... ;)
Anonymous User - Wednesday, October 8, 2003 - link
Not everyone talking about IQ differences here is a fanboy. Look at the images at the bottom of the Aquamark 3 IQ page (highest quality AA, 8xAF). The nVidia 52.14 image is blurred; much detail is lost, especially around the explosion. The Catalyst 3.7 image is way sharper, yet its AA is smoother (look at the car body above the wheels), and it loses much less detail around the explosion. The differences are much more than "barely noticeable".
The tiny images don't give much credit to the article, though.
(Before anyone calls me an ATI fanboy: I have a GeForce FX 5600 dual DVI.)
Anonymous User - Wednesday, October 8, 2003 - link
TR:AOD is a terrible game; most people just like it because it's such a hot spot for all this benchmarking shite. At times like this, I'm glad I use a Matrox Millennium II! ...Okay, kidding, but still.
Anonymous User - Wednesday, October 8, 2003 - link
Didn't anyone notice that ATI doesn't do dynamic glows in Jedi Academy with the 3.7 Cats!? Look at the lightsabre and it's clearly visible. They only work with the 3.6 Cats, and then they REALLY kill performance (it's barely playable at 800*600 here on my Radeon 9700 PRO).
Anonymous User - Wednesday, October 8, 2003 - link
Funny to see that ATI fanboys can't believe that nVidia can ship drivers without cheats. And nobody talks about the issues in TRAOD with ATI cards; really very nice...
Anonymous User - Wednesday, October 8, 2003 - link
WTH did you benchmark one card with unreleased drivers (something you said you would never, ever do in the past) and use micro-sized pictures for IQ comparisons? You might as well have used 256 colors.
The Catalyst 3.8s came out today - the 51.75 drivers will not be available for an indeterminate amount of time. Yet you bench with the Cat 3.7s and use a set of unreleased and unavailable drivers for the competition.
I suggest you peruse this article:
http://www.3dcenter.org/artikel/detonator_52.14/
from 3DCenter (german) to learn just how one goes about determining how IQ differs at different settings with the Nvidia 45's, 51's, and 52's.
Needless to say, everyone else who has compared full-sized frames in a variety of games and applications has found the 5X.XX nvidia drivers (all of them) do selective rendering, and especially lighting.
And why claim the lack of shiny water in NWN is ATi's fault?
Bioware programmed the game using an nvidia exclusive instruction and did not bother to program for the generic case until enough ATI and other brand users complained.
This is the developer's fault, not a problem with the hardware or drivers.
Anonymous User - Wednesday, October 8, 2003 - link
Nice article. I like that you benched so many games. Unfortunately you missed that the Det 52.14 driver does no real trilinear filtering in *any* DirectX game, regardless of whether you're using anisotropic filtering or not. This often can't be seen in screenshots but only in motion. Please have a look here:
http://www.3dcenter.de/artikel/detonator_52.14/
There is *NO* way for a GeForceFX user to enable full trilinear filtering when using Det52.14. No wonder the performance increased...
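For readers unfamiliar with the distinction being made here: bilinear filtering samples a single mipmap level, while true trilinear also blends between the two nearest levels by the fractional level-of-detail, which is exactly what hides mip banding in motion. A toy sketch of that blend, with a hypothetical "brilinear" variant that only blends in a narrow band around each level transition (illustrative math only, not the actual driver behavior):

```python
import math

def trilinear_sample(mip_levels, lod):
    """Blend between the two nearest mipmap levels by the fractional LOD.
    mip_levels: per-level sample values (stand-ins for bilinear taps)."""
    lo = min(int(math.floor(lod)), len(mip_levels) - 1)
    hi = min(lo + 1, len(mip_levels) - 1)
    frac = lod - math.floor(lod)
    return mip_levels[lo] * (1.0 - frac) + mip_levels[hi] * frac

def pseudo_trilinear_sample(mip_levels, lod, band=0.2):
    """'Brilinear': blend only within a narrow band around each transition;
    elsewhere snap to the nearest level (cheaper, but shows banding)."""
    frac = lod - math.floor(lod)
    if frac < 0.5 - band:
        frac = 0.0                                  # snap to lower level
    elif frac > 0.5 + band:
        frac = 1.0                                  # snap to upper level
    else:
        frac = (frac - (0.5 - band)) / (2 * band)   # remap band to 0..1
    lo = min(int(math.floor(lod)), len(mip_levels) - 1)
    hi = min(lo + 1, len(mip_levels) - 1)
    return mip_levels[lo] * (1.0 - frac) + mip_levels[hi] * frac
```

Away from the transition band the two variants return identical results, which is why the optimization is so hard to catch in static screenshots.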
Anonymous User - Wednesday, October 8, 2003 - link
TR:AOD is a fine game, you just have to play it... Sure, there are some graphical issues on the later levels, but there's nothing wrong with the game as such, and considering that it has made its way into a lot of bundles (Sapphire and Creative Audigy 2 ZS, to name two) I believe it will receive a fair share of gameplay.
Anonymous User - Wednesday, October 8, 2003 - link
You guys need to stop talking about Gabe Newell... for such a supposedly good programmer he sure needs to learn about network security... We all know he's got his head up ATI's rear end. The funny part is that they are bundling HL2 with the 9800XT (a coupon) when it isn't coming out until April now. Who's to say who will have the better hardware then? Doom 3 will likely be out by then. In 4 months, when the new cards are out, you guys won't care who makes the better card; the 12-year-old fanboys will be up in arms in support of their company. I owned the 5900U and sold it on eBay after seeing the HL2 numbers. I then bought a 9800 Pro from Newegg, having first tried ordering the 9800XT from ATI, which said it was in stock. 2 days later they told me my order was on backorder and hassled me when I wanted to cancel. One thing I'd point out is that War3 looks much better on the 5900U than the 9800. It looks really dull on the 9800, where it's bright and cartoony (like it should be) on the GeForce. Either way, who knows what the future will hold for both companies, but let's hope they both succeed to keep our prices low...
Anonymous User - Wednesday, October 8, 2003 - link
I took over #41's original post... I didn't like his tone :|
Anonymous User - Wednesday, October 8, 2003 - link
The IQ part was crappy at best: small screenshots in open, not-so-detailed areas, and sometimes there was no option for a big one to check. You can call me what you want, but there are quite a few reviews out there that will disagree BIG time with what has been posted about IQ here. And it is impossible that all of them are wrong on this at the same time.
Homeworld has shadow issues on ATI cards with Cat 3.7, yet that isn't shown there either... this goes both ways.
If you ask me, nVidia got its DX9 wrapper to work fine this time.
Anonymous User - Wednesday, October 8, 2003 - link
Um, what happened to post #41, where the guy detailed all the inconsistencies of the IQ comparisons? Please don't tell me you guys actually modded that post... I haven't had the chance to go through everything yet, but in those few I did, I definitely saw differences even in these minuscule caps (how about putting up some full-size links next time, guys?), particularly in the AA+AF ones. It's obvious there's still quite a difference in their implementations.
I was also surprised at the number of shots that weren't even of the same frame. Honestly, how can you do an IQ test if you aren't even going to use the same frames? A split second's difference is enough to change the output because of data/buffer/angle differences etc.
Personally I wonder what happened to the old school 400% zoom IQ tests that Anand was promising and I'm fairly disappointed despite the number of games in this article.
That said, I am glad that Nvidia didn't botch up everything entirely and hopefully they'll have learned their lesson for NV4x.
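The same-frame complaint is worth taking seriously: even a one-frame offset (or JPEG re-compression, as another poster notes) changes pixels, so any honest IQ diff needs identical, losslessly captured frames. A toy sketch of the kind of per-pixel comparison people run in Photoshop, written in plain Python over raw RGB tuples (a hypothetical helper, not anything from the article):

```python
def max_channel_diff(pixels_a, pixels_b):
    """Largest absolute per-channel difference between two equal-length
    lists of (r, g, b) tuples - 0 means the captures are pixel-identical."""
    if len(pixels_a) != len(pixels_b):
        raise ValueError("screenshots must have the same dimensions")
    return max(
        abs(ca - cb)
        for pa, pb in zip(pixels_a, pixels_b)
        for ca, cb in zip(pa, pb)
    )

# Identical captures diff to 0; a capture taken a frame later (or run
# through JPEG) will not, even if the scene looks the same to the eye.
```

A nonzero result on supposedly matching shots tells you the comparison itself is contaminated before you can say anything about the cards.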
Anonymous User - Wednesday, October 8, 2003 - link
Where can I get the 52.14 drivers?
Anonymous User - Wednesday, October 8, 2003 - link
I'm impressed. I've never seen a review that actually has the games I play most frequently in it. I've been uninterested in FPS games since Quake II. In particular, I like Neverwinter Nights, C&C Generals, SimCity 4, and to some extent WarCraft III (and by extension, their expansions). I was under the impression that SimCity 4 was CPU-bound under almost all circumstances; it's useful to have that shot down.
I also like AA and AF. You can imagine the slideshows I play with my Athlon 2100+, 1GB DDR, and Radeon 64MB DDR (a.k.a. 7200)
Now I just need to see the ATI AIW 9600 Pro reach general availability.
Anonymous User - Tuesday, October 7, 2003 - link
Thank you so much for this review... the detail is spectacular. After reading and looking at all 60 pages... I am really tired. Thanks again for your dedication!
Anonymous User - Tuesday, October 7, 2003 - link
Um, why are there no comparisons using two monitors with different cards running? Gabe of Valve said there is a set of drivers that detects when a screenshot is being taken. Or did Anand just get duped by nVidia?
Anonymous User - Tuesday, October 7, 2003 - link
I would like to know:
1. Why fps was left out of TRAOD.
2. Why the weird, never-seen-before TRAOD PS2.0 percent-loss graph? How about giving us good ole fps, which is what we have been seeing for years and what we are used to? At least have both if you are going to introduce new graphs.
3. How the reviewer seems to know "Nvidia is aware of it" and never seems to know if ATI is aware of problems. I mean, he would have had to talk to Nvidia to know this. Did Nvidia pre-read the review and then tell him they are aware of a problem and will fix it?
4. What motivation do the reviewers have for helping Nvidia, or at least seeming optimistic? What has Nvidia done to earn this tiptoeing-around type of review? If anything, they have dug themselves a well-deserved hole. I'm talking about Nvidia's horrid behaviour as a company in the past 6 months. Why would they reward a company that pulls the stunts they have lately? Do they feel sorry for them?
All I can say is the tone of this review leads me to think there is more to this than meets the eye.
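Point 1 is easy to illustrate with arithmetic: a percent-loss graph alone can't tell you whether the result is playable. Recovering the implied framerate takes one line (the baselines below are made-up examples; only the 77.5% figure is quoted elsewhere in this thread):

```python
def fps_after_drop(baseline_fps, percent_drop):
    """Absolute framerate implied by a baseline and a quoted percentage drop."""
    return baseline_fps * (1.0 - percent_drop / 100.0)

# The same 77.5% drop means very different things at different baselines:
# from 200 fps it leaves ~45 fps (playable); from 40 fps, ~9 fps (a slideshow).
```

Which is exactly why a percent-drop chart without absolute numbers leaves the playability question unanswered.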
Anonymous User - Tuesday, October 7, 2003 - link
#52, yeah, I'm sure people play games in windowed mode. How can you see the differences from such small screenshots? It's well known that Nvidia hacks, or shall I say "optimises", for benchmarks, giving no thought to IQ. This article displays blatant Nvidia @ss kissing. There was good reason Gabe didn't want his game benched with the Det 50.xx; take a guess: more hackery from nVidia. Also, Anand mentions certain anomalies with the GeForce FX in certain games but does not try to explore what those errors are and assumes nothing's wrong. In Homeworld the FX isn't even doing FSAA. Geez, I wish the Nvidia fanboys would get a clue and crawl out from under that rock they've been hiding under.
Anonymous User - Tuesday, October 7, 2003 - link
This is the most interesting article I have read for some time. First of all, I agree with #41... I think including this many games in the benchmark prevents Anand/Derek from making a detailed analysis of the games. But there is something more interesting: it seems that Anand and Derek tried to put out an article that hides the problems with both cards.
They also deliberately try to avoid favoring one company. In one sentence they claim ATI is best; in the next line they state otherwise.
As for the IQ comparison, many of the screen captures are either dark or cannot reflect what AF+AA is intended to do. If I just checked the small pictures, I would say that the IQ is really similar. However, more detailed analysis reveals other problems. Besides, the review of TRAOD is the worst I have ever seen. If they posted the frame rates, I am pretty sure everybody would be shocked to see the results. How could they not? Think about it: the performance percentage loss of the FX5950 is 77.5% for 1024x768 noAA/AF. Even if the game runs at 50 fps with PS1.1, the frame rate would drop to about 11 fps when you switch to PS2.0 in this case. However, referring to Beyond3D is interesting, because that site has very detailed benchmarks of both the 5900 and the 9800 with this game (I strongly recommend those articles to anyone who really wants to learn the actual performance of the 5900 and R9800 in PS2.0 scenarios).
But I totally disagree with Anand on one thing: TRAOD performance is a real indicator for future games that will use PS2.0 by default. The game's v49 patch also uses HLSL to compile directly to ps2_0_x, which actually targets Nvidia's NV30 architecture, and the compiled code runs faster than Cg-compiled code. Even in this case, the 9800 Pro still runs much faster than the 5900 (I am talking about 70 fps vs. 35 fps).
I guess nobody wants to see his/her $500 graphics card crawl in new games which use PS2.0 by default just one year after purchasing the card. And no! I am not an ATI fanboy, just a tech fan who does not tolerate seeing how some sites really misdirect their readers because of their connections to the IHVs.
Anonymous User - Tuesday, October 7, 2003 - link
Oh, come on, fanboys, stop yelling at Anand for not making nVidia look bad enough. His job is to benchmark, not to rant. Jesus Christ, you people annoy me. Try printing out the three images from any given test WITHOUT looking at which one's the Radeon. And no, I'm no nVidia fanboy, nor am I defending nVidia. I use a softmodded Radeon 9500 and I absolutely love it. I have never, ever put a GeForce FX in my system, and I'm happy to say this. But can't you people just let go?
Anonymous User - Tuesday, October 7, 2003 - link
FIFA 2004!!! That alone makes this worthwhile!!!
Rogodin2 - Tuesday, October 7, 2003 - link
You should use IL-2 Forgotten Battles with "perfect" detail settings (pixel-shaded water and a system knee-bringer) for a simulation benchmark.
rogo
Dasterdly - Tuesday, October 7, 2003 - link
I could see IQ differences on the dune buggy, left side, top. The ATI pic has better detail. Please add a 3DMark benchmark.
Good review so far almost 1/2 way thru :)
Anonymous User - Tuesday, October 7, 2003 - link
#41 "[...] who butters your bread???" That's an interesting question. I suspect he does, though my question is "who wants to know?" ;)
In regard to your other question, "Why can't we have a true winner now?": as for myself, I'm going to give Derek and Anand the benefit of the doubt.
It seems to me that they realize NVIDIA attempted to do something unique with its 5000 series, in that it does not exactly hold to the DirectX 9 spec. For instance, it has 16-bit and 32-bit rendering modes while DX9 requires 24-bit, which ATI does (refer to the Half-Life 2 and DOOM III reviews). In the shader area NVIDIA holds FAR more code (micro-ops) than ATI. Also, if you check back to Anand's original post on the ATI and NVIDIA shootout(s), where there is a comparison between AA and AF, NVIDIA was a CLEAR winner. I seem to recall a while ago that NVIDIA claimed ATI didn't do TRUE AF so they were therefore CHEATING. Boy, did that one come back around with teeth, huh?
What I'm saying is NVIDIA tried to one-up ATI by trying to do more; unfortunately it seems they tried to do TOO much and ended up pulling SHADY maneuvers like the whole FutureMark mess. They should have instead focused on the spec - DX9 and the Microsoft shader/pixel code path - and not tried to pull a GLIDE like 3DFX (excuse the parsed English).
So, hopefully NVIDIA learns from its mistakes, modifies its silicon to the spec, and gives us all BETTER cards to choose from come March/April.
As far as the authors are concerned, Anand and Derek seem to be attempting JUSTICE (helping the party who needs the most help, and treating all parties equally) - which in this case seems to be NVIDIA. The authors are helping NVIDIA by dropping HEAVY hints like what you stated:
" Next year will be the year of DX9 titles, and it will be under the next generation of games that we will finally be able to crown a true DX9 winner. Until then, anyone's guess is fair game." and
" If NVIDIA can continue to extract the kinds of performance gains from unoptimized DX9 code as they have done with the 52.14 drivers (without sacrificing image quality), they will be well on their way to taking the performance crown back from ATI by the time NV40 and R400 drop.".
If NVIDIA takes heed of these CONSTRUCTIVE statements then the entire gaming community could benefit - in better prices and higher quality, from which the customer usually benefits (AMD vs. INTEL sound familiar?).
So, let us be easy and enjoy the night. Time will tell.
Cheers,
aka #37
PS: Derek, please excuse me for leaving out your name before. The article was well written.
Anonymous User - Tuesday, October 7, 2003 - link
Regarding my previous post #44, I wanted to write:...the difference **between AA/AF and noAA/AF** is very noticeable in the game...
Jeff7181 - Tuesday, October 7, 2003 - link
Can you say "highly programmable GPU?" I can =)
Anonymous User - Tuesday, October 7, 2003 - link
Why didn't you guys wait for Catalyst 3.8? It's out tomorrow and is reported to fix many IQ problems in games like NWN. What would a couple of days have hurt, especially since this article is going to be irrelevant after the Cat drivers are released tomorrow?
Anonymous User - Tuesday, October 7, 2003 - link
Note: the AA/AF and noAA/AF images of Warcraft3 have been mixed up for the NV 52.14. It tells a lot about the value of the screenshots that it takes careful inspection to find this error. I have played a lot of War3 recently and the difference is very noticeable in game, even with this GF4.
Anonymous User - Tuesday, October 7, 2003 - link
#18 It's not a problem figuring out the graphs; it's just weird that he would choose that type of graph, excluding FPS. BTW, I own a 5900U and a 9700 Pro.
I don't like people avoiding PS2.0 tests. My 5900 sucks at them. I paid too much for what I got in the 5900. I try to get good bang for the buck; the 5900 is not it.
Anonymous User - Tuesday, October 7, 2003 - link
...
DerekWilson - Tuesday, October 7, 2003 - link
First off... thanks, Pete ;-) Secondly, Anand and I both put a great deal of work into this article, and I am very glad to see the responses it has generated.
Many of the image quality issues from part 1 were due to rendering problems that couldn't be captured in a screenshot (like jerkiness in X2 and F1), or a lack of AA. For some of the tests, we just didn't do AA performance benchmarks if one driver or the other didn't do what it was supposed to. There were no apples-to-anything-other-than-apples tests in this review. The largest stretch was X2, where the screen was jerky and the AA was subpar. But we definitely noted that.
TRAOD isn't a very high quality game, and it certainly isn't the only DX9 (with PS2.0) test on the list. Yes, ATI beat NV in that bench. But it's also true that ATI won most of the other benchmarks as well.
Anyway, thanks again for the feedback, sorry BF1942 couldn't make it in, and we'll be bringing back a flight sim game as soon as we tweak it out.
J Derek Wilson
Anonymous User - Tuesday, October 7, 2003 - link
Didn't Gabe Newell complain about screen capture "issues" with the nVidia 50.xx drivers that show better image quality in screenshots than actually shows up in game? Anand spoke about image quality problems in almost every test in part 1, but I see almost nothing wrong with the screencaps in part 2.
Can you verify this, Anand?
Anonymous User - Tuesday, October 7, 2003 - link
No difference in IQ, huh? Am I the only person to notice an IQ difference between the AA+8xAF pics of Aquamark3?
http://images.anandtech.com/reviews/video/roundups...
http://images.anandtech.com/reviews/video/roundups...
It's funny how Anand and Derek did not comment on this. Maybe they missed it because they based their comparison on those tiny images. Ah, so that's what full-sized images are needed for?!
Anonymous User - Tuesday, October 7, 2003 - link
How very balanced of you, #30. Let us be patient; Anand is asking questions on OUR behalf in order to REVEAL truth.
I'm focused on the questions and the answers. Where is your focus?
AgaBooga - Tuesday, October 7, 2003 - link
#33, that's what came to my mind as soon as I read this article. I think that Anand may have just provided some input, done testing, or just edited it slightly...
Anonymous User - Tuesday, October 7, 2003 - link
The IQ shots are not the best I could imagine. Some of them are cropped so that you can't see a lot of detail: UT2003, Aquamark3, Wolfenstein.
Some of them are set up so that you wouldn't get any possible artifacts with texture filtering, because of the high camera angle: Warcraft3, C&C Generals.
The Tomb Raider, Aquamark and Wolf screenshots are also too dark to notice anything. And I don't see any sign of a DX9 shader in either the Halo or the TR shots, so we have no idea of DX9 image quality.
But kudos for all the testing you've done, must have been a lot of hard work.
Anonymous User - Tuesday, October 7, 2003 - link
#30, ATI has not released performance drivers for a long time now, and they already said don't hold your breath on performance increases coming in the 3.8s either. The main focus since the 3.1s seems to have been bug fixes, with slight performance improvements in various games. 3.8 = more features and bug fixes, with probably slight performance improvements here and there in specific games.
Anonymous User - Tuesday, October 7, 2003 - link
Derek probably wrote the whole article while Anand was behind him cracking his whip. So I dunno about this "supposed" two authors!
Anonymous User - Tuesday, October 7, 2003 - link
Would all the fanboys please take a deep breath or troll elsewhere? I swear to god some of you people will go out of your way to look for bias where there isn't any.
I own a 9800 Pro and I for one am glad that it seems like NVIDIA has closed the gap considerably; their customers deserve it.
Anonymous User - Tuesday, October 7, 2003 - link
Great review, I love the IQ shots. I too am waiting to see the 9600 XT review, though.
AgaBooga - Tuesday, October 7, 2003 - link
To those of you who mentioned Anand a few times, you should also note this was written by two authors, or at least worked on together by two authors, so you may see different "types" of responses and analyses of similar results when they're done by different people. I think we should wait for the Cat 3.8 article before we jump to too many conclusions.
PKIte - Tuesday, October 7, 2003 - link
This is the way I take screenshots in the Final Fantasy XI benchmark 2:
- Use Hypersnap-dx
- Enable directx capture in Hypersnap
- Change Hypersnap “Quick Save” settings to repeat capture every 5 seconds
- Launch Final Fantasy XI benchmark 2 menu
- When you click the "START" button, press "Print Screen" once the resolution changes.
Wow this is the biggest video card review I have ever read: Awesome!!
Anonymous User - Tuesday, October 7, 2003 - link
>Right now NVIDIA is at a disadvantage; ATI's hardware is much easier to code for and the performance on Microsoft's HLSL compiler clearly favors the R3x0 over the NV3x
Ever heard of the ps2_a compiler target?
Anonymous User - Tuesday, October 7, 2003 - link
Wasn't Anand allowed to use ShaderMark v2.0 for determining the pixel shader performance?
Anonymous User - Tuesday, October 7, 2003 - link
lol, reading this article took me a half hour. This article is great, but it can be improved.
Anonymous User - Tuesday, October 7, 2003 - link
Why didn't Anand review BF1942? :(
Anonymous User - Tuesday, October 7, 2003 - link
Why is Anand bashing Tomb Raider? And what's up with the PS 2.0 graph? Why not just post the fps? It makes it seem like NVIDIA is beating ATI. Also, why are beta drivers being tested with NVIDIA? Should have used Cat 3.8 for the Radeon.
Anonymous User - Tuesday, October 7, 2003 - link
I'm still waiting for a site to post a review of the 9600 XT.
Pete - Tuesday, October 7, 2003 - link
Overall, a good read. Thanks, Derek and Anand.
Anonymous User - Tuesday, October 7, 2003 - link
I'm not biased towards either card; I myself own a 9800 Pro. What concerns me is the immaturity shown by other ATI card owners. You guys act like NVIDIA can never measure up to ATI (which is so untrue). There was little to no difference in the IQ and benchmark results (with a few exceptions, and explanations were given for the most part). Also keep in mind that the 9800 XT specs are higher than the 5900/5950's, and it still managed to get beat in some of the tests. Anyway, good job NVIDIA, you guys are certainly headed in the right direction. I was a bit sad to see my card excluded though :( ... they said they'll benchmark the value cards soon; I hope to see mine there ;)
PS: I could be wrong about the specs, but I do remember Anand saying the XT had higher memory bandwidth (which could've accounted for some performance differences).
all in all, a good review, ill be waiting for more updates.
Anonymous User - Tuesday, October 7, 2003 - link
#18, he says he saw it; he doesn't know why it was there. There is no reason to exclude regular fps graphs, especially since people want to know the fps of this game, since it is the ONLY truly DX9 game in the entire suite.
Anonymous User - Tuesday, October 7, 2003 - link
Hey Anand, did NVIDIA's check arrive yet?
Anonymous User - Tuesday, October 7, 2003 - link
Haha, yet again we see fanATIcs (#10, #14) coming out of the woodwork to claim that AnandTech's review is either biased or NVIDIA is still cheating. lmao, losers!
And by the way #14, you're plain dumb if you couldn't figure out that the TR:AOD graphs were showing a percentage difference. Christ, read the review.
Anonymous User - Tuesday, October 7, 2003 - link
You need to look at the FSAA each card employs... go back and look again at the screenies, this time looking at all the jaggies on each card... especially in F1, it doesn't even look like NVIDIA is using FSAA, while on the ATI it's smooth... I don't think it's a driver comparison, just the fact that ATI's FSAA is far better at doing the job. At least I think that's what he's talking about; it's hard to tell any IQ differences when the full-size screenies are not working, but poor FSAA kinda jumps out at you (if you're used to smooth FSAA).
Also worth noting: NVIDIA made great jumps in performance in DX9, but nothing that actually used PS2.0 shaders :(
Anonymous User - Tuesday, October 7, 2003 - link
#14 Blurred? Are you not wearing your glasses or something? Nice and sharp for me...
Anonymous User - Tuesday, October 7, 2003 - link
Of course you do, you're a fanATIc...
Anonymous User - Tuesday, October 7, 2003 - link
I like the way he discounts Tomb Raider, saying it is just not a good game. That's a matter of opinion. It almost seems like he tries to undermine that game before revealing any benches. And the benches for that game are not done in FPS, but as a percentage lost with PS2.0.
On first inspection of the graphs, it appears that NVIDIA is leading in Tomb Raider. But if you look at the blurred print on the graph, it does say "lower is better." Very clever!
Why no FPS in that game?
Nice information in this review but it almost seems that he is going out of his way to excuse Nvidia.
I smell a rat in this review.
Anonymous User - Tuesday, October 7, 2003 - link
#3, #7: If you take the screens into Photoshop and observe the result of their 'difference', you'll see that there's a fairly significant difference between the 45s and the 3.7s, but almost no difference whatsoever between the 52s and the 3.7s. For most of these screenshots it's impossible to do this, since the shots aren't necessarily from the exact same position each time; try the UT2K3 ones, for example. Also, these are JPEGs, so there'll be a little fuzz due to differences in compression.
Also, if I need to take two screenshots into Photoshop to be able to discern any difference between them, that's really saying a lot. And since we can't refer to a reference software shot, it could be ATI's driver that's off for all we know.
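For anyone who wants to try the Photoshop 'difference' trick without Photoshop, it can be approximated in a few lines of plain Python. This is just a sketch with tiny made-up three-pixel "screenshots" (a real comparison would load the actual image files); the idea is that identical captures produce all zeros, while JPEG compression fuzz shows up as small nonzero values:

```python
# Approximate Photoshop's "Difference" blend mode: for each pixel,
# output |a - b| per channel. Identical screenshots yield all zeros.

def difference(img_a, img_b):
    """img_a, img_b: lists of (R, G, B) tuples of equal length."""
    if len(img_a) != len(img_b):
        raise ValueError("screenshots must be the same size")
    return [tuple(abs(ca - cb) for ca, cb in zip(pa, pb))
            for pa, pb in zip(img_a, img_b)]

def max_deviation(diff):
    """Largest single-channel difference; 0 means pixel-identical."""
    return max((c for px in diff for c in px), default=0)

# Tiny hypothetical "screenshots": three RGB pixels each.
shot_a = [(10, 20, 30), (200, 200, 200), (0, 0, 0)]
shot_b = [(10, 22, 30), (199, 200, 205), (0, 0, 0)]

d = difference(shot_a, shot_b)
print(d)                 # [(0, 2, 0), (1, 0, 5), (0, 0, 0)]
print(max_deviation(d))  # 5
```

In practice you'd want to treat small deviations (say, a max under 8 or so) as JPEG noise rather than a real rendering difference, which matches the "little fuzz" caveat above.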
In any event, I'm pleasantly surprised with NVIDIA. Their IQ has definitely caught up, and their performance is quickly improving. Hopefully the Cat 3.8s will pull a similar stunt.
Anonymous User - Tuesday, October 7, 2003 - link
No, he's just a "fanny"
AgaBooga - Tuesday, October 7, 2003 - link
They must have a reason for choosing those drivers. AnandTech has been around long enough for that :)
The reason is probably that when they started this benchmarking, they had soooo many games, resolutions, AA and AF levels, times the number of different cards, etc. That takes quite some time. Had they waited for the newer ATI drivers, it might have delayed publishing this article by one or even two weeks. Also, they did mention they will do a followup article with the new drivers, so patience is the key here.
Anonymous User - Tuesday, October 7, 2003 - link
#8 seems like a fanboy himself
dvinnen - Tuesday, October 7, 2003 - link
Well #8, NVIDIA was able to do it with the wonder driver; I don't see why ATI can't.
Anonymous User - Tuesday, October 7, 2003 - link
LOL, the ATI fanboys are already coming out of the woodwork. Listen #3 and #7, it's a fact: there is no IQ difference at all between the 50 Dets and the 3.7 Cats. And if you honestly believe you're going to see much of a difference with the Cat 3.8s... you're just stupid.
Anonymous User - Tuesday, October 7, 2003 - link
Yeah, I agree with what #3 said... hardly any commentary on the IQ. And if I'm not mistaken, aren't the new Cats going to be out tomorrow? If so, you might as well do the whole thing over again...
Anonymous User - Tuesday, October 7, 2003 - link
#3, go away you fanboy. There was absolutely no IQ difference between the CAT 3.7 and 52.14 drivers.
Anonymous User - Tuesday, October 7, 2003 - link
I'm sure Intel will be happy to hear that you 'upgraded' your Prescott to an FX.
dvinnen - Tuesday, October 7, 2003 - link
Errr, a lot of the picture links aren't working, like the Halo IQ one and TR:AOD.
Anonymous User - Tuesday, October 7, 2003 - link
Um... why was there nothing noting the difference between NVIDIA's and ATI's output in the IQ section? NVIDIA looks absolutely horrid compared to ATI.
AgaBooga - Tuesday, October 7, 2003 - link
I saw more opinions in this article leaning towards NVIDIA, especially around the Tomb Raider benchmarks; more specifically, starting with the page regarding compilers. I liked this article, but there could have been fewer opinionated viewpoints on this. Also, it might be better if we got the names of the authors with what they wrote, so we know who wrote what, because some of this sure didn't sound like something Mr. Shimpi has written in the past...
Anonymous User - Tuesday, October 7, 2003 - link
i'm still reading..............