40 Comments
ThePlagiarmaster - Friday, September 24, 2004 - link
#39 ianwhthse
Very funny stuff. OK, I don't want a Porsche now... LOL. I think I get it. If you own a Porsche, you spend so much time working to pay for it that you're too worn out to have sex? :)
I get your point on the obvious. I guess I just like to show someone a printout to go along with the stuff I tell them. They seem to like it better when they can SEE it also. Years of dealing with mechanics and appliance repair people (and bad PC people, I guess) have taught most people who step into a PC store that the guy selling them anything must be lying or stupid. Which is true in a lot of cases. Can't really blame them for being skeptics. Then again, some people are just VISUAL and can't seem to get it when the same thing is SAID to them instead of shown to them.
In any case, I hope they do another review soon with more benches.
ianwhthse - Thursday, September 23, 2004 - link
#31, ThePlagiarmaster:
Actually, check this out.
http://www2.autospies.com/article/index.asp?articl...
But maybe that's just because you have to be rich to afford one. Then you're old.
#38 (also ThePlagiarmaster): they were given limited time with the card. With a limited number of cards to pass around, and a deadline to meet, ATi didn't just give them the card and say, “send it back when you’re done.” In such a situation, Anandtech would need to try and do the most important tests. And which would we rather see? A comparison where every card has the exact same framerates +- 0.1 frames? Or benchmarks run in a fashion where we can actually TELL which card is better?
I totally understand where you're coming from. I just moved from 10x7 up to 12x10 on my 19 incher only about 3 months ago, myself, but you need to face facts, too. You're a smart person, who obviously has a lot of experience with computers, and like you said, we influence the people we talk to. So tell them the obvious point you brought up. That a weaker cpu is gonna limit how fast the gpu will be (to an extent). We all know that, here. Most of us keep that actively in mind when we're reading the graphs. We can figure that out (though not to an exact amount) on our own. It’s more important (especially for a limited-time initial review) that we find out what the graphics card is capable of.
I’m sure once retail boards start hitting shelves at a store near you, there will be plenty of “real-world” tests that will check things out at the level you’re talking about, but you can’t expect them to do what you’re asking for in an initial review of a reference card.
ThePlagiarmaster - Thursday, September 23, 2004 - link
#37 blckgrffn
ROFL. Depends on your video card AND your monitor. Mine has GREAT text at 120Hz@1024x768. If you'd read my post you'd already know I said it goes to 150 but looks like crap. Do you honestly think I wouldn't know to try other refresh rates? LOL. I'm an A+ certified PC tech (among other certs, this is the relevant one here) and own a PC business. I've owned a computer of some sort since the Apple // (back in the green/amber monitor days). I'd say you don't know that many people if everyone you know runs above 1024x768. I, however, SELL these things every day.
I didn't say anandtech shouldn't publish the numbers they did. I merely suggested they add one resolution to the picture. If the world was using higher resolutions as a whole, the web would be optimized for it. But it's not, IS IT? Apparently you don't play many games online. At 1600x1200 a LOT of games out there can't run without tanking. Even without being online, a top-end machine can't cut it at 1600x1200 (with all the candy turned on) in EVERY game out there as you suggest. Your experience would definitely not be described as BUTTER SMOOTH. I'd further think you don't play that many games, or you're just full of crap. Unless everything in your machine is overclocked you don't have a prayer of doing what you say. Congrats, you own a machine that the rest of us haven't figured out how to create yet.
If you don't like people talking back in a comments forum then leave. I merely responded to people that replied to my posts. Discussing the review is what this is all about, or did you not get that? What makes you think your favorite res is more important than mine (and the rest of the world)? Jeez, do you actually think the world revolves around you? Do you actually think most of the world owns the top cpu/gpu in their machines? Get real. I'd venture to guess that less than 5% of anandtech readers own both the top cpu (amd or intel) and the top gpu (Nvidia or Ati). Apparently you have no idea that the "middle class" even exists in our society. Sales of Intel/AMD's top cpu's say I'm right. They're so low they won't even tell us how many they sold in a quarterly report.
blckgrffn - Thursday, September 23, 2004 - link
ThePlagiarmaster, ENOUGH. We get your argument. I have a 21" and I NEVER play a game under 1600*1200. When the rig can't handle it anymore, it is time to upgrade, end of story. This is why I have a PC and not a console. I think that everyone I know plays at higher than 1024*768; even my dad on his 17" plays @ 1152*864. Thank you Anandtech for publishing numbers that I can use. The 1024*768 numbers are useless for me, just as the higher-res ones are for ThePlagiarmaster. By the way, running ultra-high refresh rates tends to make text more blurry than it would be at 75 or 80 hertz. Try it once and you will see what I mean. Try setting your desktop to 1152*864, you will probably like that too.
Staples - Thursday, September 23, 2004 - link
I am kind of worried about Xbox 2. ATI is doing a horrible job of putting a good price/performance card out there.
ThePlagiarmaster - Thursday, September 23, 2004 - link
#32 AtaStrumf
Actually I have 20/15 vision, with almost 20/10 in my left eye. Lasik surgery is great; for $3000 I can almost see like an eagle. If you'd read my post you'd already know I don't enjoy headaches. As such I run at 120Hz. My Viewsonic P225F will actually run at 150Hz@1024x768 but my vidcard doesn't handle that too well. This is another excuse to run at a lower res (no, not 800x600 or 640x480, mind you): you get a really high refresh rate. Most 19's can't handle 100Hz at 1600x1200. Like TrogdorJW said, there isn't much difference in the look above 1024x768. Can you really see the difference at 1280x1024? In most games I can't. I'd rather have a super-high refresh rate, and never see that dip in fps that happens in some games when the action really heats up.
Anandtech readers encompass a small number of people. However, we advise many people (as is the case with my customers). If I sell someone a $500 video card and it runs no faster than their neighbor's $200 card (because the guy is CPU limited in his games), I look like a fool or, worse, get bitched out. Sadly a lot of people do buy 17's, and with a $200 vid card or more sometimes. I'd like to have a better idea of where the cpu/gpu line is drawn in the sand.
I'm not saying throw away the high end benchmarks. I'm just saying I'd like to see the res that a HUGE portion of the population actually runs in included in the tests. Change your res to 1600x1200 and look at anandtech's website (or any other for that matter). See those huge bars of wasted space on the sides? Why does anandtech (and everyone else) optimize for 1024x768? Because 90% of the world runs in this res! On top of this, most don't like switching to a different res for each game depending on the fps they can get in each. It's just a pain in the A$$.
PrinceGaz
I agree CRT's are superior. But I don't agree 1600x1200 is the best res on a 19 or a 21 (well, a 21 maybe, but only if you're using graphics apps or CAD-type apps where higher res is VERY important and useful). You really like browsing the web while losing 1/3 of your screen? I take it you don't mind switching res all day (I highly doubt you browse that high). Most cards can't cut it at 1600x1200 without major frame hits (only the latest and greatest, and even then you'll switch to lower res often). The TI4200 (I have a 4400 in one machine) certainly is a card where you must be switching all day long on a game-by-game basis. That gets old to me, and I wouldn't even want to go there with my PC ILLITERATE customers (as #32 called them - perhaps rightly so).
Perhaps you're happy switching, and I'm not trying to take that away from you here. I'd just like to see a res that benefits recommendations to the average user (the largest population of PC users, that is). Is a hardware review site supposed to cater to the top 5% of the population, or the other 95% of the world? Don't get me wrong, I love knowing what the advantage is at the high cpu/high gpu end, but I don't get the opportunity to recommend that stuff very often. Do game designers make their games for the top 5% of pc users or aim them at the masses? Witness the popularity (still! ouch) of Geforce4MX cards and you'll see my point. I'm not asking them to throw out the highend stuff, nor to add 640x480 or 800x600 (ugh!). But 1024x768 is NORMAL. Of all the people you know, how many have 19's or bigger? If it's more than 50% you have a small circle of affluent people apparently. Again, not all that normal. While high res can be argued on large monitors, I'd argue right back that most monitors sold are 17in or smaller. The percentages just don't lie.
PrinceGaz - Thursday, September 23, 2004 - link
#30 - ThePlagiarmaster
Actually I *don't* have an LCD monitor myself, I was just saying that many people do. My main monitor is a 22" CRT and I would never consider exchanging it for even a top-of-the-range 20" LCD (same viewable area) as I feel CRTs are superior.
As #32 said, anyone who buys a 19" or worse still a 21" and only uses it at 1024x768 is nuts. 1600x1200 is usually the best resolution for 21/22" CRTs, and 1280x960 or 1280x1024 for 19" CRTs.
I generally play recent games at 1280x960, or 1280x1024 if that is all that is offered, but do sometimes need to drop that to 1024x768, and even 800x600 for Doom 3 as that is all my Ti4200 can manage. No point my upgrading it as I'm about to build a PCI-e system. In older games I play at 1600x1200 if it is available and it looks great. If not available I play at the highest resolution offered and crank up the AA. There is no point playing at a lower resolution if your card and monitor are capable of working well at a higher resolution.
#33- TrogdorJW
I assume you use an ATI rather than an nVidia card then? If you do use an nVidia card then there's an option in the drivers (since 55.xx I believe) in nView Display Modes -> Device Settings button -> Device adjustments... -> Display Timing tab, where you can tick 'Enable doublescan for lower resolution modes'. For me that makes 800x600 scan just like 1600x1200, and 640x480 is like 1280x960. They look *far* better with doublescan enabled than without on my 22" CRT. It just extends what is done normally at 512x384 to higher resolutions. For me, 1024x768 is unaffected by it because I choose a high refresh rate (well above what my monitor or card could do at 2048x1536).
If ATI don't have that option available, then they should add it, as it can't be very difficult to implement. Like I say, the drivers do it anyway at up to 512x384, so it's just a case of extending it.
TrogdorJW - Wednesday, September 22, 2004 - link
32 - Hey, back off the 21" users! ;) I have a 21" monitor that I routinely use at 1024x768 in games. The difference between that and 1280x960/1024 is not that great, and 1600x1200 is really still too small for my liking. Performance is also an issue. If I can run 1280x1024 at good frame rates, I will, but I also have no problem running 1024x768 where required. 800x600 and lower, of course, are a different story. I start to see horizontal lines on my monitor at those resolutions.
Anyway, the nice thing is that a $200 card is coming out that will have about the same performance as the 9800 Pro, and in some cases better performance. Hmmm... but my 9800 Pro cost $200 back in April. Heheh. The added features might be nice, but I'm not that concerned. If you want a 6600GT or X700XT in AGP flavor, the 9800 Pro is still a viable option if you can find it for $200 or less. JMHO.
AtaStrumf - Wednesday, September 22, 2004 - link
ThePlagiarmaster, enough with the 1024x768 rant.
You made your point, but you're forgetting that most of your computer-illiterate customers are not reading this site.
People who buy 21" monitors to run them at 1024x768, must have a few screws loose in their heads or are suffering from a serious vision impairment. I suppose you also run it at 50 Hz or sth like that.
Anyho' I bet most AT readers run at least 1280x1024 on a 19" monitor and that includes their games.
And anyway, if a customer refuses to part with $50 in return for a much better monitor, what makes you think they will surrender $200 for a graphics card???
They deserve Intel's extremely sh*tty integrated graphics engine and nothing else.
ThePlagiarmaster - Wednesday, September 22, 2004 - link
#30 PrinceGaz
So what you're saying is, everyone buys LCD's? NO? So everyone buys laptops to play games then? Neither of these is true. Most laptops are sold to business users. It's a rare person that specifically buys a laptop for games. LCD's are too expensive in large sizes (I know I don't sell many), and suck for games anyway. Only a FEW can run fast enough to play games without giving you headaches (eye aches?.. whatever). I hate BLUR. 1280x960 is not common. Unless you think a ton of people have widescreen LCD's at home (or widescreen laptops?) and bought them to play games?
Apparently you missed most of the point of the post. Is it worth upgrading from older cards at a normal resolution (meaning 1024x768 - do a poll, you'll find most run here)? Most people buy these things then wonder why they don't run much faster. With which gpu's are we cpu limited at 1024x768? By throwing the 9700pro (and lower) into these super high resolutions it might make someone (eh, a lot of people) think their card is junk and an upgrade to one of these will solve all problems. NOT... If you tossed a few 1024x768 tests in, someone might find they're cpu limited with the current card already. Tossing on an even more powerful card is pointless to these people. Too bad they wouldn't figure that out in a review such as this.
Why do you think people used to always run in 640x480 when testing cpus (which I hated, it isn't real-world)? Because some games are/were GPU limited above this. In order to eliminate the GPU they would test at 640x480. So yea, running in a lower resolution will sometimes let your cpu run wild (eh, produce high fps :) ). The point was, we're pretty much seeing the highest of both ends here. How does that help someone trying to figure out if a $200 card will help them get more fps? Look at #29's question.
I have a 21in and a 19in, both are in 1024x768. My dad has a 21in he runs in the same res. Most web pages are designed for this res, a lot of games are too. So most people run in this res. A Porsche is designed to do about 150+mph but do you see anyone doing that on the highway? No, but that doesn't mean getting from 0-60 is any less fun now does it? Even though you don't run it at 150mph it still gets the women doesn't it? Not too many high performance cars advertise their top speed. Why? Because nobody uses it anyway.
PC's weren't designed to play games. But some of them sure are fun today eh?
#28, I know that, you know that, but most of the world still saves a buck or two on the monitor. As much as I push 19inchers, people are just CHEAP. I still sell a good number of 15's! Even when I tell them a decent 19 would only cost them $50 more, and that they'd have to live with the 15in for the next 5yrs or so. Even on my 21 I don't see how 1024x768 is tunnel vision though. The web is great, pics are fine, I don't have to screw with font sizes all the time to get some things right, and game interfaces are ALL designed to make 1024x768 LOOK perfect. They may design for others also, but they make sure this res is working as it's the most used.
PrinceGaz - Wednesday, September 22, 2004 - link
#27 ThePlagiarmaster:
Do you know what a paragraph is?
Anyway, 1280x960 or 1280x1024 is becoming the more common resolution used by many people with fairly recent systems, even if it's only because 1280x1024 is the native resolution of their LCD display, so anything else looks inferior.
How fast someone's CPU is really only determines the maximum framerate that can be achieved in any given game sequence regardless of resolution. The CPU itself won't churn out frames more quickly just because the graphics-card is rendering at a lower resolution. That answers the first half or so of your post.
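Put another way, here's a back-of-the-envelope sketch of the idea (the numbers are made up purely for illustration, not measured from any of these cards):

```python
# Crude model of a game frame: the CPU does a fixed amount of work per frame
# (game logic, draw calls), while the GPU's work scales with the pixel count.
# All numbers below are invented for illustration only.
def fps(cpu_ms_per_frame, gpu_ms_per_pixel, width, height):
    gpu_ms = gpu_ms_per_pixel * width * height
    frame_ms = max(cpu_ms_per_frame, gpu_ms)  # the slower stage sets the pace
    return 1000.0 / frame_ms

for w, h in [(1024, 768), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: {fps(10.0, 8e-6, w, h):.0f} fps")

# 1024x768:  100 fps  <- capped by the CPU's 10 ms per frame, not the card
# 1280x1024:  95 fps
# 1600x1200:  65 fps  <- only here does the GPU become the limiting factor
```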
As the X700 series are upper mid-range cards, they are intended to be used at quite high resolutions, not 1024x768 or less. The tests showed the X700XT was easily capable of producing a more than satisfactory framerate at 1280x1024 in every game tried, including Doom 3, so why run more tests at 1024x768? Only if it were a slower card which could only manage 30-40fps or less at 1280x960 would tests at lower resolutions be worthwhile.
kmmatney - Wednesday, September 22, 2004 - link
Since these are now "low-end" cards, it would be great to see how they perform with slower cpus. I still have a lowly XP 2400+ Thoroughbred... and I'd rather spend money on my video card than another MB/CPU, if it can perform (at 1024x768).
Chuckles - Wednesday, September 22, 2004 - link
I don't know about you, #27, but I think 10x7 is tunnel vision. Decent sized monitors are not all that expensive, and they allow you to do so much more with the space.
ThePlagiarmaster - Wednesday, September 22, 2004 - link
What I want to know is how everything performs at 1024x768 with and without 4xAA/8xAF. Let's face it, 95% of the people running these games are NOT doing it in anything higher. To cut this res out of everything but Doom 3 (an oddball engine to begin with) is ridiculous. Sure, higher res shows us bandwidth becomes a big issue. But for most people running at 1024x768 (where most of us have cpu's that can keep a decent fps), does bandwidth really matter at all? Is a 9700pro still good at this res? You have to test 1024x768, because all you're doing here is showing one side of the coin: people who have the FASTEST FREAKING CPU's (eh, most don't - raise your hand if you have an AthlonFX 53 or A64 3400+ or better? Or even a P4 of 3.4GHz or faster? I suspect most hands are DOWN now) to go with the fastest GPU's. Most people cut one or the other. So you need to show how a card does at a "NORMAL" res.
I usually can't even tell the difference between 1024x768 and 1600x1200. At the frenetic pace you get in a FPS you don't even see the little details. Most of us don't hop around in different resolutions for every different game either. Most of my customers couldn't even tell you what resolution IS! No, I'm not kidding. They take it home in the res I put it in and leave it there forever (1024x768). If you're like me you pick a res all games run in without tanking the fps. Which for me is 1024x768. I don't have to care what game I run, I just run it. No drops during heated action.
I hope you re-bench with the res most people use so people can really see: is it worth the money or not at the res the world uses? Why pay $200-400 for a new card if the 9700pro still rocks at 1024x768, and that expensive card only gets you another couple fps this low? I know it gets tons better at much higher res's, but at the normal person's res does it show its value or not? In Doom it seems to matter, but then this game is a graphical demo. No other engine is quite this punishing on cards. A good 70% or so of my customers still buy 17inchers! Granted some games have multi-res interfaces, but some get really small at larger resolutions on a 17in.
This article is the complete opposite of running cpu tests in 640x480, but yields the same results. If nobody runs at 640x480, how real-world is it? If "almost" nobody runs in 1600x1200, should we spend more time looking at 1024x768 where 90% or so run? That's more real world, right? 1600x1200 is for the fastest machines on the planet. Which is NOT many people I know, and I sell pc's... LOL.
AtaStrumf - Wednesday, September 22, 2004 - link
At the very least have a look at the Far Cry and Halo results. They really seem to be upside down.
I don't know who's making the mistake here, but it's something that needs looking into.
AtaStrumf - Wednesday, September 22, 2004 - link
Derek, I think your GPU scores urgently need updating. We need to be able to compare new cards to old ones and we just can't do that reliably right now. Have a look at xbitlabs' test results:
http://www.xbitlabs.com/articles/video/display/ati...
Relative positions between the 9800 XT and X700 XT are more often than not different from your results.
In their results the 9800 XT fares much better relative to the X700 XT. We might be drawing the wrong conclusions based on your scores.
Da3dalus - Tuesday, September 21, 2004 - link
Quite clearly a win for nVidia in this match :)
Hey Derek, are you gonna do a big Fall 2004 Video Card Roundup like you did last year? That would be really nice :)
jm0ris0n - Tuesday, September 21, 2004 - link
#17 My thoughts exactly! :)
DerekWilson - Tuesday, September 21, 2004 - link
#8:
ATI has stated that they will be bridging the RV410 back to AGP from PCIe -- they will not be running separate silicon. They didn't have any kind of date they could get us, but they did indicate that it should be available before the end of the year. It's just hard to trust having such distant dates thrown around when both ATI and NVIDIA have shown that they have problems filling the channel with currently announced products.
#18:
This is likely a result of the fact that only the X700 XT, 6600 GT, and X600 XT were run with the most recent drivers -- the 6800 series cards are still running on 61.xx while the 6600 GT was powered by the 65.xx drivers. We are looking into a driver regression test, and we will take another look at performance with the latest drivers as the dust starts to settle.
Aquila76 - Tuesday, September 21, 2004 - link
OK, I phrased the first part of my post VERY badly. In my defense, I had not yet had any coffee. ;)
What I was trying to get across was that ATI does OK competing with NVidia in DX games, but still gets killed in OpenGL. They used to smoke NVidia in DX, but now NVidia has fixed whatever issues they had with that and are making a very good competitive card to ATI's offering. The 6600GT is clearly the better card here, for either D3 or HL2 engines.
Entropy531 - Tuesday, September 21, 2004 - link
Didn't the article say the pro (256mb) was the same price as the XT (128mb)? It does seem odd that the 6600s are only pci-e. Especially since nVidia only makes motherboards with AGP slots, right?
Drayvn - Tuesday, September 21, 2004 - link
However, on this site http://www.hothardware.com/viewarticle.cfm?article... it shows the X700XT edged out a win overall.
What I think is that ATi is doing what nVidia did in the high end market: they brought out the X700Pro, which is very close to the X700XT but cheaper, and probably highly moddable.
Buy an X700Pro with a 5-10% loss of performance for $60 less?
blckgrffn - Tuesday, September 21, 2004 - link
What mystifies me (still) is the performance discrepancy between the 6800 and 6600 GT. In some cases, the 6600 GT is whooping up on it. The 6600GT preview article made some allusions to 12 pipes not being as efficient as 8 and 16, etc. But if the performance is really so close between them, the 6800 is probably going to go the way of the 9500 Pro. That's too bad; my 6800 clocked at 400/825 is pretty nice. If anyone could clear up why the 6600 GT is faster than the 6800, that would be nice. The fill rates should be nearly identical, I guess. But doesn't the 6800 retain its 6 vertex shaders, and wouldn't the extra memory bandwidth make a noticeable difference?
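Doing the napkin math on the fill-rate point (a rough sketch using the commonly quoted reference specs -- the exact clocks are my assumption, so double-check them):

```python
# Theoretical pixel fill rate and memory bandwidth from commonly quoted
# reference specs (the clocks here are an assumption -- double-check them).
cards = {
    # name: (pixel pipes, core MHz, memory bus width in bits, effective memory MHz)
    "6600 GT": (8, 500, 128, 1000),
    "6800":    (12, 325, 256, 700),
}

for name, (pipes, core_mhz, bus_bits, mem_mhz) in cards.items():
    fill_mpix = pipes * core_mhz                     # Mpixels/s
    bandwidth_gbs = (bus_bits / 8) * mem_mhz / 1000  # GB/s
    print(f"{name}: {fill_mpix} Mpix/s fill, {bandwidth_gbs:.1f} GB/s")

# 6600 GT: 4000 Mpix/s, 16.0 GB/s
# 6800:    3900 Mpix/s, 22.4 GB/s
# Raw fill really is a wash, so on paper the 6800's extra bandwidth (plus its
# extra vertex units) should still tell at high res with AA -- which is why
# the numbers in this preview look so strange.
```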
Resh - Tuesday, September 21, 2004 - link
Just wish nVidia would come out with the NF4 NOW with PCI-Express, etc. A board with two 16x slots, one 6600GT now and one later, is looking pretty awesome.
rf - Tuesday, September 21, 2004 - link
Looks like ATI dropped the ball - 12 months or more kicking nVidia's ass and now they are the ones lagging behind.
Oh well, I am not in the market for a graphics card at the moment (bought a 9800XT last year) but if I was, I'd be switching to nVidia.
I do have to say that the move away from AGP is annoying. What about the people that want to upgrade their components? Are we supposed to ditch kit that is less than 6 months old?
ZobarStyl - Tuesday, September 21, 2004 - link
I must agree that, all things considered, the 6600GT really comes out the winner... I mean, look at the x800/6800 launch: the x800Pro looked like it just massacred the 6800GT, and now no one thinks twice about which is better at the $400 price point because nV put out some massive driver improvements. Considering the 6600GT already has the performance AND feature advantage over the x700, there's just no contest when you add in what the nV driver team is going to do for its perf. Can't wait to dual up two 6600GT's (not SLI, multimonitor =) )
LocutusX - Tuesday, September 21, 2004 - link
Just to be clear, I think #3's statement was invalid simply because Nvidia is winning half the Direct3D games as well as all the OGL games.
LocutusX - Tuesday, September 21, 2004 - link
#3: "Again we see ATI=DX, nVidia=OpenGL. "Nah, don't think so. Here are the notes I took while I read the article;
6600gt
d3 (big win) - OGL
far cry (with max AA/AF) - DX9
halo - DX9
jedi academy (big win) - OGL
UT (tie) - DX8/DX9
x700xt
far cry (with NO aa/af) - DX9
source engine (small win) - DX9
UT (tie) - DX8/DX9
I'm sorry to say it, but the X700XT is a disappointment. I'm not an "nvidiot"; check my forum profile, I'm an ATI owner.
Shinei - Tuesday, September 21, 2004 - link
#11: Probably because you won't have much money left for a video card after you buy all the new crap you need for a Prescott system. ;)
Anyway, this quote made me wonder a bit.
"From this test, it looks like the X700 is the better card for source based games unless you want to run at really high quality settings."
Er, if I can get great graphics at a decent framerate (42fps is pretty good for 16x12 with AA/AF, if you ask me (beats the hell out of Halo's horribly designed engine)), why WOULDN'T I turn on all the goodies? Then again, I used to enable AA/AF with my Ti4200 too, so my opinion may be slightly biased. ;)
Woodchuck2000 - Tuesday, September 21, 2004 - link
#10 - I agree entirely! These are midrange cards, yet they're released first as PCIe parts, which are only available on high-end Intel platforms. Why does this make sense?
AlphaFox - Tuesday, September 21, 2004 - link
If you have a PCIe system, why would you waste your $$ on an entry level card???? These cards should be released on AGP if they want any to sell.
manno - Tuesday, September 21, 2004 - link
The real issue is this: nVidia has dedicated die space to shadowing functions specifically requested by John Carmack for use with the Doom 3 engine. Nvidia obliged; yes, ATI's OpenGL drivers are POS's, but even if they were up to snuff, Doom 3 would still favor nVidia. That said, it all boils down to where you think the better mods/engine licenses will go, Doom 3 or Half-Life 2, and whether the small discrepancy between the 6600 GT and X700 XT is really worth those few extra frames in HL2, compared to the significant frame rate difference in Doom 3 and the subsequent games based on that engine. Not to mention PS 3.0 support. I'll gladly spend $10 extra for a better card.
coldpower27 - Tuesday, September 21, 2004 - link
That makes more sense, since ATI has to make a separate core for the AGP version while Nvidia doesn't and can use their HSI.
chilled - Tuesday, September 21, 2004 - link
#5: HardOCP's conclusion states that the AGP version of the X700 will not be available soon, but before Christmas.
I read somewhere(?) that the AGP version of the 6600 would be available sometime in October...
Make of that what you will.
chilled - Tuesday, September 21, 2004 - link
#3: I think it would be fairer to say that under DX ATI/nV are in a situation of win some, lose some. I wouldn't say the ATI cards are superior - read the Conclusion again.
However, like Derek said it remains to be seen how the cards perform with a mid-range system.
Locutus4657 - Tuesday, September 21, 2004 - link
So how long before these mid-range solutions are available in AGP? Seems incredibly silly to me that they weren't first released in AGP form! I can't use either nVidia's or ATI's midrange solutions in my midrange system (A64 3000+). Strangely though, if I wanted to blow $400 on a video card I could always get an x800!
shabby - Tuesday, September 21, 2004 - link
The specs on paper look good, but for some reason the x700 doesn't perform.
With aa/af enabled you'd expect the x700 to beat the 6600gt in dx games thanks to ati's optimizations/cheats, but it doesn't. Go figure.
Aquila76 - Tuesday, September 21, 2004 - link
Again we see ATI=DX, nVidia=OpenGL. It's interesting that the gap in DirectX games is narrowing. ATI needs to get better OpenGL support somehow and do it quick. These cards are pretty evenly matched (diff of only 2-4 FPS avg.) - until you get to OpenGL. NVidia comes out on top by 15-20 FPS in those benches.
DerekWilson - Tuesday, September 21, 2004 - link
Please pay careful attention to the test page -- the 9700 Pro was tested on a (more suited to gaming) Athlon 64 system which makes the results not absolutely comparable.
The Athlon 64 system is our video test rig, and rerunning all our cards on a P4EE system when the A64 gives results we can use as a reference just didn't make sense.
As stated on the test page, the directly comparable cards are the GeForce 6600 GT, the Radeon X700 XT, and the Radeon X600 XT.
skunkbuster - Tuesday, September 21, 2004 - link
In some cases the X700 XT scored worse than the 9700 Pro... I think ATI needs to work on their drivers.