57 Comments
Panndor - Monday, June 13, 2005 - link
Let's hope that this endeavor by ATI doesn't end up in the same situation as the RAGE Fury MAXX they came up with. The last time they tried this they screwed it up and then cut support for the card like it never existed.
Looks promising, but I could see problems if they allow different hardware to run in a combined mode as well.
Competition is good so maybe this will bring down the price of the boards and the cards now.
vision33r - Wednesday, June 1, 2005 - link
Those two X850 XT PE cards add up to $1000+ alone, while the price of the 6800 Ultra is going down.
I think the biggest problem is not whether this works or not, but whether mainboard performance is sacrificed due to the ATI north bridge. I don't game 90% of the time on my system; I can't sacrifice system performance for gaming performance.
xsilver - Wednesday, June 1, 2005 - link
what's funny is that a few months after the xfire is released, nvidia will probably announce SLI v2.0 and then everyone will talk about how that's so cool
Wesley Fink - Wednesday, June 1, 2005 - link
#49 & #50 - The ULi 1573 we've seen paired with CrossFire DOES support NCQ. This was confirmed this afternoon with engineers here at Computex. The upcoming ULi 1575 southbridge supports both SATA II and NCQ.
We also saw demos of Splinter Cell on CrossFire with 2.0 shaders. The demos were at 1280x1024 with all eye candy enabled. Frame rates in the various demos were 118 to 120. Since we did not have reference benches for Splinter Cell, it didn't make much sense to publish those results in the launch article. What we have seen is very promising, but we need more "hands-on" benchmarking before we can say much more.
Wesley Fink
mkruer - Wednesday, June 1, 2005 - link
#51 That is the real question to be answered. My guess is that ATI will work on an SLI board and vice versa, unless there is something specifically hard-coded that prevents the second slot from being used by anything other than the chipset manufacturer's video card, which is highly unlikely. From the BIOS and driver standpoint, the motherboard either has one x16 PCIe slot or two x8 PCIe slots.
elecrzy - Tuesday, May 31, 2005 - link
It's possible to have Crossfire work on the NF4/945/955. It's just that ATI won't support them through the drivers. Sigh...
kmmatney - Tuesday, May 31, 2005 - link
Does the ULi southbridge have NCQ support?
weblizard - Tuesday, May 31, 2005 - link
No SATA II or NCQ support. That's all I need to know to NOT want a CrossFire system.
bob661 - Tuesday, May 31, 2005 - link
#47
I had the Abit board with the AMD chipset on it. Worked flawlessly. I gave it to a friend when I upgraded that box and it was running until last year when he upgraded his box.
sprockkets - Tuesday, May 31, 2005 - link
#26 That was the 760MPX, the dual-processor chipset. I don't recall any Irongate issues (that was the 750; the 760 was the DDR version, right?)
Pollock - Tuesday, May 31, 2005 - link
Looks interesting to me, at least so far. I'll agree that we still have to see how things turn out.
Eug - Tuesday, May 31, 2005 - link
Crossfire? Meh. About 0.1% of the population will buy dual-GPU setups. Crossfire is essentially just a marketecture exercise.
The really interesting part is the H.264 acceleration, which will have much, much more impact for the general computing world than Crossfire.
nitromullet - Tuesday, May 31, 2005 - link
Interesting... They gave AT stock photos of an Intel-based motherboard, but the benchmarking was done on an AMD rig. Anyone know if the chipset(s) support Athlon X2 and/or Pentium D?
To the person who mentioned that Doom 3 is not a good benchmark for ATi: my guess is that Doom 3 is probably a good benchmark to use for this purpose. ATi is most assuredly GPU-bound in Doom 3, so any increase in GPU power will yield a positive result. Whereas in HL2, where ATi has really strong single-card performance, I would imagine that the Crossfire rig is CPU-limited, so there is not as drastic an increase.
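A rough way to picture that GPU-bound vs. CPU-limited argument is a toy frame-time model; the sketch below uses made-up timings purely for illustration, not measured data from either game.

```python
# Toy model: each frame takes as long as the slower of the CPU and GPU work.
# With two GPUs splitting the rendering (ideal scaling), only the GPU share shrinks.
# All timings are hypothetical and only illustrate the bottleneck argument.
def fps(cpu_ms, gpu_ms, num_gpus=1):
    frame_ms = max(cpu_ms, gpu_ms / num_gpus)
    return 1000.0 / frame_ms

# GPU-bound case (Doom 3-like): GPU dominates, so a second GPU nearly doubles the frame rate.
print(fps(8.0, 20.0, 1), fps(8.0, 20.0, 2))    # 50.0 -> 100.0 fps

# CPU-limited case (HL2-like): the CPU is the ceiling, so a second GPU helps far less.
print(fps(12.0, 14.0, 1), fps(12.0, 14.0, 2))  # ~71.4 -> ~83.3 fps
```

Real multi-GPU scaling also loses something to compositing overhead and imperfect load balancing, so even the GPU-bound case rarely reaches a full 2x.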
Either way, Crossfire looks to be pretty interesting. Can't wait to see some in depth benches and some screenies of the super AA modes.
A request to AT: how about some benches in standard and widescreen resolutions. I know that SLI had compatibility issues with widescreen in the past, and it would be nice to know if those are still around and/or if Crossfire also suffers from this.
Nice article, especially with the limitation of not being able to run a full suite of benchmarks.
jiulemoigt - Tuesday, May 31, 2005 - link
So what about the fact that the new mode does not work in OpenGL? I happen to love playing with the DirectX API, but knowing that half the engines out there will not be able to use the new filtering? I like the idea of the new chip they are putting on the board, but I'm disgusted that most of the ATI stuff is marketing, not hardware. I have to develop for the hardware in people's machines and for the biases people have toward tech, and I'm getting sick of finding a new way of doing something and then not being able to use it because it only works on NVIDIA, while ATI bashes it until they sort of get it working and then claims it's the best thing since sliced bread. Let's try to force the NVIDIA and ATI marketing people to focus on what is there, not on what their side has.
DerekWilson - Tuesday, May 31, 2005 - link
I suppose we should add a disclaimer to the statement about Super AA working with everything ...
From what we *hear* from ATI, all games will work with CrossFire. This means that all games will work with at least one of the performance or quality modes. Even if a game doesn't work under AFR, split, or supertiling, it should work with Super AA ...
But we will have to test compatibility for ourselves.
Derek Wilson
porkster - Tuesday, May 31, 2005 - link
I can't see why they can't just use one PCIe card with an extra socket. When you need to upgrade for more power, you just buy the chip and put the extra GPU in the socket to make a dual-graphics card.
SLI is a waste of time in that it's a direction in motherboard layout that isn't going to last; it's a dead-end road for the future. So rather than wasting all that time developing cards that work in tandem, make the card work with more GPU chips on the same daughter board.
AdamK47 3DS - Tuesday, May 31, 2005 - link
"all games will be accelerated under any Super AA mode"I hope Anandtech isn't pulling my leg here. I'd love to see Halo PC or Splinter Cell using AA. Currently no form of multisampling allows AA in these games. There are probably more games out there that have the same multisampling limitation, but these are the two I know of.
matthieuusa - Tuesday, May 31, 2005 - link
And should I add that investors seem to feel like ATI is taking its place back. NVIDIA's stock already dropped 59 cents with the announcement of ATI CrossFire and the R520 playback/display features...
#37: Totally agree, it is going to be a nice fight! If the R520 is overall better than the G70, NVIDIA will have to worry and counter it as fast as they can (which is going to be great for us).
matthieuusa - Tuesday, May 31, 2005 - link
I agree with #35. It is a little early to know which solution will be considered the best, even if it seems that ATI is bringing some very interesting features along with their Crossfire. Not everybody cares about playing Doom 3 at 110 fps. I would rather play at 80 fps and have all the eye candy, and even more, since they are going to offer heavier FSAA.
Since SLI and Crossfire will probably come to be close, it is going to come down to which of the R520 and G70 is the fastest with the most interesting features. But if ATI did as well with the R520 as they seem to have done with Crossfire, they could take back the crown.
NVIDIA seems in a hurry to put the G70 on the shelves, which seems kind of suspect, since there are no real reasons to do so. They actually do have the most popular cards and the fastest configuration with SLI.
Did they hear about the ATI R520 specs and fear being at a disadvantage? Do they need two G70s to beat it?
Wait and See
#36 Two PCI-E x8 slots -> that is exactly what SLI is right now with NVIDIA.
Compositing Engine chip -> Do you remember the discussion about the PCI-E bridge implemented on GeForce 6 cards? Experience has shown no performance drop. Instead, in ATI's "SLI" solution, it seems even better since it is not part of the die -> less heat... It is not new to them, since they are using it in professional products.
kyaku00x - Tuesday, May 31, 2005 - link
I think the next chipset king will be the one producing the best next-gen graphics cards. If the NV70 performs better than the R520, then I wouldn't think people are going to care about going ATI in the mobo department, but if R520 > NV70, then NVIDIA may start losing the chipset market.
This will be real interesting to watch, the first chipset war determined by graphics cards :P
yacoub - Tuesday, May 31, 2005 - link
Is it just me, or do several things about this scream "bottleneck" and "latency"? The two PCI-E x8 slots instead of x16 slots. The extra Compositing Engine chip. The ability to pair different cards such that it will drop clock speeds and/or pipelines to sync them up. The lack of a direct chip-to-chip interconnect.
I'm curious to know just how much performance gain is realized if you pair, say, an X800 XL and an X850-something, over just the X850-something. And also how much bottleneck and latency there is in this implementation over the NVIDIA offering of SLI.
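For a rough sense of scale on the x8 bandwidth worry: a first-generation PCIe x8 link gives roughly 2 GB/s per direction, while a finished 1600x1200 frame at 32-bit color is only about 7.7 MB. The back-of-envelope sketch below assumes hypothetical resolutions and frame rates and ignores texture/geometry traffic, so it is only a ballpark, not a claim about CrossFire's actual behavior.

```python
# Back-of-envelope: bandwidth needed just to move finished frames between cards,
# versus what a PCIe 1.x x8 link offers. Purely illustrative; real traffic also
# includes textures, geometry, and synchronization overhead.
PCIE1_X8_GBPS = 8 * 0.25  # ~250 MB/s per lane per direction -> ~2 GB/s for x8

def frame_traffic_gbps(width, height, bytes_per_pixel, fps):
    return width * height * bytes_per_pixel * fps / 1e9

for (width, height), fps in [((1600, 1200), 100), ((2048, 1536), 60)]:
    need = frame_traffic_gbps(width, height, 4, fps)
    print(f"{width}x{height} @ {fps} fps needs ~{need:.2f} GB/s "
          f"of ~{PCIE1_X8_GBPS:.1f} GB/s available")
```

Latency and driver overhead are a separate question that only real benchmarks can answer.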
The only upside I can see is cost/upgrade since a user can own an X800-based card (assuming they have a Crossfire compatible motherboard) and go out and buy an X850-based card later and use BOTH cards together (assuming they are both Crossfire-capable cards). Then again with those assumptions I'm not sure it's truly any more cost-effective. =\
LoneWolf15 - Tuesday, May 31, 2005 - link
As usual, the fanboys of both sides come to the show to spout their comments.
For everyone saying "Man, you have to buy a Crossfire that matches your card, and throw it away when you upgrade"... umm, don't you have to buy two of the exact same matching card for running nVidia SLI, and if you wish to upgrade, you have to sell both? Doesn't sound that different to me. One thing I think a lot of current ATI owners will be happy about is that they won't have to get rid of a card they already own and buy two of a new one; they can just buy a single Crossfire card (and of course a mainboard).
On the other hand, to those thinking ATI has now "0wned" nVidia, it is WAY too early to tell. The solution looks promising, but if you have to sacrifice mainboard performance (i.e., SATA hard disks, memory bandwidth, etc.) it may be a hard sell. Benchmarks in Doom 3 are also not the end-all be-all. We'll have to wait for a more comprehensive performance review, including DirectX benches, and performance/quality with older games using this new AA method, as well as game compatibility reports. We'll also need to know what TRUE pricing is (we've seen claimed pricing vary quite a bit from what it has turned out to be at product release in the past two years).
Do I hope it will beat nVidia's solution? You bet. I like ATI, but even more I like competition that drives the industry. Do we proclaim ATI the winner/loser on this one? Heck no, it isn't even a purchasable product yet.
ElMoIsEviL - Tuesday, May 31, 2005 - link
#23 - They ran Doom 3.
It's not an ATi game at all, as we all know. And it still does REALLY well. And it's not in release stages yet.
;)
ElMoIsEviL - Tuesday, May 31, 2005 - link
hehehehe.. it's better than SLI... hehehehe
Figures, all the NV fans on here prolly aren't too happy today.
I can't wait to test out the new AA modes.. :)
vertigo1 - Tuesday, May 31, 2005 - link
This is insane, who on earth will buy this?!
JarredWalton - Tuesday, May 31, 2005 - link
#30 - Yes. The PCIe bus likely provides slower performance, as it is used for lots of other things (like communication between the CPU, RAM, and GPUs). I believe NVIDIA SLI works without the dongle but at slower speeds - at least, I heard that somewhere, but I haven't ever had an SLI board so I can't say for sure. Anyway, since DVI is a digital signal, using DVI in/out seems about as good as the SLI bridge - at least in theory. Now we just need to wait and see how theories pan out. :)
Jalf - Tuesday, May 31, 2005 - link
I was under the impression they were going to use the PCI-E bus for transferring data between the cards. Is the external dongle going to handle that instead?
Murst - Tuesday, May 31, 2005 - link
I really don't see how the xfire is better than SLI based on hardware compatibility. Sure, you don't need the exact same cards, but you will likely buy only one X850-type card per X850 xfire. It would be extremely unlikely that someone upgrades from X850 XT Pro -> X850 XT PE.
Basically, in the end, you will buy a specific xfire tailored to your gfx card, and throw it away with the next generation of cards.
gxsaurav - Tuesday, May 31, 2005 - link
Great, this just means more heat. Man, even a single 6800 nU plays every game fine, while running cool.
ViRGE - Tuesday, May 31, 2005 - link
#21, yes it is. This is what hurts ATI the most: Nvidia already had 4 release cycles of experience with motherboards (2 of those being highly popular, highly recommended boards) before attempting SLI. ATI has a previous launch for a board almost universally ignored. I would not use an ATI board at this time, so I would also not consider CrossFire. ATI needs to get CrossFire working on Nvidia's boards to have a fighting chance this round.
Calin - Tuesday, May 31, 2005 - link
"ATI should be focused on the overall platform, not necessarily building up support for their South Bridge. Although we do think it is a bit embarrassing to have to turn to another chipset vendor to provide working South Bridges for your motherboard partners. It would be one thing if this were ATI's first chipset, but it most definitely is not. "AMD first chipset (AMD 760 for Slot A Athlon, or Irongate, I think) had also non working USB support (or very buggy). Most mainboard manufacturers offered USB thru an add in PCI card, in order not to use the one included in the southbridge
Googer - Tuesday, May 31, 2005 - link
In theory, since it connects to the other card through DVI, I could use my old 9700 Pro in Crossfire mode with the newer card; even better, what if I could use an NVIDIA card and an ATI card in Crossfire! All I need is that motherboard (I forgot the make and model) that supports PCI-E and true AGP (not PCI-based AGP)!
FakeName - Tuesday, May 31, 2005 - link
This is bogus, remember accelerator cards, mid-90's... poor solution then, same poor solution again... don't waste your hard earned money on this cerebral shortfall, the next gen will soon be upon us...
Shinei - Tuesday, May 31, 2005 - link
Performance looks promising, sure, but I wonder what will be shown when AT gets hold of a sample for longer than a few benchmark runs--an 85% improvement at 1600x1200 seems a bit strange, particularly for hardware known for wheezing in the benchmarked game...
CrystalBay - Tuesday, May 31, 2005 - link
Very sophisticated approach ATI... Hopefully the Composter doesn't turn to sh!t later on...
sprockkets - Tuesday, May 31, 2005 - link
Hmmm, isn't the current SB on existing Radeon Xpress 200 boards buggy too?
overclockingoodness - Tuesday, May 31, 2005 - link
#15: Regardless, what difference does it make? The performance would still be closer to what's presented in the article.
overclockingoodness - Tuesday, May 31, 2005 - link
#15: You need to read more carefully. Notice how they said that it was the vendor's PC and not their own. So, obviously they had no choice. They had to go by whatever the vendor was offering at the time.
flatblastard - Monday, May 30, 2005 - link
I was a little bummed after reading that CrossFire + Xpress 200 would also have 2x PCI-E slots instead of just one like the current MSI RS480M2-IL. I was even more disappointed to hear about the current state of the SB450. I thought the SB450 was supposed to fix the bugginess of the SB400 it is replacing? Oh well, no big surprise I guess, considering their history in that department. So here's hoping for another save from ULi.
bob661 - Monday, May 30, 2005 - link
#16
They weren't listed, so I would imagine that they won't be compatible.
Bloodshedder - Monday, May 30, 2005 - link
Kind of makes me wonder about compatibility with All-in-Wonder cards.
RadeonGuy - Monday, May 30, 2005 - link
Why didn't you run it on an FX-55 and 1 GB of memory?
Quintin - Monday, May 30, 2005 - link
interesting....
ksherman - Monday, May 30, 2005 - link
#6, I'd love to, but I don't have the money right now and the cards are not available...
Brian23 - Monday, May 30, 2005 - link
#9, #10, and #11: That will never happen. The traces between the GPU and the memory need to be UBER short. The socket would increase trace lengths too much. Plus, there are so many kinds of graphics memory with different bus widths.
Waylay00 - Monday, May 30, 2005 - link
What would be better is a motherboard that has a built-in GPU socket, so you could buy GPUs just like CPUs. Then there would be no need for video cards, but rather just video RAM and the GPU core.
Waylay00 - Monday, May 30, 2005 - link
UNCjigga - Monday, May 30, 2005 - link
What I really want is a graphics card with extra sockets for a 2nd GPU and more RAM. So I can start with one board with a single GPU and 256MB RAM, then I can upgrade either the existing GPU with a faster one, and/or upgrade the RAM from 256MB to 512MB, and/or slap a second GPU into the extra socket and effectively double performance. That would rock.
arfan - Monday, May 30, 2005 - link
Good Job ATI
bob661 - Monday, May 30, 2005 - link
I wonder what the REAL price will be on the Crossfire cards.
bob661 - Monday, May 30, 2005 - link
#5
Try it and let us know.
ksherman - Monday, May 30, 2005 - link
so... does this mean I can use the CrossFire tech on my DFI nF4 Ultra-D? Since the limiting factor in making it SLI-worthy was the SLI bridge, and ATI's doesn't need one... and is it fully compatible with nForce4?
Zebo - Monday, May 30, 2005 - link
Like a dongle is a big deal with the 95 cables already hangin out the back of my case...
Slappy00 - Monday, May 30, 2005 - link
Whatever happened to the Wicked Stepsister?
Looks good, but with next-gen consoles around the corner I think I'm gonna put my money towards an AMD 64 X2 and a PS3/Xbox 360 rather than shelling out a grand for some extra frames.
AnnihilatorX - Monday, May 30, 2005 - link
"First" Second Postmattsaccount - Monday, May 30, 2005 - link
Bout time ATI gave us something.
error4656 - Saturday, December 5, 2020 - link
this shit is old