Final Words

SLI was not without its hiccups at the outset, but in the year since NVIDIA launched its dual GPU solution, things have only gotten better. The first few months of CrossFire have shown that ATI is struggling down the same road NVIDIA set out on with SLI. The first version of CrossFire wasn't as refined as we had hoped it would be, but everyone has to start somewhere. The X1800 CrossFire Edition card shows that ATI is making significant progress in the multi GPU arena.

The system level problems and difficulties we had with the initial version of CrossFire were largely solved this time around; after a proper BIOS update, we had no trouble enabling CrossFire or getting it running. The reboot required to get the card working with the 256MB X1800 XL is a bit of a step backward, but it's nothing major. Super AA modes work, but they aren't worth using in games that see a real performance benefit from the second GPU. These modes haven't changed from the original X800 CrossFire release, but AA mode blending is now done in hardware as well.

Unfortunately, there are still some real downsides. At an absolute minimum, enabling CrossFire should never hurt performance; no game should suffer a performance hit with a second GPU enabled. This was one of the issues we saw with NVIDIA's SLI when it first came out, where selecting the wrong mode could cause performance problems (though we never saw anything as dramatically catastrophic as CrossFire's performance under Black and White 2). Hopefully ATI can fix all of this with some additions or changes to the application detection in Catalyst A.I.

There also isn't much room to customize how CrossFire handles a particular game. Sure, we could probably figure out some sort of registry hack to force CrossFire into a specific mode, but NVIDIA's drop down menu is so much more convenient. We agree that it is usually best to let the driver and application figure out what to do, but clearly there are cases where things can go wrong. Finer grained customization is something we have been asking of ATI in general, and as systems get more complex (as with multi GPU solutions), it becomes even more important to give the end user more control alongside solid default behavior.

We still don't like the external dongle as much as NVIDIA's internal bridge (especially the flexible kind). While our lab sports an excessive number of cables running everywhere, we are sure that less clutter behind the machine is better for everyone, and an internal GPU to GPU connection has board space and other advantages as well. Thankfully, this time around ATI is able to squeeze dual-link DVI bandwidth into and out of the cable. CrossFire without a master card is in ATI's future, but it is still unclear what kind of performance or quality impact that will have on the overall solution.

Performance is one of the high points of CrossFire in general. In many cases, the X1800 XT in CrossFire performs between the 7800 GTX SLI and 7800 GTX 512 SLI setups. Things may heat up even more when ATI brings out its R500 series refresh part to compete more directly with NVIDIA's top-of-the-line GeForce 7800 GTX 512.

But price is still an issue for ATI at the high end. With the 7800 GTX selling for between $450 and $500, putting together an SLI pair for somewhere between $900 and $1000 isn't a difficult task. With the X1800 CrossFire Edition at about $600 and a standard X1800 XT running between $500 and $550, a CrossFire setup will easily run $1100 or more. Unless the system is simply built to eat money, saving one or two hundred dollars on a comparable and much more mature solution is worthwhile. And for those who want the best of the best (and really can afford to burn money), that's still going to be the $1400 dual 7800 GTX 512 setup.

And above all of this looms the shadow of availability. After an early morning look around, things don't look good. Some vendors have X1800 CrossFire Edition cards listed on their sites, but all of them show out of stock, backordered, or early January ETAs. With previous NVIDIA launches, we have seen product available for purchase before we published. With the 7800 GT launch, we even had parts listed for sale in our RTPE the weekend before we could talk about them. Not finding anywhere to actually buy the hardware just hours before it is supposed to be publicly available does not give us a warm and fuzzy feeling about ATI's promises. But we are good sports, so we will keep checking throughout the day for any sign of an online vendor actually selling parts. ATI knows how important this launch is to earning back some level of trust from the press and its customers. All we can do at this point is hope that translates into results.

UPDATE: To put things bluntly, we aren't exactly certain how to react to this launch. On Tuesday, there was very limited availability at one or two online retailers. Most major sites listed availability as "out of stock" or "backorder", but if you looked hard enough you could find a card for sale. We have to give ATI credit for this, as it's much more than we've seen in the past. The day after launch, more sites have the X1800 CrossFire Edition available for immediate purchase. The situation is a little better, but not quite on par with the 7800 GT or 6800 GS launches. We all know the 7800 GTX 512 is all but impossible to find right now, but people who got in quickly were able to get product at launch. Neither the 7800 GTX 512 situation nor this X1800 CrossFire launch is what we really want: immediate availability of parts in high enough quantities to at least meet demand. Yes, it is quite a bit to ask, but NVIDIA has shown us it can be done with the 7800 GTX, 7800 GT, and 6800 GS launches. And yes, it is the holiday season, so we understand that the pressure and difficulty of delivering what we want is increased by orders of magnitude over the already difficult process of hard launching a part. So what's our final assessment?

For now, this is enough. We commend ATI for coming through and getting some cards out there at launch. But we still want to see more improvement in the future.



40 Comments


  • t3h l337 n3wb - Wednesday, December 21, 2005 - link

    The only place you can get one is eBay, where there are 2 listings, and they're like $700+...
  • DjDiff - Wednesday, December 21, 2005 - link

    I'm curious whether crossfire would increase AVIVO performance or not. If not, will there be drivers in the future that will benefit from crossfire when using AVIVO?
  • dualblade - Friday, December 23, 2005 - link

    referring to playback, or the hardware encoding feature?

    playback is already at 1080p with a single x1800 of any sort so i don't think that needs improvement. crossfire hardware assisted encoding might be a really good thing. i imagine a dual core crossfire setup could become a real encoding/rendering powerhouse
  • Scarceas - Wednesday, December 21, 2005 - link

    bleh no product... Why is it so hard to launch? Just don't announce your product until you've already shipped it. DURRR!!!
  • Thalyn - Tuesday, December 20, 2005 - link

    While I'm still here, I thought I'd point out what seemed to be a strange anomaly in the Quake 4 benches to see if someone can provide an answer.

    Under 4x FSAA, the GTX 512 cards are listed as performing better in 1920x1440 than in 1600x1200. Oddly enough, the results are almost right in the middle of the 1280x1024 and 1600x1200 scores, though if you re-plot the graph with the 1600 and 1920 results reversed it doesn't match the trends set by any other hardware in the list.

    Is this a typo, or something more sinister? And, more curiously, why didn't Derek make any mention of it at all?

    -Jak
  • Leper Messiah - Tuesday, December 20, 2005 - link

    Yeah, I mentioned this a bit higher...haven't gotten an answer yet...
  • Thalyn - Tuesday, December 20, 2005 - link

    One thing I would be curious to see is how the ATi cards fare with a small tweak done under B&W2. There's a setting which can be changed in one of the .INI files which makes the game run exponentially better on most hardware I've seen it "trying" to run on - including my own X800 Pro AGP, and two mates' 6600GT AGP and 5800 Ultra AGP.

    I believe the file is called "graphics.ini" in the data subdirectory - change the detail settings to be 3 1 3 instead of 3 0 3. It does disable two of the options in the ingame graphics menu (and I have heard it can result in "squares" under fields and such), but the performance increase is substantial, to say the least. Oddly enough, just disabling these two options on their own doesn't make anywhere near as much of a difference.

    Sadly, once it's running well you quickly find out that it wasn't worth all the effort, but I would still be curious to see the results from tests under such conditions. NVidia apparently fixed this bug with one of their post-release drivers (hence the disparity of scores), and there's also a 1.2 patch being prepared as we speak which will hopefully level things off somewhat, but in the meantime this is the best we've got.

    -Jak
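
For anyone who wants to try the tweak Jak describes above, here is a minimal Python sketch of the edit. It assumes the file location and the "3 0 3" detail string are exactly as reported in the comment (we have not verified them against any particular game version), and the install path shown is only a hypothetical example - back up graphics.ini before changing anything.

    from pathlib import Path
    import shutil

    # Hypothetical install path -- adjust to wherever Black & White 2 lives on your system.
    ini_path = Path(r"C:\Program Files\Lionhead Studios\Black & White 2\data\graphics.ini")

    def apply_tweak(path: Path) -> None:
        # Keep a backup of the original file before touching it.
        shutil.copy2(path, path.with_suffix(".bak"))
        text = path.read_text()
        if "3 0 3" not in text:
            # File layout differs from what the comment describes; don't guess.
            raise ValueError("detail setting '3 0 3' not found in graphics.ini")
        # Flip the middle detail flag from 0 to 1, as described in the comment above.
        path.write_text(text.replace("3 0 3", "3 1 3", 1))

    if __name__ == "__main__":
        apply_tweak(ini_path)
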
  • Beenthere - Tuesday, December 20, 2005 - link

    ...card-of-the week mentality. So I finally decided to do some research to see what the B.F.D. was with having one or more $700+ video cards in a PC. I went out and bought the Lanparty UT SLI Mobo, (2) FX57s so I could find the fastest O/C'ing one, (2) Asus 7800 GTX 512s, (2) 520W OCZ Power Stream PSUs, 2 x 1024MB OCZ EB Platinum 4800 modules, a Corsair ice water-cooling system for the FX57 and Nvidia chipset (until I get to vapor cooling), an Antec P160 Performance case and an HP L2335 23" display.

    Everything went together fine and I spent several days overclocking the two FX57s until I was able to run almost stable at 3.9 Gig. @ 1.625V w/34 degree cold water. And to my surprise my 3Dmark 2005 showed an incredible 18,240 score !!! WOW, I was just blown away. I was starting to understand what the enthusiasm was all about for the latest-greatest-trick of the week PC hardware. After several weeks of tweaking I now have my system stable most of the time and it simply flies !!! Not only that but the blue LEDs look so cool at night, and my friends are impressed as H*LL that for less than $6,000 I have a PC that will cook my breakfast, bring in the newspaper, make the utility company rich, heat my house, make Nvidia rich, clean my car, wash my clothes and even do word processing. I can even log on to the Net .00000000000000001 seconds faster than my old dumbazz Athlon 939 3000 that I spent $1,000 on total and which runs rock stable at 2.4 Gig. And at a resolution of 1920 x 1200 I'm able to get a frame rate in any video game of at least 60. This allows me to sit 6'-8' away from my monitor to minimize eye strain when I play video games for 18 hours or more at a time.

    Without a doubt I am one broke but very happy camper. NOW - now I understand the point of spending $700 or more on a Vid card and $1000 on a CPU and hundreds on memory, and PSUs, and trick PC cases, etc. And my friends think I am the coolest guy they know cause I got this BLING machine. Whatta life !!! If only I had known years ago...
  • AdamK47 3DS - Wednesday, December 21, 2005 - link

    You forgot the sarcasm tags <sarcasm> </sarcasm>
  • dali71 - Tuesday, December 20, 2005 - link

    quote: "And for those who want the best of the best (and really can afford to burn money), that's still going to be the $1400 dual 7800 GTX 512 setup."

    Really? And exactly WHERE can I find this mythical $1400 setup?
