Single-board CrossFire

The Radeon HD 3870 X2 features a single CrossFire connector at the top of the PCB, meaning you'll eventually be able to add a second card to enable 3X or 4X CrossFire modes (4X if you add another 3870 X2, 3X if you add a single 3870).

Unfortunately driver support for the ATI CrossFireX technology isn't quite there yet, although AMD tells us to expect something in the March timeframe. Given that CeBIT is at the beginning of March we're guessing we'll see it at the show.

As we alluded to earlier, the fact that the 3870 X2 features two GPUs on a single board means that it doesn't rely on chipset support to enable its multi-GPU functionality: it'll work in any motherboard that would support a standard 3870.

Driver support is also seamless; you don't have to enable CrossFire or fiddle with any settings, the card just works. AMD's Catalyst drivers attempt to force an Alternate Frame Rendering (AFR) mode whenever possible, but be warned that if a game has problems with the 3870 X2's multi-GPU rendering modes, you may only get single-GPU performance until AMD can fix the issue. We didn't encounter anything like that in our testing, but as we saw with the GeForce 7950 GX2, there's always a chance as new games and OS revisions come out.

AMD insists that by releasing a multi-GPU card it will encourage developers to take CrossFire more seriously. It is also committed to releasing future single-card, multi-GPU solutions but we'll just have to wait and see how true that is.

Last Minute Driver Drop: Competitive Crysis Performance

Today's launch was actually supposed to happen last week, on January 23rd. At the last minute we got an email from AMD stating that the embargo on 3870 X2 reviews had been pushed back to the 28th and we'd receive more information soon enough.

The reason for the delay was that over the weekend, before the planned launch on the 23rd, AMD was able to fix a number of driver issues that significantly impacted performance with the 3870 X2. The laundry list of fixes is as follows:

• Company of Heroes DX10 – AA now working on R680. Up to 70% faster at 2560x1600 4xAA
• Crysis DX10 – Improves up to ~60% on R680 and up to ~9% on RV670 on Island GPU test up to 1920x1200.
• Lost Planet DX10 – 16xAF scores on R680 improved ~20% or more. AF scores were horribly low before and should have been very close to the no-AF scores
• Oblivion – fixed random texture flashing
• COJ – no longer randomly goes to blackscreen after the DX10 benchmark run
• World in Conflict – 77% increase at 2560x1600x32, 0xAA, 16xAF, quality=high
• Fixed random WIC crashing to desktop
• Fixed CF scaling for Colin McRae Dirt, Tiger Woods 08, and Blazing Angels2
• Fixed WIC DX9 having smearable text

With a list like that, we can understand why AMD pushed the NDA back - but most importantly, the Radeon HD 3870 X2 went from not scaling at all in Crysis to actually being competitive.

The Radeon 3800 series has always lagged behind NVIDIA when it came to performance under Crysis, and with the old driver Crysis was a black eye on an otherwise healthy track record for the 3870 X2. The new driver improved performance in Crysis by around 44 - 54% at high quality defaults depending on resolution. The driver update doesn't make Crysis any more playable at very high detail settings, but it makes the X2's launch a lot smoother than it would've been.

According to AMD, the fix in the driver that so positively impacted Crysis performance had to do with the resource management code. Apparently some overhead in the Vista memory manager had to be compensated for, and without the fix AMD was seeing quite poor scaling going to the 3870 X2.

The Test

Test Setup

CPU: Intel Core 2 Extreme QX9650 @ 3.00GHz
Motherboard: EVGA nForce 780i SLI (power measurements done on ASUS P5E3 Deluxe)
Video Cards: ATI Radeon HD 3870 X2
             ATI Radeon HD 3870
             NVIDIA GeForce 8800 GTX
             NVIDIA GeForce 8800 GTS 512
             NVIDIA GeForce 8800 GT (512MB)
Video Drivers: ATI 8-451-2-080123a
               NVIDIA 169.28
Hard Drive: Seagate 7200.9 300GB 8MB 7200RPM
RAM: 4x1GB Corsair XMS2 DDR2-800 4-4-4-12
Operating System: Windows Vista Ultimate 32-bit

 

74 Comments

  • Cyspeth - Thursday, February 21, 2008

    At last a return to multi-processor graphics acceleration. More power for the simple expedient of more GPUs. I look forward to getting two of these babies and setting them up for crossfire mode. Quad GPU FTW.

    This card seems to check out pretty nicely, and I haven't heard of many problems as yet. I've heard of only one really bad X2 horror story, which was a DOA, and ATI happily replaced the defective card. Anyone who would rather have an nvidia 8800 is out of their mind. Not only is it less powerful and less value for money, but nvidia is really stingy about replacing defects, not to mention they don't have one reliable third party manufacturer, in the face of ATI's 2 really good ones (Sapphire and HIS).
    I've never had a problem with any ATI card I've owned, but every single nvidia one has had some design flaw leading to its horrible, crippling failure, usually through overheating.

    It's good to see an honest company making some headway in the market. I would hope that ATI can keep up their lead over nvidia with this (or another) series of cards, but that doesn't seem likely with their comparative lack of market share.
  • MadBoris - Saturday, February 2, 2008

    While it's good to finally see AMD do something decent that performs well, I'm not so sure it's enough at this stage. AMD needs all the good press they can get, but $450 seems a bit much when comparing it to the 8800GT @ $250 without a large towering lead. Furthermore, 8800 GT SLI gives it a pretty good stomping @ $500. Hopefully the multi-GPU solution is more painless and the driver bugs get worked out soon.

    Also, it would be a big disappointment and misstep for Nvidia not to produce a single GPU real soon that ties or supersedes 8800 ultra SLI performance. It's been 15 months since the 8800 ultra came out, and that's normal time for a generation leap (6800, 7800, 8800). So unless Nvidia has been resting on its laurels it should produce a killer GPU, and if they then make it a multi-GPU solution it should by all means smoke this card. Here's hoping Nvidia hasn't been sleeping and doesn't just make another minor 8800 revision and call it next gen; if that happens, then by all means ATI will be a competitor.
  • MadBoris - Saturday, February 2, 2008

    Also, from a price/performance standpoint it would be nice to see how it performs against 8800GT 256MB SLI in current games. 8800 GT 256MB SLI can be had for $430 and I would guess it would perform better at low-to-mid resolutions than this X2.
  • Giacomo - Saturday, February 2, 2008

    I wouldn't spend four hundred bucks for gaming at 1280x*, one single 512MB GT or GTS is more than enough for that.

    And as you scale up to 1680*1050, I personally believe that buying 2x256MB is silly. Therefore, I believe that the 512MB one is the only reasonable 8800GT.

    Your plan sounds like low-res overdriving and doesn't seem wise to me.

    Giacomo
  • MadBoris - Saturday, February 2, 2008

    It's not my plan, I just said it would be good for a price/performance comparison for something like a future review.

    Although I no longer game at 1280, that doesn't mean others don't, and two 8800GTs with more performance for less money may be beneficial for them. Personally I think SLI is a waste anyway, but if you are considering an X2, 8800 SLI is an affordable comparison whether at 256 or 512.
  • Aver - Friday, February 1, 2008

    I have searched through anand's archive but I couldn't find the command line parameters used for testing Unreal Tournament 3 in these reviews. Can anybody help?
  • tshen83 - Thursday, January 31, 2008

    That PLX bridge chip is PCI-Express 1.1, not 2.0. Using a PCI-Express 1.1 splitter to split 16x 1.1 bandwidth into two 8x 1.1 links to feed two PCI-Express 2.0 capable Radeon HD3870 chips shows AMD's lack of engineering effort. They basically glued two HD3870 chips together. Speaking of the HD3870, it is basically the HD2900XT on a 55nm process with a cut-down 256bit memory bus. Not exactly good engineering if you ask me. (props to TSMC, definitely not to ATI)

    Scaling is ok, but not great. The review didn't test two HD3870 X2s in Crossfire, which would show four Radeon HD3870s in Crossfire mode. I have seen reports saying that 4-way scaling is even worse than 2-way; it scales more like Lg(n). Adding 2 more Radeon 3870s is only about 40% faster than a two-way HD3870 setup.
  • Giacomo - Thursday, January 31, 2008

    It's always curious to see users talking about "lack of engineering effort" and so on, as if they can teach Intel/AMD/nVIDIA how they should implement their technologies. PCIe 2.0 is now out, and suddenly people look at the previous version as if it should definitely be dismissed. Folks, maybe we should at least consider that the 3870 design simply doesn't generate enough bandwidth to justify a PCIe 2.0 bridge, shouldn't we?

    Sorry, but I kinda laugh when I read "engineering" suggestions from users, supposedly like me, regardless of how good their knowledge is. We're not talking about a do-it-yourself piece of silicon, we're talking about the latest AMD/ATi card. And the same, of course, would apply to an nVIDIA one, to Intel stuff, and so on. I think the most sensible approach would be something like this: "Oh, see, they put a PCIe 1.1 bridge in there, and look at the performance; when it's not cut down by bad driver support, the scaling is amazing... So it's true that PCIe 2.0 is still overkill for today's cards".

    That sounds rational to me.

    About Crossfire: CrossfireX is needed for two 3870 X2s to work together, and I've read in more than one place on the web that the drivers for that aren't ready yet. Apart from this, before spitting on the scaling and so on, don't forget that all these reviews were done with very early beta drivers...

    Giacomo
  • Axbattler - Tuesday, January 29, 2008

    I seem to remember at least one review pointing out some fairly dire minimum frame rates. Basically, the high and impressive maximum frame rates allowed the card to post a very good average frame rate despite a few instances of very dire minimum frame rates (if I remember correctly, it sometimes dips below that of a single-GPU card).

    Did Anand notice anything like this during the test?
  • Giacomo - Wednesday, January 30, 2008

    That's not exactly right. I mean, the average isn't pulled up by any incredible maximums; it works in a different way: the maximums aren't dramatically higher than the normal average, it's simply that those minimums are rare, caused by particular circumstances in which the application (the game) loads resources onto the card while it's rendering, which it shouldn't do. This is the gist of ATi's answer posted in the DriverHeaven review. They told ATi about these instantaneous drops and that was the answer.

    Of course, given that other cards didn't show similar problems, I believe the problem is more driver-related, and ATi should really work fast to fix issues like this one.

    However, DriverHeaven reported that this issue showed up mostly during scene loading, and therefore wasn't too annoying.

    At least, that's how I've understood it.

    My two cents,
    Giacomo
