Battlefield 3

The major multiplayer action game of our benchmark suite is Battlefield 3, DICE’s 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled some by time and drivers, but it’s still a challenge if you want to hit the highest settings at the highest resolutions at the highest anti-aliasing levels. Furthermore, while we can crack 60fps in single-player mode, our rule of thumb here is that multiplayer framerates will dip to half of our single-player framerates, so a framerate that looks high here may not be high enough.

For our Battlefield 3 benchmark, NVIDIA cards have consistently been the top performers over the years, and as a result this is one of the hardest fights for any AMD card. So how does the 290X fare? Very well, as it turns out. Even in its weakest game relative to the GTX 780, the 290X loses by just 2%, effectively tying NVIDIA’s closest competitor. Not only is the 290X once again the first single-GPU AMD card to break a 60fps average at 2560 – thereby ensuring good framerates even in heavy firefights – but it’s fully competitive with NVIDIA in doing so in what’s traditionally AMD’s worst game. At worst, AMD can’t claim to be competitive with GTX Titan in this one.

Moving on to 4K gaming, none of these single-GPU cards are going to cut it at Ultra quality; the averages are decent but the minimums will drop to 20fps and below. This means we either drop down to Medium quality, where the 290X is now performance competitive with GTX Titan, or we double up on GPUs, which sees the 290X CF in uber mode take top honors. This game is another good example of how the 290X scales into 4K better than the GTX 780 and other NVIDIA cards, as not only does AMD’s relative positioning versus NVIDIA improve, but in heading to 4K AMD picks up a 13% lead over the GTX 780. The only weak spot for AMD is multi-GPU performance scaling: while the 290X enjoys a 94% scaling factor at 2560, that drops to 60% at 4K, where NVIDIA’s scaling factor is 76%. The 290X has enough of a single-card performance lead for the 290X CF to hold out over the GTX 780 SLI, but the difference in scaling factors makes it a close call.
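The scaling-factor arithmetic above is straightforward but worth making explicit: it measures how much extra performance the second GPU adds as a fraction of a single card's performance. A minimal sketch, using illustrative placeholder framerates rather than our measured results:

```python
def scaling_factor(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Multi-GPU scaling: the extra performance contributed by the
    second GPU, as a fraction of a single card's performance."""
    return dual_gpu_fps / single_gpu_fps - 1.0

# Placeholder numbers only, not our benchmark data. A 94% scaling
# factor means the second card adds almost a full card's worth of
# performance; 60% means much of the second card's potential is lost.
print(f"{scaling_factor(60.0, 116.4):.0%}")  # ~94% scaling at 2560
print(f"{scaling_factor(30.0, 48.0):.0%}")   # ~60% scaling at 4K
```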

Meanwhile, in an intra-AMD comparison, this is the first game in our benchmark suite where the 290X doesn’t beat the 280X by at least 30%. Falling just short at 29.5%, it’s a reminder that despite the similarities between the 290X (Hawaii) and 280X (Tahiti), the performance differences between the two will not be consistent.
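For reference, the relative lead quoted above is simply the ratio of the two cards' framerates minus one. A quick sketch with hypothetical numbers (not our measured data):

```python
def relative_lead(a_fps: float, b_fps: float) -> float:
    """Percentage lead of card A over card B."""
    return 100.0 * (a_fps / b_fps - 1.0)

# Hypothetical example: 77.7fps vs 60fps works out to a 29.5% lead.
print(f"{relative_lead(77.7, 60.0):.1f}%")
```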

Looking at our delta percentages, this is another strong showing for the 290X CF, especially compared to the 280X CF. AMD has once again halved their variance relative to the 280X CF, bringing it down to sub-10% levels, and this despite the theoretical advantage the dedicated CFBI should give the 280X. However, AMD can’t claim the lowest variance of any multi-GPU setup, as this is NVIDIA’s best game, with the GTX 780 SLI seeing a variance of only 6%. It’s a shame not all games can be like this (for either vendor), since there would be little reason not to go with a multi-GPU setup if this were the typical AFR experience rather than the best AFR experience.
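To make the delta-percentage metric concrete: the idea is to capture frame-to-frame consistency rather than raw throughput, since AFR multi-GPU setups can post high averages while still stuttering. A minimal sketch, assuming a metric along these lines (average absolute change between consecutive frame times, as a percentage of the mean frame time) and hypothetical frame-time traces:

```python
def delta_percentage(frame_times_ms: list[float]) -> float:
    """Average absolute change between consecutive frame times,
    expressed as a percentage of the mean frame time. Lower is
    smoother; a high value indicates stutter/microstutter."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_ft = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * (sum(deltas) / len(deltas)) / mean_ft

# Hypothetical traces (ms per frame), not measured data. Both average
# close to the same framerate, but the second alternates between fast
# and slow frames -- the classic AFR microstutter pattern.
smooth = [16.6, 16.8, 16.5, 16.9, 16.7]
stutter = [14.0, 25.0, 13.0, 26.0, 14.0]
print(f"smooth:  {delta_percentage(smooth):.1f}%")
print(f"stutter: {delta_percentage(stutter):.1f}%")
```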

Finally, looking at delta percentages under 4K shows that AMD’s variance has once again risen slightly compared to the variance at 2560x1440, but not significantly so. The 290X CF still holds under 10% here.


  • pattycake0147 - Friday, October 25, 2013 - link

    Nope piroroadkill is spot on with speaking his opinion. Anand continually asks for reader feedback, and he's doing just that.

    The rate at which this article is being finished is piss poor. Ryan said it would be finished in the morning the day of posting which meant in the next 12 hr or so. The main explanatory pages took about 24 hr to be completely fleshed out, and the graphs still don't have any text explaining the trends in performance. I actually value the author's commentary more than the graphs, and looking through a review which is incomplete over 36 hr after posting is much below Anandtech standards.

    I hate to bring it up because I like reading the vast majority of content on Anandtech regardless of market or company, but I firmly believe piroroadkill is correct in saying that a new Apple product would have had a complete and thorough review shortly after the NDA was lifted.
  • HisDivineOrder - Friday, October 25, 2013 - link

    He had three R9 290X's in one system. Crossing his chest, he took out his third and slid its PCIe into the test bed. Immediately, the room began to darken and a voice spoketh, "You dare install THREE R9 290X's into one system! You hath incurred the wrath of The Fixer, demon lord of the 9.5th circle of hell! Prepare for the doooooom!"

    Then the system erupted into flames, exploding outward with rapid napalm-like flames that sent him screaming out the door. Within seconds, the entire building was burning and within minutes there was nothing left but ashes and regrets.

    Ever since, he has been locked away in a mental health ward, scribbling on a notepad, "Crossfire," over and over. Some say on the darkest nights, he even dares to whisper a single phrase, "Three-way."
  • B3an - Saturday, October 26, 2013 - link

    LOL!
  • Ryan Smith - Monday, October 28, 2013 - link

    Hahaha!

    Thanks man, I needed that.
  • yacoub35 - Friday, October 25, 2013 - link

    It's a bit silly to list the 7970 as $549 when the truth is they can be had for as little as $200. And they're easily the best deal for a GPU these days.
  • yacoub35 - Friday, October 25, 2013 - link

    To clarify: A marketing piece lists "Launch prices", a proper review compares real-world prices.
  • yacoub35 - Friday, October 25, 2013 - link

    So double the ROPs on a new architecture and an extra GB of faster GDDR5 results in maybe 10-20 more frames than a 7970GE at the resolution most of us run (1920x). Somehow I don't think that's worth twice the price, let alone the full $549 for someone who already owns a 7970.
  • Jumangi - Friday, October 25, 2013 - link

    Only a clueless noob with too much money in their pocket would buy a 290x if they are running at 1920 resolution.
  • kyuu - Friday, October 25, 2013 - link

    If you're just looking to game at high details on a single 1080p monitor, then no, the 290X isn't interesting as you're spending a lot of money for power you don't need. If you're gaming at 1440p or higher and/or using Eyefinity, then it's a different story.
  • Hulk - Friday, October 25, 2013 - link

    I just wanted to thank Ryan for getting up the charts before the rest of the article. We could have either waited for the entire article or gotten the performance charts as soon as you completed them and then the text later. Thanks for thinking of us and not holding back the performance data until the article was finished. It's exactly that type of thinking that makes this site the best. I can imagine you starting to work on the text and thinking, "You know what? I have the performance data so why don't I post it instead of holding it back until the entire article is finished."

    Well done as usual.
