Battlefield 3

The major multiplayer action game of our benchmark suite is Battlefield 3, DICE’s 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled somewhat by time and drivers, but it’s still a challenge if you want to hit the highest settings at the highest resolutions with the highest anti-aliasing levels. Furthermore, while we can crack 60fps in single player mode, our rule of thumb here is that multiplayer framerates will dip to half our single player framerates, so framerates that look high here may not be high enough in practice.

For our Battlefield 3 benchmark NVIDIA cards have consistently been the top performers over the years, and as a result this is one of the hardest fights for any AMD card. So how does the 290X fare? Very well, as it turns out. Even in its weakest game relative to the GTX 780, the 290X loses to NVIDIA’s closest competing card by just 2%, effectively tying it. Not only is the 290X once again the first single-GPU AMD card that can break a 60fps average at 2560 – thereby ensuring good framerates even in heavy firefights – but it’s fully competitive with NVIDIA in doing so in what’s traditionally AMD’s worst game. The only thing AMD can’t claim here is competitiveness with GTX Titan.

Moving on to 4K gaming, none of these single-GPU cards are going to cut it at Ultra quality; the averages are decent but the minimums drop to 20fps and below. This means we either drop down to Medium quality, where the 290X is performance competitive with GTX Titan, or we double up on GPUs, which sees the 290X CF in uber mode take top honors. This game is another good example of how the 290X scales into 4K better than the GTX 780 and other NVIDIA cards do: not only does AMD’s relative positioning versus NVIDIA improve, but in heading to 4K AMD picks up a 13% lead over the GTX 780. The only weak spot for AMD is multi-GPU performance scaling; while the 290X enjoys a 94% scaling factor at 2560, that drops to 60% at 4K, whereas NVIDIA’s scaling factor is 76%. The 290X has enough of a performance lead for the 290X CF to hold out over the GTX 780 SLI, but the difference in scaling factors makes it a close call.
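For reference, the scaling factors quoted here can be read as simple arithmetic on average framerates: the dual-card average relative to the single-card average, minus the one card you started with. The sketch below illustrates the idea; the framerate values are illustrative placeholders, not our exact benchmark numbers.

```python
# Minimal sketch of how a multi-GPU "scaling factor" is derived from average
# framerates. The example numbers are illustrative placeholders, not the
# review's exact results.

def scaling_factor(single_gpu_fps: float, dual_gpu_fps: float) -> float:
    """Extra performance from the second GPU, as a fraction of one GPU.

    Perfect AFR scaling would double the framerate (1.0, i.e. 100%); anything
    less reflects CPU limits, AFR overhead, or memory/bandwidth constraints.
    """
    return dual_gpu_fps / single_gpu_fps - 1.0

# Hypothetical example: ~62fps alone vs. ~120fps in CrossFire at 2560x1440
# works out to ~94% scaling, while 30fps -> 48fps at 4K is only 60%.
print(f"2560: {scaling_factor(62, 120):.0%}")  # ~94%
print(f"4K:   {scaling_factor(30, 48):.0%}")   # 60%
```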

Meanwhile in an inter-AMD comparison, this is the first game in our benchmark suite where the 290X doesn’t beat the 280X by at least 30%. Falling just short at 29.5%, it’s a reminder that despite the similarities between 290X (Hawaii) and 280X (Tahiti), the performance differences between the two will not be consistent.

Looking at our delta percentages, this is another strong showing for the 290X CF, especially compared to the 280X CF. AMD has once again halved their variance relative to the 280X CF, bringing it down to sub-10% levels, and this despite the theoretical advantage the dedicated CFBI should give the 280X. However AMD can’t claim to have the lowest variance of any multi-GPU setup, as this is NVIDIA’s best game, with the GTX 780 SLI seeing a variance of only 6%. It’s a shame not all games can be like this (for either vendor), since there would be little reason not to go with a multi-GPU setup if this were the typical AFR experience rather than the best-case AFR experience.

Finally, looking at delta percentages under 4K shows that AMD’s variance has once again risen slightly compared to the variance at 2560x1440, but not significantly so. The 290X CF still holds under 10% here.
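For readers who want a feel for what these delta percentages represent, the sketch below computes a simplified frame time variance metric: the average frame-to-frame time difference expressed as a percentage of the average frame time. This is an approximation for illustration, using a made-up frame time capture, and is not necessarily the exact formula behind our published numbers.

```python
# Simplified frame time variance ("delta percentage") sketch: average the
# absolute frame-to-frame time differences and express the result as a
# percentage of the average frame time. Illustrative only; not necessarily
# the exact published methodology.

def delta_percentage(frame_times_ms: list[float]) -> float:
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    avg_delta = sum(deltas) / len(deltas)
    avg_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * avg_delta / avg_frame_time

# Hypothetical capture: mostly ~16.7ms (60fps) frames with a few slower ones.
sample = [16.7, 16.9, 16.5, 18.2, 16.8, 17.0, 16.6, 19.1, 16.7, 16.8]
print(f"delta percentage: {delta_percentage(sample):.1f}%")  # ~6% for this run
```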

Comments

  • TheJian - Friday, October 25, 2013 - link

    Incorrect. Part of the point of Gsync is that when you can do 200fps in a particular part of the game, they can crank up the detail and USE the power you have completely, rather than building the whole game around, say, 60fps. Then when all kinds of crap is happening on screen (50 guys shooting each other, etc.) they can drop the graphics detail down some to keep things smooth. Gsync isn't JUST about frame rate. You apparently didn't read the AnandTech live blog, eh? You get your cake and can eat it too (stutter free, no tearing, smooth, and extra power used when you have it available).
  • MADDER1 - Friday, October 25, 2013 - link

    If Gsync drops the detail to maintain fps like you said, then you're really not getting the detail you thought you set. How is that having your cake and eating it too?
  • Cellar Door - Friday, October 25, 2013 - link

    How so? If Mantle gets GTX 760 performance in BF4 from a 260X... will you switch then?
  • Animalosity - Sunday, October 27, 2013 - link

    No. You are sadly mistaken, sir.
  • Antiflash - Thursday, October 24, 2013 - link

    I've usually preferred Nvidia cards, but they had it coming when they decided to price GK110 in the stratosphere just "because they can" while they had no competition. That's a poor way to treat your customers, and it takes advantage of fanboys. Full implementations of Tesla and Fermi were always priced around $500. Pricing Kepler GK110 at $650+ was stupid. It's silicon after all; you should get more performance for the same price each year, not more performance at a premium price as Nvidia tried to do this generation. AMD is not doing anything extraordinary here; they are just not following Nvidia's price gouging practices, and $550 puts their flagship GPU at its historical market price. We would not be having this discussion if Nvidia had done the same with GK110.
  • inighthawki - Thursday, October 24, 2013 - link

    " It's silicon after all, you should get more performance for the same price each year"

    So R&D costs come from where, exactly? Not sure why people always forget that there is actual R&D that goes into these types of products; it's not just some $5 chunk of plastic and silicon plus some labor and manufacturing costs. It's like when they break down phones and tablets and calculate costs: they never account for this. As if their engineers are basically just selecting components on Newegg and plugging them together.
  • jecastejon - Thursday, October 24, 2013 - link

    R&D? Is R&D tied only to a specific Nvidia card? AMD, like others, also invests a lot in R&D with every product generation, even when the products aren't successful. Nvidia will have to do a reality check with their pricing and be loyal to their fans and the market. Today's advantages don't last too long.
  • Antiflash - Thursday, October 24, 2013 - link

    NVIDIA's logic. Kepler refresh: 30% more performance => 100% increase in price.
    AMD's logic. GCN refresh: 30% more performance => 0% increase in price.
    I can't see how this is justified by R&D rather than just a greedy company squeezing its most loyal customer base.
  • Antiflash - Thursday, October 24, 2013 - link

    Just for clarification: the price comparison is between cards at their introduction, comparing NVIDIA's 680 vs. Titan and AMD's 7970 vs. 290X.
  • TheJian - Friday, October 25, 2013 - link

    AMD way = ZERO PROFITS, a company going broke, high debt, $6B in losses over 10 years.
    NV way = $500-800M in profits per year so you can keep your drivers working.

    Your love of AMD's pricing is dumb. They are broke. They have sold nearly everything they have or had: fabs, land; all that is left is the company itself and its IP.

    AMD should have priced this card at $650, period. Also note, NV hasn't made as much money as it did in 2007 for six years. They are not gouging you, or they would be making MORE than they did in 2007, right? Intel, Apple, and MS are gouging you, as they all make more now than then (2007 was pretty much a high for a lot of companies, and it's been down since). People like you make me want to vomit, as you are just KILLING AMD, which in turn will eventually cost me dearly when buying NV cards, as they will be the only ones with real power in the next few years. AMD already gave up the CPU race. How much longer do you think they can fund the GPU race with no profits? $200M is owed to GF on Dec 31, so the meager profit they made last quarter and any they might make next quarter is GONE. They won't make a $200M profit next quarter...LOL. Thanks to people like you asking for LOW pricing and free games.

    You don't even understand that you are ANTI-AMD...LOL. Your crap logic is killing them (and making NV earn $300M less profit than in 2007). The war is hurting them both. I'd rather have AMD making $500M and NV making $1B than what we get now: AMD at ZERO and NV at $550M.
