Battlefield 3

The major multiplayer action game of our benchmark suite is Battlefield 3, DICE’s 2011 multiplayer military shooter. Its ability to pose a significant challenge to GPUs has been dulled some by time and drivers, but it’s still a challenge if you want to hit the highest settings at the highest resolutions with the highest anti-aliasing levels. Furthermore, while we can crack 60fps in single player mode, our rule of thumb is that multiplayer framerates will dip to half of our single player framerates, so a framerate that looks high here may not be high enough in practice.

For our Battlefield 3 benchmark NVIDIA cards have consistently been the top performers over the years, and as a result this is one of the hardest fights for any AMD card. So how does the 290X fare? Very well, as it turns out. The 290X trails the GTX 780 by just 2%, effectively tying NVIDIA’s closest competitor. Not only is the 290X once again the first single-GPU AMD card that can break a 60fps average at 2560 – thereby ensuring good framerates even in heavy firefights – but it achieves this in what’s traditionally AMD’s worst game. The worst that can be said for AMD is that they can’t also claim a win over GTX Titan in this one.

Moving on to 4K gaming, none of these single-GPU cards are going to cut it at Ultra quality; the averages are decent, but the minimums drop to 20fps and below. This means we either drop down to Medium quality, where the 290X is now performance competitive with GTX Titan, or we double up on GPUs, which sees the 290X CF in uber mode take top honors. This game is another good example of how the 290X scales into 4K better than the GTX 780 and other NVIDIA cards: not only does AMD’s relative positioning versus NVIDIA improve, but in heading to 4K AMD picks up a 13% lead over the GTX 780. The only weak spot for AMD is multi-GPU performance scaling; while the 290X enjoys a 94% scaling factor at 2560, that drops to 60% at 4K, where NVIDIA’s scaling factor is 76%. The 290X has enough of a single-GPU performance lead for the 290X CF to hold out over the GTX 780 SLI, but the difference in scaling factors makes it a close call.
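For readers unfamiliar with the metric, the scaling factors above are just the ratio of CrossFire/SLI framerates to single-card framerates, expressed as the extra performance the second GPU adds. A minimal sketch of the arithmetic (the framerate figures below are illustrative placeholders, not the review’s measured numbers):

```python
def mgpu_scaling(single_fps: float, dual_fps: float) -> float:
    """Multi-GPU scaling factor: the extra performance the second
    GPU adds, as a percentage of a single card's performance."""
    return (dual_fps / single_fps - 1.0) * 100.0

# Illustrative numbers only, chosen to match the percentages in the text:
print(mgpu_scaling(50.0, 97.0))  # ~94% scaling, as at 2560
print(mgpu_scaling(30.0, 48.0))  # ~60% scaling, as at 4K
```

So a 94% scaling factor means the second card delivers almost double the throughput, while 60% means over a third of the second GPU's potential is left on the table.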

Meanwhile in an inter-AMD comparison, this is the first game in our benchmark suite where the 290X doesn’t beat the 280X by at least 30%. Falling just short at 29.5%, it’s a reminder that despite the similarities between 290X (Hawaii) and 280X (Tahiti), the performance differences between the two will not be consistent.

Looking at our delta percentages, this is another strong showing for the 290X CF, especially as compared to the 280X CF. AMD has once again halved their variance relative to the 280X CF, bringing it down to sub-10% levels, and this despite the theoretical advantage that the dedicated CFBI should give the 280X. However AMD can’t claim the lowest variance of any multi-GPU setup, as this is NVIDIA’s best game, with the GTX 780 SLI seeing a variance of only 6%. It’s a shame not all games can be like this (for either vendor); if this were the typical AFR experience rather than the best case, there would be little reason not to go with a multi-GPU setup.

Finally, looking at delta percentages under 4K shows that AMD’s variance has once again risen slightly compared to the variance at 2560x1440, but not significantly so. The 290X CF still holds under 10% here.
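The delta percentages discussed above are a frame pacing metric derived from per-frame render times. As a hedged sketch of how such a metric can be computed (a simplified formulation for illustration, not necessarily the review’s exact formula): average the absolute frame-to-frame time differences and normalize by the average frame time, so a lower percentage means more consistent pacing.

```python
def delta_percentage(frame_times_ms: list[float]) -> float:
    """Mean absolute frame-to-frame time delta, as a percentage of
    the mean frame time. Lower values mean more consistent pacing."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_time

# Illustrative frame-time traces (ms), not measured data:
smooth = [16.7, 16.9, 16.6, 16.8, 16.7]   # steady ~60fps pacing
stutter = [16.7, 25.0, 12.0, 24.0, 13.0]  # alternating fast/slow frames
print(delta_percentage(smooth))   # low single digits: well paced
print(delta_percentage(stutter))  # far higher: visible micro-stutter
```

The stutter trace averages a similar framerate to the smooth one, which is exactly why a variance metric like this catches AFR problems that average fps hides.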

396 Comments
  • Blamcore - Friday, October 25, 2013 - link

    Wow, I was just remarking yesterday that NV fanbois had sunk to the level of Apple fanbois, when I was seeing the argument "you just like AMD because you can't afford NV" on a few boards. Now here is the Apple fanbois' famous argument, "my company is better because they have a higher profit margin". Gratz, your unreasonable bias just went up a level!
    I know, you aren't a fanboy, you are really a business expert here to recommend that a company should gain market share by releasing a card roughly equal to what its competitor had out for months and pricing it the same as they do! Maybe they could have asked $650 if they had released it last January.
  • puppies - Saturday, October 26, 2013 - link

    R+D costs are recouped from the sale price of the card. Are you trying to claim a $300 GPU costs $300 in materials? R+D costs are also offset by the fact that shrinking the process lets the manufacturer get more chips per wafer each time.

    Look at Intel and AMD: their chips don't go up in price each time they get faster; they stay at the same price point. The last 2 cards I have bought have been Nvidia but the next one will be AMD at this rate. I expect a 660TI to be faster and more energy efficient than a 560TI and at the same price point WHEN IT IS RELEASED, and I think a lot of people are in the same boat. Nvidia is trying to push people into spending more each time they release a new model lineup and it stinks.

    I don't care if a 660 is faster than a 560TI, forcing people to move down the GPU lineup just smacks of NVIDIA price gouging.
  • Samus - Thursday, October 24, 2013 - link

    I have to disagree with you Berzerker. Although his post clearly "overpromotes" the 290, it is incredible value when you consider it is faster and cheaper (by hundreds of dollars) than the Titan.

    -Geforce 660TI owner
  • Laststop311 - Thursday, October 24, 2013 - link

    For people that value a quiet computer, this card is trash
  • Spunjji - Friday, October 25, 2013 - link

    For people that value a quiet computer, all stock coolers are useless.

    People that value a truly quiet computer won't even be playing at this end of the GPU market.
  • Samus - Friday, October 25, 2013 - link

    This card is a great candidate for water cooling since the back of the PCB is essentially empty. Water cooling the face side is cheaper/easier, and this card can clearly use it.
  • HisDivineOrder - Friday, October 25, 2013 - link

    He didn't say "silent." He said "quiet." I'd argue the Titan/780/690 coolers were all "quiet," but not "silent."

    Since he said quiet, I don't think it's unreasonable to expect a certain level of "quiet" at the $500+ range of discrete cards.
  • Nenad - Friday, October 25, 2013 - link

    The 780 with its stock cooler is not useless, and it IS quiet (it is not 'silent').
    BTW, going by the posted numbers it seems the 290X will be TWICE as noisy as the GTX 780?
  • ballfeeler - Thursday, October 24, 2013 - link

    Methinks Berzerker7 is just salty and perhaps partial to nvidia.  Nothing itchy wrote is inaccurate, including the $550 price that Salty-Berzerker7 claimed was $600. 

    - Fastest card? Yup.

    - Free game? Yup.

    - Pooped all over Titan? Yup.

    Do not be salty, Mr. Berzerker7. AMD just roundhouse kicked nvidia square in the gonads with performance above Titan for half the price.
  • Shark321 - Thursday, October 24, 2013 - link

    At 1080p it's actually slower than Titan if you average all the reviews across sites, and in some reviews it's even slightly slower than the 780. It's also the loudest card ever produced after 30 minutes of playing (9.6 sone in Battlefield 3 according to PCGamesExtreme). At this noise level it's not acceptable, and there will be no other coolers for the time being.
