Crysis 3

Still one of our most punishing benchmarks, Crysis 3 needs no introduction. With Crysis 3, Crytek went back to trying to kill computers, and the game still holds the “most punishing shooter” title in our benchmark suite. Only a handful of setups can even run Crysis 3 at its highest (Very High) settings, and that’s still without AA. Crysis 1 was an excellent template for the kind of performance required to drive games for the next few years, and Crysis 3 looks to be much the same for 2014.

Crysis 3 - 3840x2160 - High Quality + FXAA

Crysis 3 - 3840x2160 - Low Quality + FXAA

Crysis 3 - 2560x1440 - High Quality + FXAA

Crysis 3 - 1920x1080 - High Quality + FXAA


Always a punishing game, Crysis 3 ends up being one of the only titles where the GTX 980 doesn’t take a meaningful lead over the GTX 780 Ti. To be clear, the GTX 980 wins most of these benchmarks, but not all of them, and even when it does win the GTX 780 Ti is never far behind. As a result, the GTX 980’s lead over the GTX 780 Ti and the rest of our single-GPU video cards is never more than a few percent, even at 4K. At 1440p the tables are turned outright, with the GTX 980 taking a 3% deficit; this is the only time the GTX 980 will lose to NVIDIA’s previous-generation consumer flagship.

As for the comparison against AMD’s cards, NVIDIA has been doing well in Crysis 3 and that extends to the GTX 980 as well. The GTX 980 takes a 10-20% lead over the R9 290XU depending on the resolution, with its advantage shrinking as the resolution grows. During the launch of the R9 290 series we saw that AMD tended to do better than NVIDIA at higher resolutions, and while this gap has narrowed some, it has not gone away. AMD remains the most likely to pull even with the GTX 980 at 4K resolutions, despite the additional ROPs available to the GTX 980.

This is also the worst showing for the GTX 980 relative to the GTX 680. The GTX 980 is still well in the lead, but below 4K that lead is just 44%. NVIDIA can’t even manage a 50% improvement over the GTX 680 in this game until we finally push the GTX 680 out of its comfort zone at 4K.

All of this points to Crysis 3 being very shader limited at these settings. NVIDIA has significantly improved CUDA core occupancy on Maxwell, but in these extreme situations the GTX 980 still struggles with its CUDA core deficit versus GK110, and with its relatively modest 33% increase in CUDA cores versus the GTX 680. If anything this is a feather in Kepler’s cap, showing that it’s not entirely outclassed when given a workload that maps well to its more ILP-sensitive shader architecture.

Crysis 3 - Delta Percentages

Crysis 3 - Surround/4K - Delta Percentages

The delta percentage story continues to be unremarkable with Crysis 3. GTX 980 does technically fare a bit worse, but it’s still well under 3%. Keep in mind that delta percentages do become more sensitive at higher framerates (there is less absolute time to pace frames), so a slight increase here is not unexpected.
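As a refresher, the delta percentage metric expresses frame-to-frame time variance as a fraction of the average frame time, so a perfectly paced run scores 0% and stuttery runs score higher. A minimal sketch of that calculation (the function name and sample data are illustrative, not our actual benchmarking tooling):

```python
def delta_percentage(frame_times_ms):
    """Mean absolute difference between consecutive frame times,
    normalized by the mean frame time and expressed as a percent."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    mean_delta = sum(deltas) / len(deltas)
    mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * mean_delta / mean_frame_time

# An evenly paced ~60fps run scores well under 3%...
smooth = [16.7, 16.6, 16.8, 16.7, 16.6]
print(delta_percentage(smooth))

# ...while an oscillating, stuttery run scores far higher.
spiky = [10.0, 25.0, 10.0, 25.0, 10.0]
print(delta_percentage(spiky))
```

This normalization is also why higher framerates make the metric more sensitive: the same absolute millisecond variation is divided by a smaller average frame time.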

Comments

  • nathanddrews - Friday, September 19, 2014 - link

    http://www.pcper.com/files/review/2014-09-18/power...
  • kron123456789 - Friday, September 19, 2014 - link

    Different tests, different results. That's nothing new.
  • kron123456789 - Friday, September 19, 2014 - link

But I still think that Nvidia hasn't understated the TDP of the 980 and 970.
  • Friendly0Fire - Friday, September 19, 2014 - link

Misleading. If a card pumps out more frames (which the 980 most certainly does), it's going to drive up requirements for every other part of the system, AND it's obviously going to draw its maximum possible power. If you were to lock the framerate to a fixed value that all GPUs could reach, the power savings would be more evident.

    Also, TDP is the heat generation, as has been said earlier here, which is correlated but not equal to power draw. Heat is waste energy, so the less heat you put out the more energy you actually use to work. All this means is that (surprise surprise) the Maxwell 2 cards are a lot more efficient than AMD's GCN.
  • shtldr - Wednesday, September 24, 2014 - link

    "TDP is the heat generation, as has been said earlier here, which is correlated but not equal to power draw."
    The GPU is a system which consumes energy. Since the GPU does not use that energy to create mass (materialization) or chemical bonds (battery), where the energy goes is easily observed from the outside.
    1) waste heat
    2) moving air mass through the heatsink (fan)
    3) signalling over connects (PCIe and monitor cable)
    4) EM waves
    5) degradation/burning out of card's components (GPU silicon damage, fan bearing wear etc.)
    And that's it. The 1) is very dominant compared to the rest. There's no "hidden" work being done by the card. It would be against the law of conservation of energy (which is still valid, as far as I know).
  • Frenetic Pony - Friday, September 19, 2014 - link

That's a misunderstanding of what TDP has to do with desktop cards. For mobile stuff, sure, it matters. But the bottleneck for "Maxwell 2" isn't TDP, it's clockspeed. Meaning the efficiency argument is useless if the end user doesn't care.

Now, for certain fields the end user cares very much. Miners have apparently all moved on to ASICs, but for other compute workloads any end user is going to choose NVIDIA currently, just to save on their electricity bill. For the consumer end user, TDP doesn't matter nearly as much unless you're really "green" conscious or something. In that case AMD's one-year-old 290X competes on price for performance, and whatever AMD's update is, it will do better.

It's hardly a death knell for AMD, though it's not the best thing considering they were just outclassed for corporate-type compute work. But your typical consumer end user isn't going to see any difference unless they're a fanboy one way or another, and why bother going after a strongly biased market like that?
  • pendantry - Friday, September 19, 2014 - link

While it's a fair argument that unless you're environmentally inclined the energy savings from lower TDP don't matter, I'd say a lot more people do care about reduced noise and heat. People generally might not care about saving $30 a year on their electricity bill, but why would you choose a hotter, noisier component when there's no price or performance benefit to that choice?

    AMD GPUs now mirror the CPU situation where you can get close to performance parity if you're willing to accept a fairly large (~100W) power increase. Without heavy price incentives it's hard to convince the consumer to tolerate what is jokingly termed the "space heater" or "wind turbine" inconvenience that the AMD product presents.
  • Laststop311 - Friday, September 19, 2014 - link

Actually the GPUs from AMD do not mirror the CPU situation at all. AMD's FX-9xxx, huge TDP and all, gets so outperformed by even the i7-4790K on almost everything, and the 8-core i7-5960X obliterates it in everything; the performance of its CPUs is NOT close to Intel's even with 100 extra watts. At least with the GPUs the performance is close to NVIDIA's even if the power usage is not.

TL;DR: AMD's GPU situation does not mirror its CPU situation. The CPU situation is far worse.
  • Laststop311 - Friday, September 19, 2014 - link

I as a consumer greatly care about efficiency, TDP, heat, and noise, not just performance. I do not like hearing my PC. I switched to all Noctua fans, all-SSD storage, and a platinum-rated PSU that only turns on its fan over a 500 watt load. The only noise coming from my PC is basically my Radeon 5870. So the fact that this GPU is super quiet means that no matter what AMD does performance-wise, if it can't keep up noise-wise, they lose a sale with me, and I'm sure many others.

And I'm not a fanboy of either company; I chose the 5870 over the GTX 480 when NVIDIA botched that card and made it a loud, hot behemoth. And I'll just as quickly ditch AMD for NVIDIA for the same reason.
  • Kvaern - Friday, September 19, 2014 - link

    "For the consumer end user, TDP doesn't matter nearly as much unless you're really "Green""

Or live in a country where taxes make up 75% of your power bill.
