DirectX 12 vs. DirectX 11

Now that we’ve had the chance to look at DirectX 12 performance, let’s take a look at things with DirectX 11 thrown into the mix. As a reminder, while the two rendering paths are graphically identical, the DirectX 12 path adds multi-core scalability along with asynchronous shading functionality. The game and the underlying Nitrous engine are designed to take advantage of both, but particularly the multi-core functionality, as the game pushes some very high batch counts.
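To illustrate the multi-core difference at a conceptual level, here is a rough, hypothetical Python sketch (not actual engine or Direct3D code, and the function names are made up for illustration): a DX11-style driver effectively serializes batch recording on one thread, while a DX12-style application can have each worker thread record its own command list in parallel before a single submission.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for recording one draw batch into a command list.
def record_batch(batch_id):
    return f"cmd[{batch_id}]"

def dx11_style_submit(batches):
    # DX11-style: batch recording is serialized on a single thread.
    return [record_batch(b) for b in batches]

def dx12_style_submit(batches, workers=4):
    # DX12-style: each worker records its own command list concurrently;
    # the application then submits all of the lists at once.
    if not batches:
        return []
    chunk = -(-len(batches) // workers)  # ceiling division
    groups = [batches[i:i + chunk] for i in range(0, len(batches), chunk)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        lists = pool.map(lambda g: [record_batch(b) for b in g], groups)
    return [cmd for cmd_list in lists for cmd in cmd_list]

batches = list(range(8))
assert dx11_style_submit(batches) == dx12_style_submit(batches)
```

In Python the GIL prevents any real speedup here; the point is only the structure: independent per-thread command lists feeding one submission, which is what lets a high-batch-count workload like Ashes scale across CPU cores under DX12 where DX11 could not.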

Ashes of the Singularity (Beta) - High Quality - DirectX 11 vs. DirectX 12

Given that we had never benchmarked Ashes under DirectX 11 before, what we had been expecting was a significant performance regression when switching to it. Instead what we found was far more surprising.

On the RTG side of matters, there is a large performance gap between DX11 and DX12 at all resolutions, one that increases with the overall performance of the video card being tested. Even on the R9 290X and the 7970, using DX12 is a no-brainer, as it improves performance by 20% or more.

The big surprise however is with the NVIDIA cards. For the more powerful GTX 980 Ti and GTX 780 Ti, NVIDIA doesn’t gain anything from the DX12 rendering path; in fact they lose a percent or two in performance. This means that they have very good performance under DX11 (particularly the GTX 980 Ti), but it’s not doing them any favors under DX12, where, as we’ve seen, RTG has a rather consistent performance lead. In the past NVIDIA has gone to some pretty extreme lengths to optimize the CPU usage of their DX11 driver, so this may be the payoff from general optimizations, or even a round of Ashes-specific optimizations.

Ashes of the Singularity (Beta) - High Quality 1920x1080 - DirectX 12 Perf. Gain

Breaking down the gains on a percentage basis at 1080p, the most CPU-demanding resolution, we find that the Fury X picks up a full 50% from DX12, followed by 29% and 23% for the R9 290X and 7970 respectively. Meanwhile at the opposite end of the spectrum are the GTX 980 Ti and GTX 780 Ti, which lose 1% and 3% respectively.
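For reference, the DX12 performance gain charted above is simply the relative frame rate change between the two API paths. A minimal sketch of the arithmetic (the frame rates below are made-up placeholders for illustration, not our benchmark data):

```python
def dx12_gain(dx11_fps, dx12_fps):
    """Percentage gain (negative = loss) from switching DX11 -> DX12."""
    return (dx12_fps - dx11_fps) / dx11_fps * 100.0

# Hypothetical frame rates, for illustration only:
print(round(dx12_gain(40.0, 60.0)))   # a card gaining 50% from DX12
print(round(dx12_gain(60.0, 58.2)))   # a card losing ~3%
```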

Finally, right in the middle of all of this is the GTX 680. Given what happens to the architecturally similar GTX 780 Ti, this may be a case of GPU memory limitations (this is the only 2GB NVIDIA card in this set), as there’s otherwise no reason to expect the weakest NVIDIA GPU to benefit the most from DX12.

Overall then this neatly illustrates why RTG in particular has been so gung-ho about DX12, as Ashes’ DX12 path has netted them a very significant increase in performance. To some degree however what this means is a glass half full/half empty situation; RTG gains so much from DX12 in large part because of their poorer DX11 performance (especially on the faster cards), but on the other hand a “simple” API change has unlocked a great deal of GPU power that wasn’t otherwise being used and vaulted them well into the lead. As for NVIDIA, is it that their cards don’t benefit from DX12, or is it that their DX11 driver stack is that good to begin with? At the end of the day Ashes is just a single game – and a beta game at that – but it will be interesting to see if this is a one-off situation or if it becomes recurring.


153 Comments


  • permastoned - Sunday, February 28, 2016 - link

    Wasn't trolling - there are other metrics that show the case; for you to imply that 3dmark isn't valid is just silly: http://wccftech.com/amd-r9-290x-fast-titan-dx12-en...

    Another thing; what's the deal with all these fanboys? There is no benefit to being a fanboy of either AMD or Nvidia, it is just going to cause you problems because it may cause you to buy based on brand, rather than on performance per dollar, which is the factor that actually matters. At different price ranges different brands are better - e.g. top end, a 980Ti is better than a Fury X, however if you are looking in the price bracket below, and want to buy a 980, you will get better performance and performance per dollar from a standard Fury.

    Being a fanboy will blind you from accepting the truth when the tides shift and the tables eventually turn. It helps you in no way at all, it disadvantages you in many. It also causes you to get angry on forums for no reason, and call people 'trolls' when they are stating facts.
  • Soulwager - Sunday, March 20, 2016 - link

    Poorly how, exactly? It looks to me like DX12 is just removing a bottleneck for AMD that Nvidia already fixed in DX11. It would be more correct to say that AMD has poor DX11 performance compared to Maxwell, and neither are constrained by driver overhead in DX12.
  • SunLord - Wednesday, February 24, 2016 - link

    DX12 by design will slightly favor older AMD designs, simply because of the design decisions AMD made compared to Nvidia with regards to DX11 that are paying off with DX12. Nvidia benefited from their approach with DX11 games, which is why they own around 80% or so of the gaming GPU market. How much of an impact this will have depends on the game, just like it is with DX11 games: some do better on AMD, some will be better on Nvidia.
  • anubis44 - Thursday, February 25, 2016 - link

    If results like these continue with other DX12 games, nVidia's going to be the one with only 20% in a matter of months.
  • althaz - Thursday, February 25, 2016 - link

    Even in generations where AMD/ATI have been dominant in terms of performance and value, they've still not really dominated in sales.

    Just like even when AMD's CPUs were offering twice the performance per watt and better performance per dollar, they still sold less than Intel.

    Doing it for a short time isn't enough, you have to do it for *years* to get a lead like nVidia has.

    Firstly you have to overturn brand-loyalty from complete morons (aka everybody with any brand loyalty to any company, these are corporations that only care about the contents of your wallet, make rational choices). That will happen with only a small percentage of people at a time. So you have to maintain a pretty serious lead for a long time to do it.

    AMD did manage to do it in the enthusiast space with CPUs, but (arguably due to Intel being dodgy pricks) they didn't quite turn that into mainstream market dominance. Which sucks for them, because they absolutely deserved it.

    So even if AMD maintains this DX12 lead for the rest of the year and all of the next, they'll still sell fewer GPUs than nVidia will in that time. But if they can do it for another year after that, *then* they would be likely to start winning the GPU war.

    Personally, I don't care a lot. I hope AMD do better because they are losing and competition is good. However, I will make my next purchasing decision on performance and price, nothing else.
  • Continuity28 - Wednesday, February 24, 2016 - link

    By the time DX12 becomes commonplace, I'm sure they will have cards that were built for DX12.

    It makes a lot of sense to design your cards around what will be most useful today, not years in the future when people are replacing their cards anyways. Does it really matter if AMD's DX12 performance is better when it isn't relevant, when their DX11 performance is worse when it is relevant?
  • Senti - Wednesday, February 24, 2016 - link

    Indeed it makes much sense to build cards exactly for today so people would be forced to buy new hardware next year to have decent performance. From a certain green point of view. But many people are actually hoping that their brand new mid-to-top card will last with decent performance for at least a few years.
  • cmdrdredd - Wednesday, February 24, 2016 - link

    Hardware performance for new APIs is always weak with first gen products. That isn't changing here. When there are many DX12 titles out and new cards are out there, you'll see that people don't want to try playing with their old cards and will be buying new. That's how it works.
  • ToTTenTranz - Wednesday, February 24, 2016 - link

    "Hardware performance for new APIs is always weak with first gen products."

    Except that doesn't seem to be the case with 2012's Radeon line.
