*We are currently in the middle of revisiting our CPU gaming benchmarks, but the new suite was not ready in time for this review. We plan to add some new games (Borderlands 3, Gears Tactics) and also upgrade our gaming GPU to an RTX 2080 Ti.

Gaming: Ashes Classic (DX12)

Seen as the holy child of DirectX12, Ashes of the Singularity (AoTS, or just Ashes) was the first title to actively explore as many DirectX12 features as it possibly could. Stardock, the developer behind the Nitrous engine that powers the game, has ensured that the real-time strategy title takes advantage of multiple cores and multiple graphics cards in as many configurations as possible.

As a real-time strategy title, Ashes is all about responsiveness, during both wide-open shots and concentrated battles. With DirectX12 at the helm, the ability to issue more draw calls per second allows the engine to work with substantial unit depth and effects that other RTS titles had to rely on combined draw calls to achieve, which ultimately made some combined unit structures very rigid.

Stardock clearly understands the importance of an in-game benchmark, and ensured that such a tool was available and capable from day one; with all the additional DX12 features in use, being able to characterize how they affected the title was important for the developer. The in-game benchmark performs a four-minute fixed-seed battle environment with a variety of shots, and outputs a vast amount of data to analyze.

For our benchmark, we run Ashes Classic: an older version of the game from before the Escalation update. The reason is that this version is easier to automate, as it has no splash screen, while still offering strong visual fidelity to test.

Ashes has dropdown options for MSAA, Light Quality, Object Quality, Shading Samples, Shadow Quality, and Textures, plus separate options for the terrain. There are several presets, from Very Low to Extreme: we run our benchmarks at the settings above, and take the frame-time output for our average and percentile numbers.
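As a rough illustration of how average FPS and a 95th-percentile figure can be derived from frame-time output, here is a minimal sketch in Python. It assumes a hypothetical frametimes.txt containing one frame time in milliseconds per line; the actual format of the benchmark's export will differ.

# Minimal sketch: derive average FPS and a 95th-percentile figure from frame times.
# Assumes a hypothetical "frametimes.txt" with one per-frame time (in ms) per line.
import numpy as np

frame_times_ms = np.loadtxt("frametimes.txt")        # per-frame render times, ms

average_fps = 1000.0 / frame_times_ms.mean()         # mean frame time -> average FPS
p95_frame_time = np.percentile(frame_times_ms, 95)   # boundary of the slowest 5% of frames
p95_fps = 1000.0 / p95_frame_time                    # expressed as an FPS figure

print(f"Average FPS: {average_fps:.1f}")
print(f"95th percentile: {p95_fps:.1f} FPS ({p95_frame_time:.2f} ms)")

Reporting the percentile as an FPS number (rather than a raw frame time) matches how review charts typically present it: a higher 95th-percentile FPS means fewer slow outlier frames.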

All of our benchmark results can also be found in our benchmark engine, Bench.

[Charts: Ashes Classic Average FPS and 95th Percentile results at IGP, Low, Medium, and High settings]


Comments

  • destorofall - Thursday, May 7, 2020 - link

    you sound butthurt
  • 0ldman79 - Thursday, May 7, 2020 - link

Heaven forbid his data set of God knows how many CPUs doesn't include the one you want to see...

    Damn, you really should demand a refund.
  • LMonty - Thursday, May 7, 2020 - link

    You should really file a complaint, buddy. Gotta fight for your rights. ;P
  • jimbo2779 - Sunday, May 10, 2020 - link

What has happened to the comments section here? Can we go back to just ignoring the ignoramuses? It often means they just go away.
  • psychobriggsy - Thursday, May 7, 2020 - link

    It was mentioned that Intel didn't even send these CPUs out for review, and that they're hard to obtain because Intel isn't making many of them.

    However, a few more data points would be nice. I think Ian needs to set up a system test datacentre like Phoronix so the rebuilding is kept to a minimum!
  • twizzlebizzle22 - Thursday, May 7, 2020 - link

AMD must have sent the 7700K or specified its use. I've noticed every review using that specific CPU. AMD is aiming for used-market upgraders, it seems.
  • amrnuke - Thursday, May 7, 2020 - link

    I believe that's the last Intel chip that was 4C/8T as well, right? Seems a fair comparison, I guess if AMD really think that's the market.

Anyway, TechPowerUp went ahead and lined up the 3300X against a bunch of other relevant chips (https://www.techpowerup.com/review/amd-ryzen-3-330...). It's 1% slower than the 3600 at 720p gaming, 16.5% slower than the 9900K at 720p gaming.

    CPU tests show the 4C/8T 3300X holding up well to the 6C/6T 8600K and 9400F. It pretty well trounces the 9100F.

    The 3100 beats the 9100F by 14% in CPU tests.
  • schujj07 - Thursday, May 7, 2020 - link

    720p gaming isn't even relevant. If these were iGPU tests then sure, but even a GTX 1050 can do better than 720p gaming.
  • supdawgwtfd - Thursday, May 7, 2020 - link

    Are you stupid?

To test CPU performance, you run at a lower resolution to ensure the CPU is the bottleneck.
Your comment is not relevant.
  • schujj07 - Saturday, May 9, 2020 - link

Hence why most review sites use 1080p. 720p benchmarking on modern hardware is akin to Quake 3 benchmarking at 640x480 back in 2000. All you end up seeing are crazy high numbers that don't mean anything. We see it all the time that CPU A is faster at 720p but then slower at 1080p.
