Gaming Tests: Gears Tactics

Remembering the original Gears of War brings back a number of memories – some good, and some involving online gameplay. The latest iteration of the franchise, Gears Tactics, launched as I was putting this benchmark suite together. It is a high-fidelity turn-based strategy game with an extensive single-player mode. As with many turn-based games, there is ample opportunity to crank up the visual effects, and here the developers have put a lot of effort into those effects, a number of which appear to be CPU limited.

Gears Tactics has an in-game benchmark: roughly 2.5 minutes of AI gameplay that starts from the same position but uses a random seed for its actions. Much like the racing games, this leads to some variation in the run-to-run data, so for this benchmark we take the geometric mean of the results. One of the more notable features of Gears Tactics is its resolution scaling, which goes all the way up to 8K, and so we are testing the following settings:

  • 720p Low, 4K Low, 8K Low, 1080p Ultra
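The geometric mean mentioned above damps the effect of a single outlier run more than an arithmetic average would. A minimal sketch of that aggregation, using hypothetical per-run FPS numbers (not actual results from the article):

```python
import math

def geomean(values):
    """Geometric mean of positive per-run FPS results.

    Computed via logs to avoid overflow on long runs of large values.
    """
    return math.exp(sum(math.log(v) for v in values) / len(values))

# Hypothetical average-FPS results from four runs of one setting
runs = [92.4, 95.1, 90.8, 93.6]
print(round(geomean(runs), 1))
```

Python 3.8+ also ships `statistics.geometric_mean`, which does the same thing; the explicit version is shown here to make the calculation visible.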

For results, the game displays a mountain of data when the benchmark finishes, such as how much of the run was CPU limited and where. However, none of that is ever exported to a file we can use – it is just a screenshot that we have to read manually.

If anyone from the Gears Tactics team wants to chat about building a benchmark platform – one that would help not only me but every other member of the tech press build benchmark suites that help readers decide which hardware is best for their games – please reach out to ian@anandtech.com. Some of the suggestions I want to give would take less than half a day to implement, and having the benchmark used over the next couple of years (or more) is easily free advertising.

As with the other benchmarks, we repeat runs until 10 minutes have passed per resolution/setting combination. For this benchmark, we manually read the screenshot for each quality/setting/run combination. The benchmark also reports 95th percentiles and frame averages, so we can use both of these data points.
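The fixed time budget per setting means faster systems simply accumulate more runs. A sketch of that loop, where `run_benchmark` is a hypothetical callable standing in for one pass of the in-game benchmark (returning an average-FPS / 95th-percentile pair, since that is what the results screen reports):

```python
import time

TIME_BUDGET_S = 600  # 10 minutes per resolution/setting combination

def collect_runs(run_benchmark, budget_s=TIME_BUDGET_S):
    """Repeat the benchmark until the time budget is exhausted.

    Returns a list of (avg_fps, p95_fps) tuples, one per completed run,
    ready for aggregation (e.g. a geometric mean across runs).
    """
    results = []
    start = time.monotonic()
    while time.monotonic() - start < budget_s:
        results.append(run_benchmark())
    return results
```

A run already in flight when the budget expires is allowed to finish, which matches the "as many runs until 10 minutes has passed" framing rather than hard-stopping mid-run.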

[Charts: Average FPS and 95th Percentile results for Low Resolution Low Quality, Medium Resolution Low Quality, High Resolution Low Quality, and Medium Resolution Max Quality.]

All of our benchmark results can also be found in our benchmark engine, Bench.

126 Comments

  • 1_rick - Monday, January 4, 2021 - link

    Because you've got the people who will spend any amount of money to get 5fps more in their games so they can smugly tell everyone who they've got the best.
  • lopri - Monday, January 4, 2021 - link

    I see Ryzens beating this thing by sizeable margins in games.
  • zodiacfml - Monday, January 4, 2021 - link

    Ryzen 5000 series is significantly faster than Intel's i9-10900k in all games though I haven't seen compared with overclocks. The Intel gets good at rendering/encode but I'd rather buy old Xeons with Chinese motherboards for those loads
  • V3ctorPT - Monday, January 4, 2021 - link

    In gaming the real star is the 5600X... awesome performance for its price, for a 65W(!) CPU...
  • lmcd - Monday, January 4, 2021 - link

    It's basically an 80W CPU though lol
  • Crazyeyeskillah - Monday, January 4, 2021 - link

    my 5600x is 10-20c hotter than my 3600 clock for clock on the same exact rig and watercooler.
  • JessNarmo - Monday, January 4, 2021 - link

I was considering the 10850k as an upgrade option when I saw it for $400. It's undeniably a significantly better deal than the 10900k at $530.

    But ultimately decided that it's just not good enough for an upgrade because it still doesn't support PCIE 4 so if I upgrade I would have to upgrade again very shortly.

    Would have to wait for 5900x availability or maybe intel will come up with something better.
  • edzieba - Monday, January 4, 2021 - link

    The same argument can be made for the 5900x and PCIe 5 (or DDR 5). There will always be a new protocol, or new interface, or etc on the horizon.
  • JessNarmo - Monday, January 4, 2021 - link

    Disagree. Right now I have the same Skylake cores running 5Ghz and the same PCIE 3, the same everything and it's still fine except I have less cores.

    With 5900x I'll get better single thread and multi thread performance as well as PCIE4 which is really important for future GPU's and upcoming upgrades unlike PCIE5 which isn't important at all at this point in time.
  • MDD1963 - Monday, January 4, 2021 - link

    PCI-e 4.0 was going to be 'critical' for GPUs to get best performance from a 3080/3090...; instead, it was/is still a non-player. Maybe that will change for next gen. Maybe not.
