Early in December 2012, I posted a short article about the Fire Strike trailer for the next 3DMark with the note that the full release was promised before the end of 2012. Insert the obligatory “the best laid plans” quote or something similar, but the story as usual is that there have been some additional delays. Futuremark sent out an email today explaining the reason for the delays, and notably absent this time around are any firm release dates other than, “sooner rather than later.” Here’s the pertinent information from Futuremark’s President, Oliver Baltuch:

We work with many of the world's leading technology companies when creating our benchmarks. We work closely with the engineers and technicians at AMD, Intel, Microsoft, NVIDIA and many other companies, benefiting from their insights and expertise. We work with our Benchmark Development Program partners from start to finish, from the initial specification document to the final software release. BDP members receive every development build and have access to source code to see how the benchmark functions.

We believe that this open process of close cooperation with industry experts is the only way to create accurate and impartial benchmarks that measure performance fairly. Having high-level access to the industry's leaders also ensures that our benchmarks are not only relevant for today's hardware, but remain relevant year after year.

If there is a downside, it's that it takes time to gather and resolve the feedback from so many partners, and that is where we find ourselves today. I hope this helps you understand why I cannot confirm a launch date right now. Simply put, 3DMark will be ready when it's ready, which we expect will be sooner rather than later. What is certain is that the new 3DMark will be our best benchmark yet and well worth waiting for. You can find out more from our website: www.futuremark.com/3dmark

One of the interesting points is that with all the partners Futuremark has on board, hopefully we won't see any shameful benchmark-specific optimizations targeted at a single smartphone platform. That review process is potentially part of the reason for the delays, but the other big reason is likely the support of multiple platforms—getting everything approved in three different mobile OS app stores could be a bottleneck, not to mention making sure everything works properly on each.

Something else we haven't mentioned before is that the new 3DMark will render at a fixed internal resolution and then scale the output to your screen. Regardless of your display's native resolution (on a laptop, tablet, or smartphone), the same amount of computation gets done, so scores remain comparable across devices. It's a nice feature for benchmarking, though the final results end up being more a way of doing "true" comparisons between graphics hardware than a good indication of whether device X will run game Y at a reasonable frame rate—the latter is often more a question of whether the developer takes the time to optimize for a platform than of the core hardware.
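To make the idea concrete, here's a minimal sketch of why this keeps scores comparable. The 1920x1080 internal resolution is purely an assumption for illustration (Futuremark hasn't published the actual figure): the shading workload depends only on the internal resolution, while the display resolution affects nothing but the final upscale.

```python
# Hypothetical sketch (not Futuremark's actual numbers): with a fixed internal
# resolution, the per-frame shading workload is identical on every device,
# and only the final upscale to the display differs.

INTERNAL_RES = (1920, 1080)  # assumed internal resolution, for illustration only

def shaded_pixels(internal_res=INTERNAL_RES):
    """Pixels shaded per frame -- depends only on the internal resolution."""
    width, height = internal_res
    return width * height

def upscale_factor(display_res, internal_res=INTERNAL_RES):
    """Per-axis scale applied when blitting the internal frame to the display."""
    return (display_res[0] / internal_res[0], display_res[1] / internal_res[1])

# A 1366x768 laptop and a 2048x1536 tablet do the same rendering work;
# only the (comparatively cheap) final scaling step differs between them.
laptop, tablet = (1366, 768), (2048, 1536)
print(shaded_pixels())         # 2073600 shaded pixels for both devices
print(upscale_factor(tablet))  # roughly (1.07, 1.42)
```

This is the same reason the scores measure GPU throughput rather than real-world playability: the upscale hides how the device's native resolution would load the GPU in an actual game.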

Beyond the 3DMark delays, I met with Oliver at CES to discuss what's happening and where Futuremark sees its benchmarks headed in the near future. Obviously 3DMark and PCMark will both continue to see updates, but the devices people use for everyday computing are in flux. With 3DMark going cross-platform, it will be interesting to see if PCMark follows suit—we could certainly use some additional benchmarks for testing tablets and laptops. A standardized, open way of testing other elements of performance—storage, WiFi, memory bandwidth, etc.—could also be useful in helping to improve less immediately obvious aspects of a device; eMMC storage, for instance, hasn't exactly been the speediest option. On the other hand, seeing which hardware is fastest doesn't always matter as much as determining (often subjectively) which platform is actually best.

Anyway, while Futuremark hasn't committed to a firm release date, 3DMark "Next" is likely to arrive sometime in the next month or two. How it will stack up against other benchmarks when it lands remains to be seen.

15 Comments

  • killerroach - Thursday, January 24, 2013 - link

    That being said, it's better than nothing. (Also, since either could render at resolutions other than native, it's not simply an academic consideration.)

    Even still, it should be possible to run a custom test at native resolution, not just the demo spec.
  • JarredWalton - Thursday, January 24, 2013 - link

Of course, iPad 2 is actually 2048x1536...I assume you meant iPad 2? And I think there will be an option to run custom resolutions, as has been in the past -- the defaults will simply be set resolutions with scaling.
  • JarredWalton - Thursday, January 24, 2013 - link

Erg... that was iPad 3 on the first line, obviously. :-) This new keyboard has arranged the keys such that I screw up numbers on a regular basis.
  • MrSpadge - Thursday, January 24, 2013 - link

IMO rendering internally at a (possibly) lower resolution and upscaling to the display resolution later is what mobile gaming really needs. Add UI elements at native resolution later on. This would especially help with the insane resolutions of modern fancy gadgets. In no way are their GPUs powerful enough to make proper use of this resolution anyway (except for light use.. but we don't need to talk about this anyway, right?)
  • MrSpadge - Thursday, January 24, 2013 - link

Forgot: this would also be really helpful for notebook gaming and IGPs. Someone showed it at some tech show already, but I didn't hear anything since then.
