In an off year that hasn’t seen many new product releases thus far, this has been anything but a dull time. For the better part of a year now, the technology journalist community – spearheaded by The Tech Report’s Scott Wasson – has been investigating the matter of frame pacing and frame timing on GPUs. By applying new techniques and new levels of rigor, Scott found that frames were not being rendered as consistently as we had always assumed, and that cards that were equal in performance as measured by frame rates were not necessarily equal in performance as measured by frame intervals. It was AMD in particular that was battered by all of this work, with the discovery that both their single-GPU and multi-GPU products were experiencing poor frame pacing at times. AMD could meet (and beat) NVIDIA on frame rates, only to lose out on smoothness as a result of poor frame pacing.
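To make that distinction concrete, here is a minimal illustrative sketch in Python of the underlying arithmetic: two hypothetical cards log the same average frame rate, yet one delivers its frames far less evenly. The sample intervals and the 95th-percentile metric are placeholders chosen for the example, not data or metrics from our actual testing.

```python
# Illustrative only: two hypothetical cards with the same average frame
# rate but very different frame-to-frame consistency. The 25/16.6/33.4 ms
# intervals and the 95th-percentile metric are assumptions for this
# example, not measured data or this article's exact methodology.

def frame_stats(intervals_ms):
    """Average frame rate plus a rough consistency metric, computed from
    a list of frame-to-frame intervals in milliseconds."""
    avg_ms = sum(intervals_ms) / len(intervals_ms)
    p95_ms = sorted(intervals_ms)[int(0.95 * len(intervals_ms))]  # ~95th percentile
    return 1000.0 / avg_ms, p95_ms

card_a = [25.0] * 100        # perfectly even pacing: a frame every 25 ms
card_b = [16.6, 33.4] * 50   # same 25 ms average, but uneven delivery

for name, intervals in (("Card A", card_a), ("Card B", card_b)):
    fps, p95 = frame_stats(intervals)
    print(f"{name}: {fps:.1f} fps average, ~{p95:.1f} ms 95th-percentile interval")

# Both cards report ~40 fps, but Card B spends half its time at an
# effective ~30 fps cadence -- the kind of unevenness that frame rate
# averages alone hide.
```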

Since then we have seen both some progress and some new revelations on these matters. AMD was very quick to start working on resolving their single-GPU issues, and by March when they were willing and able to fully engage the tech community, they had already solved the bulk of those single-GPU issues. With those issues behind them, they also laid out a plan to tackle the more complex issue of multi-GPU frame pacing, which would involve spending a few months to write a new frame pacing mechanism for their cards.

At the same time NVIDIA also dropped a small bombshell with the public release of FCAT, their long-in-development frame interval benchmarking tool. FCAT could do what FRAPS alone could not: capture and analyze the actual output of video cards to determine frame rates, frame times, and frame intervals. Though FRAPS was generally sufficient to find and diagnose single-GPU issues, FCAT shed new light on AMD’s multi-GPU issues, painting a far more accurate, and unfortunately for AMD far more dire, picture of Crossfire frame pacing.
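For readers curious how capture-based analysis works in principle, the sketch below shows the general idea under some stated assumptions: each rendered frame is tagged with a distinct overlay color along one screen edge, and the captured output is scanned line by line to see how much of each refresh a given frame actually occupied. The color sequence and the 20-scanline runt cutoff are placeholders for illustration, not FCAT's actual code or thresholds.

```python
# Sketch of the idea behind capture-based frame analysis: every rendered
# frame carries a distinct overlay color along one screen edge, so by
# scanning the captured output line by line we can see how much of each
# refresh every frame actually occupied. The color data and the
# 20-scanline runt threshold below are placeholders, not FCAT's real values.

from itertools import groupby

RUNT_THRESHOLD = 20  # scanlines; arbitrary cutoff chosen for this example

def analyze_refresh(scanline_colors):
    """Group consecutive scanlines by overlay color and report how many
    lines each rendered frame contributed to this captured refresh."""
    segments = [(color, len(list(group)))
                for color, group in groupby(scanline_colors)]
    for color, lines in segments:
        tag = "RUNT" if lines < RUNT_THRESHOLD else "ok"
        print(f"frame color {color}: {lines:4d} scanlines  [{tag}]")
    return segments

# Hypothetical 1080-line capture: one frame shown almost fully, one runt
# sliver, then the next frame for the remainder of the refresh.
capture = ["A"] * 700 + ["B"] * 8 + ["C"] * 372
analyze_refresh(capture)
```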

Perhaps as proof that there’s no such thing as coincidence, since then we have seen the release of AMD’s latest multi-GPU monster, the Radeon HD 7990. Packing a pair of high clocked Tahiti GPUs, the 7990 was AMD’s traditional entry into the realm of $1000 multi-GPU super cards. A capable card on paper, the 7990 has been at the mercy of AMD’s drivers and lack of a frame pacing mechanism, with the previous revelations and FCAT results causing the 7990 to suffer what can only be described as a rough launch.

Ultimately when AMD engaged the community back in March they had a clear plan for addressing their multi-GPU frame pacing issues, developing a new frame pacing mechanism for their cards. AMD stated outright that this work would take a few months, something of an arduous wait for existing Crossfire users, setting a goal that the new frame pacing mechanism would “come in or around a July driver drop.” July has since come and gone by a day, but at long last AMD has completed their initial work on their new frame pacing mechanism and is releasing the first public driver today at 2pm ET as Catalyst 13.8 Beta 1.

As part of today’s launch activities, AMD seeded the beta driver to the press a week in advance to give us a chance to put it through its paces, give AMD feedback, and write up our experiences with the new driver. Over the next several pages we’ll be going over what changes AMD has made to their drivers, how they impact the 6 games we use for frame interval testing, and ultimately whether AMD has made sufficient progress in resolving their frame pacing issues. Make no mistake: AMD wants to get past these frame pacing issues as quickly as possible and remove the cloud of doubt that has surrounded the 7990 since its launch, making this driver launch an extremely important event for the company.

Comments (102)

  • chizow - Friday, August 2, 2013

    That makes sense, but I guess the bigger concern from the outset was how AMD's allowance of runt frames/microstutter in an "all out performance" mentality might have overstated their performance. You found in your review that AMD's performance typically dropped 5-10% as a result of this fix, which should certainly be considered, especially if AMD isn't doing a good job of making sure they implement this frame time fix across all their drivers, games, APIs, etc.

    Also, any word whether this is a driver-level fix or a game-specific profile optimization (like CF, SLI, AA profiles)?
  • Ryan Smith - Friday, August 2, 2013

    The performance aspect is a bit weird. To be honest I'm not sure why performance was up with Cat 13.6 in the first place. For a mature platform like Tahiti it's unusual.

    As for the fix, AMD has always presented it as being a driver level fix. Now there are still individual game-level optimizations - AMD is currently trying to do something about Far Cry 3's generally terrible consistency, for example (an act I'm convinced is equivalent to parting the Red Sea) - but the basic frame pacing mechanism is universal.
  • Thanny - Thursday, August 1, 2013

    Perhaps this will be the end of the ludicrous "runt" frame concept.

    All frames with vsync disabled are runts, since they are never completely displayed. With a sufficiently fast graphics card and/or a sufficiently simple game, every frame will be a runt even by the arbitrary definitions you find at sites like this.

    And all the while, nothing at all is ever said about the most hideous artifact of all - screen tearing.
  • Asik - Thursday, August 1, 2013

    There is a simple and definite fix for tearing artifacts and you mention it yourself - vsync. If screen tearing bothers you, and I think it should bother most people, you should keep vsync on at all times.
  • chizow - Thursday, August 1, 2013

    Vsync or frame limiters are certainly workarounds, but they also introduce input lag and largely negate the benefit of having multiple powerful GPUs to begin with. A 120Hz monitor would increase the headroom for Vsync, but by its nature it also reduces the need for Vsync (there's much less tearing).
  • krutou - Friday, August 2, 2013

    Triple buffering solves tearing without introducing significant input lag. VSync is essentially triple buffering + frame limiter + timing funny business.

    I have a feeling that Nvidia's implementation of VSync might actually not have input lag due to their frame metering technology.

    Relevant: http://www.anandtech.com/show/2794/3
  • chizow - Saturday, August 3, 2013

    Yes, this is certainly true. When I was on 60Hz I would always enable Triple Buffering when available; however, TB isn't the norm and few games implemented it natively. Even fewer implemented it correctly; most use a 3-frame render-ahead queue, similar to the Nvidia driver forcing it, which is essentially a driver hack for DX.

    Having said all that, TB still has some input lag at 120Hz, even with Nvidia Vsync, compared to 120Hz without Vsync (my preferred method of gaming now when not using 3D).
  • vegemeister - Monday, August 5, 2013

    The amount of tearing is independent of the refresh rate of your monitor. If you have vsync off, every frame rendered creates a tear line. If you are drawing frames at 80Hz without vsync, you are going to see a tear every 1/80 of a second no matter what the refresh rate of your screen is. The only difference is that a 60Hz screen would occasionally have two tear lines on screen at once.
  • chizow - Thursday, August 1, 2013

    Sorry, not even remotely close to true. Runt frames were literally tiny shreds of frames followed by full frames, unlike normal screen tearing with Vsync off that results in 1/3 or more of the frame being updated at a time, consistently.

    The difference is, one method does provide the impression of fluidity and change from one frame to the next (with palpable tearing) whereas runt frames are literally worthless unless you think 3-4 rows worth of image followed by full images provides any meaningful sense of motion.

    I do love the term "runt frame" though, an anachronism in the tech world born of AMD's ineptitude with regard to CrossFire. I for one will miss it.
  • Thanny - Thursday, August 1, 2013

    You're not making sense. All frames with vsync off are partial. The frame buffer is replaced in the middle of screen updates, so no rendered frame is ever displayed completely.

    A sense of motion is achieved by displaying different frames in a time sequence. It has nothing to do with showing parts of different frames in the same screen refresh.

    And vsync adds a maximum latency of the inverse of the screen refresh (16.67ms for a 60Hz display). On average, it will be half that. If you have a very laggy monitor (Overdrive-TN, PVA, or MVA panel types), that tiny bump from vsync might push the display lag to noticeability. For plain TN and IPS panels (not to mention CRT), there will be no detectable display lag with vsync on.
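To put rough numbers on the latency and tearing figures traded back and forth in the thread above, here is a small illustrative calculation. The refresh rates and the 80 fps render rate simply mirror the commenters' examples, and the math deliberately ignores render-ahead queues, panel response times, and other real-world factors.

```python
# Back-of-the-envelope numbers for the points argued in the comments
# above. These are simple illustrative calculations, not measurements.

def vsync_latency_ms(refresh_hz):
    """Worst-case and average display latency added by waiting for the
    next refresh (ignoring render-ahead queues and panel lag)."""
    worst = 1000.0 / refresh_hz
    return worst, worst / 2.0

def tear_interval_ms(render_fps):
    """With vsync off, a new (torn) frame boundary appears roughly once
    per rendered frame, independent of the monitor's refresh rate."""
    return 1000.0 / render_fps

for hz in (60, 120):
    worst, avg = vsync_latency_ms(hz)
    print(f"{hz} Hz: vsync adds up to {worst:.1f} ms (avg ~{avg:.1f} ms)")

print(f"Rendering at 80 fps without vsync: a tear roughly every "
      f"{tear_interval_ms(80):.1f} ms, regardless of refresh rate")
```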
