Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive on its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

Bioshock Infinite - 2560x1440 - Ultra Quality + DDoF

AMD and NVIDIA exchange places once more, with the 7990 taking a small lead over the GTX 690.

Bioshock Infinite - Delta Percentages - 2560x1440 - Ultra Quality + DDoF

AMD’s initial situation with Bioshock was not as dire as it was in, say, Battlefield 3, but with deltas approaching 60% it wasn’t pretty either. Once more they’ve managed to get their delta percentages down to around 20%, a level that is acceptable for now while leaving clear room for improvement. Even so, the 7990’s deltas remain more than twice those of the GTX 690.

On a side note, this game is a great reminder of just how much better single-GPU cards are at consistency: the best multi-GPU setup sits at 8.2%, while the worst single-GPU setup is at just 2.6%.

Graphically, things are roughly as expected. It’s interesting to note that NVIDIA encounters some significant frame time spikes that AMD doesn’t, though a single-GPU setup would sidestep the issue entirely.


Bioshock Infinite - 95th Percentile FT - 2560x1440 - Ultra Quality + DDoF

AMD’s 95th percentile improvement isn’t nearly as pronounced in Bioshock. Meanwhile, the higher variability costs them just enough for the 7990 to fall behind the GTX 690 here.
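As a rough illustration of how frame-pacing metrics like the delta percentages and 95th percentile frame times charted above can be derived from captured frametimes, here is a minimal sketch. The formulas are plausible approximations for illustration only (the frame-to-frame delta averaged and normalized by the mean frame time, plus a nearest-rank percentile); they are not necessarily the exact methodology behind these charts, and the sample numbers are invented.

```python
def delta_percentage(frame_times_ms):
    """Average frame-to-frame time difference, as a percentage of the
    average frame time. Higher values mean less consistent pacing."""
    deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
    avg_delta = sum(deltas) / len(deltas)
    avg_frame = sum(frame_times_ms) / len(frame_times_ms)
    return 100.0 * avg_delta / avg_frame

def percentile_frame_time(frame_times_ms, pct=95):
    """Frame time at the given percentile (nearest-rank method)."""
    ordered = sorted(frame_times_ms)
    rank = max(0, round(pct / 100.0 * len(ordered)) - 1)
    return ordered[rank]

# Invented example: a run at ~60 fps with one stutter spike.
times = [16.7, 16.9, 16.5, 33.0, 16.8, 16.6, 17.0, 16.7, 16.4, 16.9]
print(round(delta_percentage(times), 1))   # ≈ 21.2 (the spike dominates)
print(percentile_frame_time(times, 95))    # 33.0 ms
```

Note how a single 33 ms frame in an otherwise steady 16.7 ms run drives the delta percentage past 20%, which is why these metrics surface microstutter that a plain average frame rate hides.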

Comments

  • chizow - Wednesday, August 7, 2013 - link

    There were discussions of microstutter associated with multi-GPU on various forums, but PCGH was the first site to publish its findings in detail, with both video evidence and hard data. From what I remember, they were the first to develop the methodology of using FRAPS frametimes and graphing the results to illustrate microstutter.
  • BrightCandle - Friday, August 2, 2013 - link

    One of the most shocking revelations to me is that AMD's quality assurance did not include checking the output of their cards frame by frame. I had always assumed that both NVIDIA and AMD had HDMI/DVI/VGA recorders that allowed them to capture the output of their cards so they could check them pixel by pixel, frame by frame, and presumably verify they were correct automatically.

    Such a technology would clearly have shown the problem immediately. I am stunned that these companies don't do that. Even FCAT is a blunt tool, as it doesn't say anything about the contents of the frames. We still don't have any way to measure end-to-end latency for comparison either. All in all there is much left to do, and I am not confident that either company is testing these products well; I just couldn't believe that AMD wasn't testing theirs for consistency at all (it was obvious when you played it that something was wrong).
  • krutou - Friday, August 2, 2013 - link

    AMD is in the business of being the best performance per price entry in every market segment. Technology and quality come second.

    How often does AMD introduce and/or develop technologies for their graphics cards? The only two that come to mind are Eyefinity and TressFX (100 times more overhyped than PhysX).
  • Death666Angel - Saturday, August 3, 2013 - link

    I think ATI had tessellation in their old DX8 chips. NVIDIA bought PhysX, so that shouldn't count. But I don't really see how having exclusive technology usable by a single GPU vendor is a good thing. We need standardization, with everybody having access to the same technologies (albeit with different performance deltas). Look at the gimmicky state of PhysX and imagine what it could be if NVIDIA allowed it to be fully utilized by CPUs and AMD GPUs.
  • krutou - Saturday, August 3, 2013 - link

    Because OpenCL and TressFX are doing so well, right?
  • bigboxes - Sunday, August 4, 2013 - link

    March on, fanboi.
  • JamesWoods - Sunday, August 4, 2013 - link

    If you think that is all AMD/ATI has ever done for graphics then you sir, are ignorant. I was going to use a more degrading word there and thought better of it.
  • Will Robinson - Friday, August 2, 2013 - link

    LOL...what a load of tosh.
    "NVDA had to take them by the hand"?
    You and Wreckage ought to post in green text.
  • chizow - Friday, August 2, 2013 - link

    Agree with pretty much all of this, although I would direct a lot of the blame at AMD's most loyal, enthusiastic supporters as well. Every time microstutter was mentioned and identified as being worst with AMD solutions, AMD's biggest fans would get hyperdefensive about it. If those most likely to have the problem were too busy denying any problem existed, it really should be no surprise it was never fixed.

    And this is the result: years of denial and broken CF, finally fixed thanks to scrutiny from the press and laughter from NVIDIA fans, which brought this to a head and forced AMD to take a closer look and formulate a solution.
  • EJS1980 - Friday, August 2, 2013 - link

    "Truth favors not one side."
