Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of Bioshock’s performance.

Bioshock Infinite - 3840x2160 - Ultra Quality + DDoF

Bioshock Infinite - 3840x2160 - Medium Quality

Bioshock Infinite - 2560x1440 - Ultra Quality + DDoF

At Bioshock’s highest quality settings the game generally favors NVIDIA’s GPUs, particularly since NVIDIA’s most recent driver release. As a result the 295X2 comes up short of 60fps at Ultra quality at 2160p, and otherwise trails the GTX 780 Ti SLI at both 2160p and 1440p. However it’s interesting to note that at 2160p with Medium quality – a compromise setting mostly for testing single-GPU setups at this resolution – the 295X2 jumps ahead of NVIDIA’s best, illustrating that what ultimately drags down AMD’s performance in this game is a greater degree of bottlenecking from Bioshock’s Ultra quality effects.

Bioshock Infinite - Delta Percentages

Bioshock Infinite - Surround/4K - Delta Percentages

Meanwhile our first set of frame pacing benchmarks has more or less set the stage. Thanks to its XDMA engine the 295X2 is able to deliver acceptable frame pacing performance at both 1440p and 2160p, though at 1440p in particular NVIDIA does technically fare better than AMD here. As for the Radeon HD 7990, this offers a solid example of how AMD’s older GCN 1.0 based dual-GPU card still has great difficulty with frame pacing at higher resolutions.

131 Comments

  • Dustin Sklavos - Tuesday, April 8, 2014 - link

    A single cable is beyond spec for the connector. We've been hearing of connectors actually melting. "Crappy" isn't really relevant here; this is the *only* card on the market that causes these kinds of problems.
  • Anders CT - Tuesday, April 8, 2014 - link

    500 watt power consumption is insane. It should come with an on-board diesel generator.
  • Blitzninjasensei - Saturday, July 12, 2014 - link

    The thought of this made my day. Thanks for the joke, needed it.
  • therfman - Tuesday, April 8, 2014 - link

    This is all very nice, but unless case space is at a premium, I fail to see the advantage of this card over two 290X cards with good coolers. The PowerColor PCS+ version of the 290X runs at 1050 MHz, is much quieter than the reference boards (40-42 dBA under load at 75cm), and is available for under $600. Is having a single-card solution worth $300 extra? Not unless you really want to have everything in a small form factor case.
  • Peeping Tom - Tuesday, April 8, 2014 - link

    Is that a giveaway I hear coming? ;-)
  • silverblue - Tuesday, April 8, 2014 - link

    Please, don't... I don't think I could stand to see a card of this calibre being offered only to those in the States... :|
  • JBVertexx - Tuesday, April 8, 2014 - link

    Is there any way to tell the temperatures of each of the two GPUs? Where does the temperature reading for the testing come from - is it an average of the 2, the hotter, or the cooler one?

    Reason I'm asking is I was skeptical a 120mm rad could effectively cool two of these GPUs. Given they are connected in series, one is bound to be measurably hotter than the other.

    Otherwise, this looks to be a winner. I was considering upgrading my uATX rig so I could do SLI. But with this card, I could keep the compact form factor.
  • JBVertexx - Tuesday, April 8, 2014 - link

    After some additional research on the web, it looks like the difference in temps between the 2 GPUs is only about 2 degrees under load, so pleasantly surprised with how well the 120mm radiator handles the cooling.
  • Ryan Smith - Tuesday, April 8, 2014 - link

    The temperature readings come from MSI Afterburner, which is capable of reading the temperatures via AMD's driver API. And unless otherwise noted, the temperature is always the hottest temperature.
  • srsbsns - Tuesday, April 8, 2014 - link

    The point of this driver was improvements to the HD 7000 series and their rebrands... Anandtech missed this by benching an already optimized 290X dual card?
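Ryan's note above about the temperature readings (per-GPU sensors, with the hottest GPU reported) can be sketched in code. This is an illustrative example, not how MSI Afterburner actually works: on Linux, AMD GPU temperatures are exposed through the kernel's hwmon sysfs interface in millidegrees Celsius, though the exact paths vary by system, so the glob pattern below is an assumption.

```python
import glob

def hottest(millidegree_readings):
    """Given raw hwmon readings in millidegrees C (one per GPU),
    return the hottest temperature in degrees C."""
    return max(m / 1000.0 for m in millidegree_readings)

def read_hottest_gpu_temp(
    pattern="/sys/class/drm/card*/device/hwmon/hwmon*/temp1_input",
):
    """Return the hottest GPU temperature in degrees C, or None if no
    sensors were found. The sysfs path pattern is illustrative and
    varies by distribution and driver."""
    readings = []
    for path in glob.glob(pattern):
        try:
            with open(path) as f:
                readings.append(int(f.read().strip()))
        except (OSError, ValueError):
            continue  # sensor missing or unreadable; skip it
    return hottest(readings) if readings else None
```

For a dual-GPU card like the 295X2, both GPUs show up as separate sensors, and reporting `hottest()` of the two matches the "always the hottest temperature" convention described above.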
