Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 (making it our obligatory UE3 game), Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock’s performance is like.

[Chart: Bioshock Infinite - 3840x2160 - Ultra Quality + DDoF]

[Chart: Bioshock Infinite - 3840x2160 - Medium Quality]

[Chart: Bioshock Infinite - 2560x1440 - Ultra Quality + DDoF]

At Bioshock’s highest quality settings the game generally favors NVIDIA’s GPUs, particularly since NVIDIA’s most recent driver release. As a result the 295X2 comes up short of 60fps at Ultra quality at 2160p, and otherwise trails the GTX 780 Ti SLI at both 2160p and 1440p. However it’s interesting to note that at 2160p with Medium quality (a compromise setting mostly for testing single-GPU setups at this resolution) the 295X2 jumps ahead of NVIDIA’s best, illustrating that what’s ultimately dragging down AMD’s performance in this game is a greater degree of bottlenecking from Bioshock’s Ultra quality effects.

[Chart: Bioshock Infinite - Delta Percentages]

[Chart: Bioshock Infinite - Surround/4K - Delta Percentages]

Meanwhile our first set of frame pacing benchmarks has more or less set the stage. Thanks to its XDMA engine the 295X2 is able to deliver acceptable frame pacing at both 1440p and 2160p, though at 1440p in particular NVIDIA technically fares better than AMD. The Radeon HD 7990, for its part, offers a solid example of how AMD’s older GCN 1.0 based dual-GPU card still has great difficulty with frame pacing at higher resolutions.
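
As a point of reference for reading the delta percentage charts, below is a minimal sketch of a frame pacing metric in Python. The formula here is an assumption (mean absolute frame-to-frame time difference as a percentage of the mean frame time), not necessarily the exact one behind the charts above:

    # Hypothetical frame pacing metric: mean absolute difference between
    # consecutive frame times, as a percentage of the mean frame time.
    # Evenly paced frames score near 0%; alternating short/long frames
    # (classic dual-GPU micro-stutter) score far higher.
    def delta_percentage(frame_times_ms):
        if len(frame_times_ms) < 2:
            return 0.0
        deltas = [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]
        mean_delta = sum(deltas) / len(deltas)
        mean_frame_time = sum(frame_times_ms) / len(frame_times_ms)
        return 100.0 * mean_delta / mean_frame_time

    print(delta_percentage([16.7] * 8))       # 0.0 -- perfectly even pacing
    print(delta_percentage([8.0, 25.0] * 4))  # ~103 -- same average fps, heavy stutter

Both example sequences deliver the same average frame rate, which is exactly why average fps alone can hide the kind of stuttering the 7990 exhibits.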

Comments

  • CiccioB - Tuesday, April 8, 2014

    Well, no, not exactly. Not being PCIe compliant is one thing, and that's something I can understand. Going beyond the connectors' electrical power specifications is another. If they had put in 3 connectors I would not have had any problem. But as it is they are pushing past component specifications, not just guidelines on maximum size and power draw.
  • meowmanjack - Tuesday, April 8, 2014

    If you look at the datasheet for the power connector (I'm guessing on the part number, but the Molex part linked below should at least be similar enough), each pin is rated for 23 A and the housing can support a full load on each pin. Even if only 3 pairs are passing current, the connector can deliver over 800W at 12V.

    The limiting factor for how much power can be drawn from that connector is going to be the copper width and thickness on the PCB. If AMD designed the board to carry ~20 A off each connector (which they presumably have), it won't cause a problem.
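
    As a quick sanity check of the arithmetic above, here is a minimal sketch in Python; it simply takes the commenter's 23 A per-pin figure and the 12 V rail at face value:

        # Sanity check of the connector math above.
        # Assumes the 23 A per-pin rating from the (guessed) Molex datasheet
        # and the standard 12 V rail used for PCIe power.
        PIN_RATING_A = 23   # rated current per pin
        RAIL_V = 12         # supply voltage
        ACTIVE_PAIRS = 3    # 12 V / ground pin pairs actually carrying current

        print(ACTIVE_PAIRS * PIN_RATING_A * RAIL_V)  # 828 -- "over 800W at 12V"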
  • meowmanjack - Tuesday, April 8, 2014

    Oops, forgot the datasheet
    http://www.molex.com/molex/products/datasheet.jsp?...
  • behrouz - Tuesday, April 8, 2014

    Thanks for the link, finally my doubts were resolved.
  • Ian Cutress - Tuesday, April 8, 2014

    Most of the power will be coming from the PCIe power connectors, not the slot itself. If you have 5/6/7 in a single system, then yes, you might start to see issues without the appropriate motherboard power connectors.
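
    For context, the spec-level budget being referenced works out as follows; a quick sketch using the standard PCIe limits (75 W from the slot, 150 W per 8-pin connector) and the 295X2's two 8-pin connectors:

        # PCIe spec power budget for a card with two 8-pin connectors,
        # versus the 295X2's ~500 W board power.
        PCIE_SLOT_W = 75    # max draw through the x16 slot per spec
        EIGHT_PIN_W = 150   # max draw per 8-pin PEG connector per spec

        spec_budget_w = PCIE_SLOT_W + 2 * EIGHT_PIN_W
        print(spec_budget_w)  # 375 -- the card's ~500 W draw means the 8-pin
                              # connectors are run well beyond their spec limit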
  • dishayu - Tuesday, April 8, 2014

    I've yet to read the review, but FIVE HUNDRED WATTS? WOW!
  • Pbryanw - Tuesday, April 8, 2014

    I'd be more impressed if it drew 1.21 Jigawatts!! :)
  • krazyfrog - Tuesday, April 8, 2014

    On the second-to-last page, the second-to-last chart shows load GPU temperature when it should show load noise levels.
  • piroroadkill - Tuesday, April 8, 2014

    Reasonable load noise and temps, high performance. Nice.

    You'll want to get the most efficient PSU you can get your mitts on, though.

    Also, a system kicking out 600 watts of heat is seriously something you wouldn't want in the same room as you. Your AC will work overtime, or you'll be sweating your ass off.

    A GPU for Siberia! But then, that's not really a downside as such, just a side effect of having a ridiculous amount of power pushing at the edges of this process node.
  • Mondozai - Tuesday, April 8, 2014

    "Reasonable noise and temps"? It is shockingly quiet during load for a dual GPU card. And it has incredibly low GPU temps, too.

    As for heat, it's not really a problem unless you have a badly ventilated room or live in a warm climate.
