Bioshock Infinite

Bioshock Infinite is Irrational Games’ latest entry in the Bioshock franchise. Though it’s based on Unreal Engine 3 – making it our obligatory UE3 game – Irrational has added a number of effects that make the game rather GPU-intensive at its highest settings. As an added bonus it includes a built-in benchmark composed of several scenes, a rarity for UE3 games, so we can easily get a good representation of what Bioshock Infinite’s performance is like.

As opposed to our previous game, with Bioshock Infinite the GTX 780 Ti comes out as a very strong contender, easily surpassing everything else from both AMD and NVIDIA. Here we see it best AMD’s best by 18%, and against the GTX Titan and GTX 780 it’s 7% and 20% ahead respectively. Admittedly, everything here is averaging better than 60fps at this point.

Meanwhile for the AFR matchup, with a pair of GTX 780 Ti’s we’re either looking at framerates that will make a 120Hz gamer happy, or enough horsepower to take on 4K at our highest settings and still come out well ahead. At 57.3fps the GTX 780 Ti SLI setup is several frames per second ahead of the 290X CF, coming up just short of averaging 60fps even at this very high resolution.

Comments

  • yuko - Monday, November 11, 2013 - link

    For me neither of them is a game changer... G-Sync, Shield... nice stuff I don't need.
    Mantle: another nice approach to creating a semi-closed standard. It's not as if DirectX or OpenGL don't already exist and work quite well; no, we need another low-level standard where AMD creates the API (and to be honest, they would be quite stupid not to optimize it for their own hardware).

    I don't believe in it and hope that Mantle will flop; it does no favors to customers or the industry. It's just good for marketing but has no real-world use.
  • Kamus - Thursday, November 7, 2013 - link

    Nope, it's confirmed for every Frostbite 3 game coming out, and that's at least a dozen so far, not to mention it's also officially coming to Star Citizen, which runs on CryEngine 3, I believe.
    But yes, even with those titles it's still a huge difference, obviously.

    That said, you can expect that any engine optimized for GCN on the consoles could wind up with Mantle support, since the hard work is already done. And in the case of Star Citizen... well, that's a PC exclusive, and it's still getting Mantle.
  • StevoLincolnite - Thursday, November 7, 2013 - link

    Mantle is confirmed for all Frostbite powered games.
    That is, Battlefield 4, Dragon Age 3, Mirror's Edge 2, Need for Speed, Mass Effect, Star Wars Battlefront, Plants vs. Zombies: Garden Warfare, and probably others that haven't been announced by EA yet.
    Star Citizen and Thief will also support Mantle.

    So that's EA, Cloud Imperium Games, and Square Enix that will support the API, and it hasn't even been released yet.
  • ahlan - Thursday, November 7, 2013 - link

    And for Gsync you will need a new monitor with Gsync support. I won't buy a new monitor only for that.
  • jnad32 - Thursday, November 7, 2013 - link

    http://ir.amd.com/phoenix.zhtml?c=74093&p=irol...
    BOOM!
  • Creig - Friday, November 8, 2013 - link

    G-Sync will only work on Kepler and newer video cards.

    So if you have an older card, not only do you have to buy an expensive G-Sync capable monitor, you also need a new Kepler-based video card. Even if you already own a Kepler video card, you still have to purchase a new G-Sync monitor, which will cost you $100 more than an identical non-G-Sync monitor.

    Whereas Mantle is a free performance boost for all GCN video cards.

    Summary:
    G-Sync cost - Purchase of a new monitor, +$100 for the G-Sync module.
    Mantle cost - Free performance increase for all GCN-equipped video cards.

    Pretty easy to see which one offers the better value.
  • neils58 - Sunday, November 10, 2013 - link

    As you say Mantle is very exciting, but we don't know how much performance we are talking about yet. My thinking on saying that crossfire was AMD's only answer is that in order to avoid the stuttering effect of dropping below the Vsync rate, you have to ensure that the minimum framerate is much higher, which means adding more cards or turning down quality settings. If Mantle turns out to be a huge performance increase things might work out, but we just don't know.

    Sure, TN isn't ideal, but people with gaming priorities will already be looking for monitors with low input lag, fast refresh rates, and features like backlight strobing for motion blur reduction. G-Sync will basically become a standard feature in a brand's lineup of gaming-oriented monitors. I think it'll come down in price a fair bit too once there are a few competing brands.

    It's all made things tricky for me. I'm currently on a 1920x1200 VA monitor with a 5850 and was considering going up to a 1440p 27" screen (which would have required a new GPU purchase anyway). G-Sync adds enough value to gaming TNs to push me over to them.
  • jcollett - Monday, November 11, 2013 - link

    I've got a large 27" IPS panel, so I understand the concern. However, a good high-refresh panel need not cost very much and can still look great. Check out the ASUS VG248QE; I've been hearing good things about that panel and it is relatively cheap at about $270. I assume it would work with G-Sync, but I haven't confirmed that myself. I'll be looking for reviews of Battlefield 4 using Mantle this December, as that could make up a big part of the decision on whether my next card comes from Team Green or Team Red.
  • misfit410 - Thursday, November 7, 2013 - link

    I don't buy that it's a game changer. I have no intention of replacing my three Dell UltraSharp monitors anytime soon, and even if I did, I have no intention of dealing with buggy DisplayPort as my only option for hooking up a synced monitor.
  • Mr Majestyk - Thursday, November 7, 2013 - link

    +1

    I've got two high-end Dell 27" monitors and it's a joke to think I'd swap them out for garbage TN monitors just to get G-Sync.

    I don't see the 780 Ti as being any skin off AMD's nose. It's much dearer for very small gains, and we haven't seen the custom AMD boards yet. For now I'd probably get the R9 290, assuming custom boards can greatly improve its cooling and thermals.
