Final Words

Bringing this review to a close, we'll admit that after seeing NVIDIA upstage themselves with GK110 a couple of times this year already, it gets a bit harder each time to write about NVIDIA's flagship GPU. NVIDIA won't break significant new ground just by refreshing GK110, but gradual performance increases in conjunction with periodic price drops have kept the market fresh while making NVIDIA's high-end cards a bit faster and a bit cheaper each time. So in that respect we're enthusiastic about seeing NVIDIA finally release a fully enabled GK110 GeForce card, and about the performance improvements it brings.

With that in mind, the release of the GeForce GTX 780 Ti once more leaves NVIDIA solidly in control of the single-GPU performance crown. It won't quite be able to claim a massive performance advantage over its closest competitors, but at the end of the day it's going to be faster than any other single-GPU card out there. This breaks down to being 11% faster than the Radeon R9 290X, 9% faster than GTX Titan, and a full 20% faster than the original GTX 780 that it formally replaces.
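
For those curious how a single "X% faster" figure like the ones above is typically derived from a full benchmark suite, the short Python sketch below averages per-game performance ratios with a geometric mean. The game names and FPS values are illustrative placeholders, not our benchmark data.

    # Illustrative sketch: collapsing per-game results into one "X% faster"
    # figure using the geometric mean of per-game FPS ratios.
    # The FPS values below are placeholders, not benchmark results.
    from math import prod

    gtx_780_ti = {"Game A": 48.0, "Game B": 62.0, "Game C": 55.0}
    r9_290x    = {"Game A": 44.0, "Game B": 55.0, "Game C": 50.0}

    ratios = [gtx_780_ti[g] / r9_290x[g] for g in gtx_780_ti]
    geomean = prod(ratios) ** (1 / len(ratios))
    print(f"GTX 780 Ti is {100 * (geomean - 1):.0f}% faster on average")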

To that end, while NVIDIA can still charge top dollar for their flagship card, it's a sign of the times and of the competition that they released their fully enabled GK110 part as a cheaper GTX 780 series card. At $700 it's by no means cheap – this has been and always will be the drawback to NVIDIA's flagships so long as NVIDIA can hold the lead – but it also means that NVIDIA does need to take AMD's Radeon R9 290 series into account. As such the 290X and the GTX 780, though lesser-performing parts, will remain spoilers for the GTX 780 Ti due to their better balance of performance and pricing. All the while GTX 780 Ti stands at the top of the heap for those who want the best.

Meanwhile we bid au revoir to the original GK110 GeForce card, GTX Titan. Though GTX Titan will remain on the market as an entry-level compute card, it has finally been dethroned as the fastest single-GPU gaming card in NVIDIA's lineup. At least for the time being GTX Titan's place in the market as a compute card is still very secure, and so there it will continue, a position that reflects the fact that there's little need for NVIDIA to keep their gaming and compute products commingled in a single product. Indeed, we wouldn't be the least bit surprised if NVIDIA made additional prosumer products of this nature in the future, as GTX Titan clearly worked out well for the company.

And though GTX Titan is falling off of our radar, we're glad to see that NVIDIA has kept around Titan's second most endearing design element, the Titan cooler. We won't hazard a guess as to just how much it costs NVIDIA over a cheaper design (or what it adds to the final price tag), but with GTX 780 Ti NVIDIA has once again proven just how capable the cooler is when paired with GK110. Even with the slightly higher power consumption of GTX 780 Ti versus the cards that came before it, thanks to that cooler the GTX 780 Ti still hits an excellent sweet spot between performance and noise, offering the flexibility and simplicity of a blower without the noise that has traditionally accompanied such coolers, all the while delivering more than enough performance to hold on to the performance crown.

Finally, let's talk about SLI for a moment. Much like GTX Titan before it, GTX 780 Ti is so fast that it's almost more than enough on its own at any standard single-monitor resolution. Even 2560x1440 with high settings isn't enough to bog down GTX 780 Ti in most games, which makes a pair of GTX 780 Tis in SLI overkill by any definition. Properly using that much power requires multiple monitors, be it an Eyefinity/Surround setup or, more recently, a tiled 4K monitor.

In either scenario a GTX 780 Ti is going to be a solid performer, but NVIDIA will have to deal with the fact that their performance advantage melts away as the resolution increases. Right now a single GTX 780 Ti has a solid lead over a single 290X, but a pair of GTX 780 Tis is going to tie with a pair of cheaper 290Xs at 4K resolutions. And with the 290X's frame pacing under control, NVIDIA no longer has that advantage to help build their case. GTX 780 Ti still has other advantages – power and noise in particular – but it does mean we're in an interesting situation where NVIDIA can claim the single-GPU performance crown while the dual-GPU crown remains up for grabs. It's still very early in the game for 4K and NVIDIA isn't under any great pressure, but it will be an area of improvement for the next generation when Maxwell arrives in 2014.
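
Since frame pacing comes up here, it's worth noting that it's straightforward to quantify from a frame-time capture. Below is a minimal Python sketch, assuming a hypothetical log file with one frame time in milliseconds per line (real capture tools like FRAPS and FCAT use their own formats); lower frame-to-frame deltas mean smoother pacing.

    # Minimal frame-pacing sketch: average frame time, 99th percentile,
    # and mean frame-to-frame delta from a plain-text frame-time log.
    import statistics

    def pacing_stats(path: str) -> None:
        with open(path) as f:
            times = [float(line) for line in f if line.strip()]
        # Variation between consecutive frames is what reads as stutter.
        deltas = [abs(b - a) for a, b in zip(times, times[1:])]
        p99 = sorted(times)[int(0.99 * (len(times) - 1))]
        print(f"avg frame time : {statistics.mean(times):6.2f} ms")
        print(f"99th percentile: {p99:6.2f} ms")
        print(f"mean f2f delta : {statistics.mean(deltas):6.2f} ms")

    pacing_stats("frametimes.txt")  # hypothetical log file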

302 Comments

  • yuko - Monday, November 11, 2013

    For me neither of them is a game changer... G-Sync, Shield... nice stuff I don't need.
    Mantle: another nice approach to creating a semi-closed standard. It's not as if DirectX and OpenGL don't already exist and work quite well; no, we need another low-level standard where AMD creates the API (and to be honest, they would be quite stupid not to optimize it for their own hardware).

    I can only hope that Mantle flops; it does no favors to customers or the industry. It's good for marketing but has no real-world use.
  • Kamus - Thursday, November 7, 2013

    Nope, it's confirmed for every Frostbite 3 game coming out; that's at least a dozen so far. Not to mention it's also officially coming to Star Citizen, which runs on CryEngine 3, I believe.
    But yes, even with those titles it's still a huge difference, obviously.

    That said, you can expect that any engine optimized for GCN on the consoles could wind up with Mantle support, since the hard work is already done. And in the case of Star Citizen... well, that's a PC exclusive, and it's still getting Mantle.
  • StevoLincolnite - Thursday, November 7, 2013

    Mantle is confirmed for all Frostbite-powered games.
    That is: Battlefield 4, Dragon Age 3, Mirror's Edge 2, Need for Speed, Mass Effect, Star Wars Battlefront, Plants vs. Zombies: Garden Warfare, and probably others that haven't been announced by EA yet.
    Star Citizen and Thief will also support Mantle.

    So that's EA, Cloud Imperium Games, and Square Enix all supporting the API, and it hasn't even been released yet.
  • ahlan - Thursday, November 7, 2013

    And for G-Sync you will need a new monitor with G-Sync support. I won't buy a new monitor just for that.
  • jnad32 - Thursday, November 7, 2013

    http://ir.amd.com/phoenix.zhtml?c=74093&p=irol...
    BOOM!
  • Creig - Friday, November 8, 2013

    G-Sync will only work on Kepler and above video cards.

    So if you have an older card, not only do you have to buy an expensive G-Sync capable monitor, you also need a new Kepler-based video card. Even if you already own a Kepler video card, you still have to purchase a new G-Sync monitor, which will cost you $100 more than an identical non-G-Sync monitor.

    Whereas Mantle is a free performance boost for all GCN video cards.

    Summary:
    G-Sync cost - purchase a new computer monitor, +$100 for the G-Sync module.
    Mantle cost - free performance increase for all GCN-equipped video cards.

    Pretty easy to see which one offers the better value.
  • neils58 - Sunday, November 10, 2013

    As you say, Mantle is very exciting, but we don't know how much performance we're talking about yet. My thinking in saying that Crossfire was AMD's only answer is that in order to avoid the stuttering effect of dropping below the Vsync rate, you have to ensure that the minimum framerate is much higher, which means adding more cards or turning down quality settings. If Mantle turns out to be a huge performance increase things might work out, but we just don't know.

    Sure, TN isn't ideal, but people with gaming priorities will already be looking for monitors with low input lag, fast refresh rates, and features like backlight strobing for motion blur reduction. G-Sync will basically become a standard feature in a brand's lineup of gaming-oriented monitors. I think it'll come down in price a fair bit too once there are a few competing brands.

    It's all made things tricky for me. I'm currently running a 1920x1200 VA monitor on a 5850 and was considering going up to a 1440p 27" screen (which would have required a new GPU purchase anyway); G-Sync adds enough value to gaming TNs to push me over to them.
  • jcollett - Monday, November 11, 2013

    I've got a large 27" IPS panel, so I understand the concern. However, a good high-refresh panel need not cost very much and can still look great. Check out the ASUS VG248QE; I've been hearing good things about that panel, and it's relatively cheap at about $270. I assume it would work with G-Sync, but I haven't confirmed that myself. I'll be looking for reviews of Battlefield 4 using Mantle this December, as that could make up a big part of the decision on whether my next card comes from Team Green or Team Red.
  • misfit410 - Thursday, November 7, 2013

    I don't buy that it's a game changer. I have no intention of replacing my three Dell UltraSharp monitors anytime soon, and even if I did, I have no intention of dealing with buggy DisplayPort as my only option for hooking up a synced monitor.
  • Mr Majestyk - Thursday, November 7, 2013

    +1

    I've got two high-end Dell 27" monitors and it's a joke to think I'd swap them out for garbage TN monitors just to get G-Sync.

    I don't see the 780 Ti as being any skin off AMD's nose. It's much dearer for very small gains, and we haven't seen the custom AMD boards yet. For now I'd probably get the R9 290, assuming custom boards can greatly improve its cooling and thermals.
