The NVIDIA GeForce GTX 780 Ti Review
by Ryan Smith on November 7, 2013 9:01 AM EST

Final Words
Bringing this review to a close, after having seen NVIDIA upstage themselves a couple of times this year already with GK110, it’s admittedly getting a bit harder each time to write about NVIDIA’s flagship GPU. NVIDIA won’t break significant new ground just by refreshing GK110, but gradual performance increases in conjunction with periodic price drops have kept the market fresh while making NVIDIA’s high-end cards a bit faster and a bit cheaper each time. So in that respect we’re enthusiastic about seeing NVIDIA finally release a fully enabled GK110 GeForce card and the performance improvements it brings.
With the release of the GeForce GTX 780 Ti, NVIDIA is once more left solidly in control of the single-GPU performance crown. It won’t quite get to claim a massive performance advantage over its closest competitors, but at the end of the day it’s going to be faster than any other single-GPU card out there. This breaks down to being 11% faster than the Radeon R9 290X, 9% faster than GTX Titan, and a full 20% faster than the original GTX 780 that it formally replaces.
To that end, while NVIDIA can still charge top dollar for their flagship card, it’s a sign of the times and of the competition that they released their fully enabled GK110 part as a cheaper GTX 780 series card. At $700 it’s by no means cheap – and this has been and always will be the drawback to NVIDIA’s flagships so long as NVIDIA can hold the lead – but it also means that NVIDIA does need to take AMD’s Radeon R9 290 series into account. As such the 290X and the GTX 780, though lesser performing parts, will remain as spoilers for the GTX 780 Ti due to their better balance of performance and pricing. All the while the GTX 780 Ti stands at the top of the heap for those who want the best.
Meanwhile we bid au revoir to the original GK110 GeForce card, GTX Titan. Though GTX Titan will still be on the market as an entry level compute card, it is finally dethroned as the fastest single-GPU gaming card in NVIDIA’s lineup. At least for the time being GTX Titan is still very secure in its place in the market as a compute card, and so there it will continue, a position that reflects the fact that there’s little need for NVIDIA to keep their gaming and compute products commingled as a single product. Though we wouldn’t be the least bit surprised if NVIDIA made additional prosumer products of this nature in the future, as GTX Titan clearly worked out well for the company.
And though GTX Titan is falling off of our radar, we’re glad to see that NVIDIA has kept around Titan’s second most endearing design element, the Titan cooler. We won’t hazard a guess as to just how much it costs NVIDIA over a cheaper design (or what it adds to the final price tag), but with the GTX 780 Ti NVIDIA has once again proven just how capable the cooler is when paired with GK110. Even with the slightly higher power consumption of the GTX 780 Ti versus the cards that have come before it, thanks to that cooler the GTX 780 Ti still hits an excellent sweet spot between performance and noise, offering the flexibility and simplicity of a blower without the noise that has traditionally accompanied such a cooler. And all the while it still delivers more than enough performance to hold on to the performance crown.
Finally, let’s talk about SLI for a moment. Much like GTX Titan before it, GTX 780 Ti is so fast that it’s almost more than enough on its own for any standard single-monitor resolution. Even 2560x1440 with high settings isn’t enough to bog down GTX 780 Ti in most games, which makes a pair of GTX 780 Tis in SLI overkill by any definition. Properly using that much power requires multiple monitors, be it an Eyefinity/Surround setup, or more recently a tiled 4K monitor.
In either scenario a GTX 780 Ti is going to be a solid performer for those segments, but NVIDIA is going to have to deal with the fact that their performance advantage is going to melt away with the resolution increase. Right now a single GTX 780 Ti has a solid lead over a single 290X, but a pair of GTX 780 Tis is going to tie with a pair of cheaper 290Xs at 4K resolutions. And with 290X’s frame pacing under control NVIDIA no longer has that advantage to help build their case. GTX 780 Ti still has other advantages – power and noise in particular – but it does mean we’re in an interesting situation where NVIDIA can claim the single-GPU performance crown while the crown for the dual-GPU victor remains up for grabs. It's still very early in the game for 4K and NVIDIA isn't under any great pressure, but it will be an area of improvement for the next generation when Maxwell arrives in 2014.
Wreckage - Thursday, November 7, 2013 - link
The 290X = Bulldozer. Hot, loud, power hungry and unable to compete with an older architecture. Kepler is still king even after being out for over a year.
trolledboat - Thursday, November 7, 2013 - link
Hey look, it's a comment from a user permanently banned from this website for trolling, posted before anyone could have even read the first page.

Back in reality, very nice card, but sorely overpriced for such a meagre gain over the 780. It is also slower than the cheaper 290X in some cases.
Nvidia needs more price cuts right now. The 780 and 780 Ti are both badly overpriced in the face of the 290 and 290X.
neils58 - Thursday, November 7, 2013 - link
I think Nvidia probably have the right strategy. G-Sync is around the corner and it's a game changer that justifies the premium for their brand - AMD's only answer to it at this time is going CrossFire to try and ensure >60FPS at all times for V-Sync. Nvidia are basically offering a single-card solution that, even with the brand premium and G-Sync monitors, comes out less expensive than CrossFire. The 780 Ti for 1440p gamers, the 780 for 1080p gamers.

Kamus - Thursday, November 7, 2013 - link
I agree that G-Sync is a game changer, but just what do you mean AMD's only answer is CrossFire? Mantle is right up there with G-Sync in terms of importance. And from the looks of it, a good deal of AAA developers will be supporting Mantle.

As a user, it kind of sucks, because I'd love to take advantage of both.
That said, we still don't know just how much performance we'll get by using mantle, and it's only limited to games that support it, as opposed to G-Sync, which will work with every game right out of the box.
But on the flip side, you need a new monitor for G-Sync, and at least at first, we know it will only be implemented on 120Hz TN panels. And not everybody is willing to trade their beautiful looking IPS monitor for a TN monitor, especially since they will retail at $400+ for 23" 1080p.
Wreckage - Thursday, November 7, 2013 - link
G-Sync will work with every game, past and present. So far Mantle is only confirmed for one game. That's a huge difference.

Basstrip - Thursday, November 7, 2013 - link
TL;DR: When considering G-Sync as a competitive advantage, add the cost of a new monitor. When considering Mantle support, think multiplatform and think next-gen consoles having AMD GPUs. Another plus side for Nvidia is ShadowPlay and SHIELD though (but again, added costs if you consider SHIELD).

G-Sync is not such a game changer, as you have yet to see both a monitor with G-Sync AND its pricing. The fact that I would have to upgrade my monitor and that the G-Sync branding will add another few $$$ to the price tag is something you guys have to consider.
So to consider G-Sync as a competitive advantage when considering a card, add the cost of a monitor to that. Perfect for those who are going to upgrade soon, but for those who won't, G-Sync is moot.
Mantle on its plus side will be used on consoles and PC (as both the PS4 and Xbox One have AMD processors, game developers will most probably be using it). You might not care about consoles but they are part of the gaming ecosystem and sadly, we PC users tend to get shafted by developers because of consoles. I remember Frankieonpc mentioning he used to play tons of COD back in the COD4 days and said that development tends to have shifted towards consoles, so the tuning was a bit more off for PC (paraphrasing slightly).
I'm in the market for both a new monitor and maybe a new card so I'm a bit on the fence...
Wreckage - Thursday, November 7, 2013 - link
Mantle will not be used on consoles. AMD already confirmed this.

althaz - Thursday, November 7, 2013 - link
Mantle is not used on consoles...because the consoles already have something very similar.

Kamus - Thursday, November 7, 2013 - link
You are right, consoles use their own API for GCN. Guess what Mantle is used for? *Spoiler alert*: GCN.
EJS1980 - Thursday, November 7, 2013 - link
Mantle is irrefutably NOT coming to consoles, so do your due diligence before trying to make a point. :)