Meet The GeForce GTX 780 Ti

When it comes to the physical design and functionality of the GTX 780 Ti, to no surprise NVIDIA is sticking with what works. The design of the GTX Titan and its associated cooler have proven themselves twice over now between the GTX Titan and the GTX 780, so with only the slightest of changes this is what NVIDIA is going with for GTX 780 Ti, too. Consequently there’s very little new material to cover here, but we’ll quickly hit the high points before recapping the general design of what has now become the GTX 780 series.

The biggest change here is that GTX 780 Ti is the first NVIDIA launch product to feature the new B1 revision of their GK110 GPU. B1 has already been shipping for a couple of months now, so GTX 780 Ti isn’t the first card to get this new GPU. However while GTX Titan and GTX 780 products currently contain a mix of the old and new revisions as NVIDIA completes the change-over, GTX 780 Ti will be B1 (and only B1) right out the door.

As for what’s new for B1, NVIDIA tells us that it’s a fairly tame revision of GK110. NVIDIA hasn’t made any significant changes to the GPU; rather they’ve gone in and fixed some errata that were present in the earlier revision, and in the process tightened up the design to reduce leakage a bit and nudge power usage down, the latter of which helps counter the greater power draw from lighting up the 15th and final SMX. Otherwise B1 doesn’t have any feature changes nor any significant change in power characteristics relative to the previous revision, so as far as GTX 780 Ti is concerned it should amount to little more than a footnote.

The other notable change coming with GTX 780 Ti is that NVIDIA has slightly adjusted the default temperature throttle point, increasing it from 80C to 83C. The difference this makes to cooling efficiency itself will be trivial, but since NVIDIA is using the exact same fan curve on the GTX 780 Ti as they did on the GTX 780, the higher temperature throttle effectively raises the card’s equilibrium point, and therefore the average fan speed under load. Or put another way, by letting the card get a bit warmer the GTX 780 Ti will ramp up its fan a bit more and throttle a bit less, which should help offset the card’s increased power consumption while also keeping thermal throttling to a minimum.

GeForce GTX 780 Series Temperature Targets
  GTX 780 Ti: 83C
  GTX 780:    80C
  GTX Titan:  80C
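
Since the boost mechanism holds the card at its temperature target under a sustained load, the steady-state fan speed with an unchanged fan curve is simply that curve evaluated at the target temperature. Below is a minimal sketch of that relationship in Python; the fan curve points are entirely made up for illustration, as the actual curve isn’t published here.

```python
# Hypothetical illustration: with a fixed fan curve, raising the temperature
# target from 80C to 83C raises the fan speed the card settles at under load.
# The curve points below are illustrative placeholders, not NVIDIA's values.

def fan_speed_pct(temp_c):
    """Fan speed (% of max) as a function of GPU temperature, per a fixed curve."""
    curve = [(30, 30), (60, 40), (80, 52), (83, 56), (95, 85)]  # (temp C, fan %)
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # Linear interpolation between adjacent curve points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

def equilibrium_fan_speed(temp_target_c):
    """Steady-state fan speed once the card is pinned at its temperature target."""
    return fan_speed_pct(temp_target_c)

print(f"GTX 780    (80C target): ~{equilibrium_fan_speed(80):.0f}% fan")
print(f"GTX 780 Ti (83C target): ~{equilibrium_fan_speed(83):.0f}% fan")
```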

Moving on, since the design of the GTX 780 Ti is a near carbon copy of GTX 780, we’re essentially looking at GTX 780 with better specs and new trimmings. NVIDIA’s very effective (and still quite unique) metallic GTX Titan cooler is back, this time featuring black lettering and a black tinted window. As such GTX 780 Ti remains a 10.5” long card composed of a cast aluminum housing, a nickel-tipped heatsink, an aluminum baseplate, and a vapor chamber providing heat transfer between the GPU and the heatsink. The end result is that the GTX 780 Ti is a quiet card despite the fact that it’s a 250W blower design, while still maintaining the solid feel and eye-catching design that NVIDIA has opted for with this generation of cards.

Drilling down, the PCB is also a reuse from GTX 780. It’s the same GK110 GPU mounted on the same PCB with the same 6+2 phase power design. This is despite the fact that GTX 780 Ti features faster 7GHz memory, indicating that NVIDIA was able to hit their higher memory speed targets without making any obvious changes to the PCB or memory trace layouts. Meanwhile the reuse of the power delivery subsystem reflects the fact that GTX 780 Ti has the same 250W TDP limit as GTX 780 and GTX Titan, though of the three cards GTX 780 Ti will have the least headroom to spare and will come closest to hitting that limit, due to the general uptick in power requirements from having all 15 SMXes active. Finally, using the same PCB also means that GTX 780 Ti has the same 6pin + 8pin power requirement and the same display I/O configuration of 2x DL-DVI, 1x HDMI, and 1x DisplayPort 1.2.
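
As a quick point of reference for what the 7GHz memory buys, here’s a back-of-the-envelope bandwidth calculation. It assumes GK110’s 384-bit memory bus, which isn’t restated in this section, so treat the bus width as an assumption.

```python
# Peak theoretical GDDR5 bandwidth: effective data rate (Gbps per pin) * bus width / 8.
# The 384-bit bus width is assumed here rather than taken from this section.

def gddr5_bandwidth_gb_s(effective_data_rate_ghz, bus_width_bits=384):
    """Peak theoretical memory bandwidth in GB/s."""
    return effective_data_rate_ghz * bus_width_bits / 8

print(f"7GHz memory (GTX 780 Ti): {gddr5_bandwidth_gb_s(7.0):.0f} GB/s")  # ~336 GB/s
print(f"6GHz memory (GTX 780):    {gddr5_bandwidth_gb_s(6.0):.0f} GB/s")  # ~288 GB/s
```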

On a final note, NVIDIA won’t be allowing custom cards right off the bat – everything launching today will be a reference card – but with NVIDIA’s partners having already put together their custom GK110 designs for GTX 780, custom designs for GTX 780 Ti should come very quickly. Consequently, expect most (if not all) of them to be variants of their existing custom GTX 780 designs.

Comments

  • NewCardNeeded - Thursday, November 7, 2013 - link

    Even the R9 280X crossfire beats the GTX 780 Ti in SLI in several cases !!!!!!!! Note that I do mean the 280, it's not a typo. The 290X Crossfire *SLAUGHTERS* the 780 Ti in SLI AND it's a fraction of the price.
  • austinsguitar - Thursday, November 7, 2013 - link

    okay okay...lets tell this guy about what happens after a new nvidia graphics card comes out shall we...first 2 weeks a card comes out (ALWAYS UN OPTIMIZED FOR SLI) 2 weeks later (ABSOLUTE SCALING WITH THE NEXT AVAILABLE DRIVER) happens every time dude. That little guy will be better in two weeks, just trust me
  • NewCardNeeded - Thursday, November 7, 2013 - link

    I'm not so sure this time. Nvidia have held back the 780 Ti for months until AMD released their new cards. They've had plenty of time to optimize for SLI. Expect small gains yes, but nothing more.

    Let's see what happens when mantle comes out...
  • austinsguitar - Thursday, November 7, 2013 - link

    Temperature, power (wattage), noise....This beats the 290x bad....
    Think about this....95 degrees and the ungodly noise coming from the 290x is ABSOLUTELY "UN"ACCEPTABLE... The card is cheap yes, but after 2 years of game playing your energy bill will determine that factor. I do realize that amd's drivers are getting better, but come on people....mantle?
  • NewCardNeeded - Thursday, November 7, 2013 - link

    Have you never heard of a "third party cooler"?

    Coming this way soon !
  • NewCardNeeded - Thursday, November 7, 2013 - link

    Read the article again without your green tinted glasses on!

    Full load on Crysis 3:

    Power (W):
    780 Ti = 372
    290X = 375

    Does it really beat the 290X bad?
  • austinsguitar - Thursday, November 7, 2013 - link

    oh im sorry i was speaking about crossfire and sli configurations when taking into account the power draw... and everyone knows that when nvidia runs its games at 60 fps it clocks things lower, and the power draw is very impressive... odds are these cards will never see below 60 for a while now, and nvidia's power draw at medium loads is phenomenal.
  • Kutark - Thursday, November 7, 2013 - link

    Jesus Christ, i've been reading the comments. The AMD fanbois are starting to get worse than Biodrones were during the release of TORtanic. Granted there is some definite Nvidia fanboism going on but the reality is this. Nvidia is on a 9 month old architecture and is able to put out a card that beats AMD's brand new architecture's top dawg by roughly 10%, running at a significantly lower temperature, and significantly quieter.

    Does that justify a $200 price gap? Well thats up to the consumer to decide. But to try to suggest this is a "Tie" or anything other than Nvidia reclaiming the fastest single card crown is just being ridiculous.

    I just find it hilarious some of these AMD people sitting here spouting off these very specific scenarios where the AMD card comes out on top and acts like that means anything. Ok, so crossfire 290x (only a thousand dollars!) beats sli 780ti's in several cases, whoopety do. This will affect all of the 1/10th of 1% of people who will pay that kind of money for a graphical solution on their gaming rigs.

    The other thing is, Nvidia's architecture is 9 months old for shits sake. You dont think they will have something else out in a few months thats going just crap all over AMD's new
  • Kutark - Thursday, November 7, 2013 - link

    Gah, stupid IE9 (im on a computer i can't install a good browser on). Anyways, i was just saying, Nvidia will likely release something early to mid 2014 which will probably blow any current gen cards out of the water and then where is AMD? Same spot they've been in?

    I'm glad AMD released the 290x, it is overall a HUGE step forward for them and im glad nvidia has some real competition. That is only a good thing for the consumer. But overblowing this 290x as something it isnt is not doing any favors. We need to stop blowing smoke up AMD's ass so they actually keep pushing themselves and come out with a proper Nvidia smasher, and then nvidia will be in a position that they cant keep charging 400-800 dollars for cards that should be 250-400 dollars.
  • UpSpin - Friday, November 8, 2013 - link

    I don't get your comment and I'm no GPU fanboy at all (even though I only bought NVidia GPUs in the past), because I barely play high end games and find such high prices (both AMD and NVidia) for a GPU ridiculous. But I'm interested in tech and am considering buying a mid-tier GPU because my Nvidia GTX 560 Ti is starting to act strange.

    What matters is what NVIDIA or AMD sells now and what it costs now. It's a fact that the 290X beats the 780 Ti in half of the benchmarks; the 780 Ti wins the other half. It's a fact that the power draw between both single cards is identical. And it's a fact that the newly released 780 Ti is $150 more expensive than the newly released 290X.

    Of course the 290X is too loud, but that's not an issue of the GPU itself (same power draw), more of the cooler, which should be fixed with a third party cooler implemented by ASUS, ... The NVidia reference coolers were always superb (that's why I own a reference EVGA 560 Ti, because it was really silent compared to the similarly priced alternatives).

    We live here and now and we can only buy the current stuff. So I don't care if Nvidia might release in the near or far future an even better card (at the time AMD might release a new card, too). And if you want to buy a GPU now, the Nvidia is, regarding the price, a complete rip off compared to the AMD.

    As an excuse for the 'poor' performance of the Nvidia card you said it's 9-month-old technology. So let me get this straight:
    NVidia sells you 9-month-old technology for $150 more than AMD asks for the latest bleeding edge technology they can offer? And you defend this? Are you serious? Nvidia sold the Titan for even more during the last months. So be damn happy that AMD released such a great card at such a low price point, or else you would get ripped off by NVidia in the following months, too.
