Meet The GeForce GTX 780

As we previously mentioned, the GTX 780 is very much a Titan Mini in a number of ways. This goes not only for the architecture, features, and performance, but, as it turns out, for the physical design too. For the reference GTX 780 NVIDIA will be straight-up reusing the GTX Titan’s board design, from the PCB to the cooler and everything in between.

As a result the reference GTX 780 inherits all of the great things about the GTX Titan’s design. We won’t go into significant detail here – please read our GTX Titan review for a full breakdown and analysis of Titan’s design – but in summary this means we’re looking at a very well built blower design constructed almost entirely out of metal. GTX 780 is a 10.5” long card composed of a cast aluminum housing, a nickel-tipped heatsink, an aluminum baseplate, and a vapor chamber providing heat transfer between the GPU and the heatsink. The end result is that the reference GTX 780, like Titan before it, is an extremely quiet card despite being a 250W blower design, while also maintaining the solid feel and eye-catching looks of GTX Titan.

Drilling down, the PCB is also a reuse from Titan. It’s the same GK110 GPU mounted on the same PCB with the same 6+2 phase power design. This is part of the reason the GTX 780 has the same TDP as GTX Titan, and at the same time it gives GTX 780 as much or more TDP headroom than Titan itself. Using the same PCB also means that GTX 780 has the same 6-pin + 8-pin power requirement and the same display I/O configuration of 2x DL-DVI, 1x HDMI, and 1x DisplayPort 1.2.

Also being carried over from Titan is GPU Boost 2.0, which was first introduced there and has since been added to additional products (many GeForce 700M products already have it). GPU Boost is essentially a further min-maxed turbo scheme that more closely takes into account temperature and GPU leakage characteristics to determine which boost bins can be used while staying below TDP. It’s more temperature-dependent than the original GPU Boost and as a result more variable, but in cooler situations it allows tapping into that thermal headroom to hit higher clockspeeds and greater performance, TDP allowing. At the same time this means GTX 780 also gains GPU Boost 2.0’s temperature target functionality, which allows users to cap boosting by temperature as well as TDP. As with Titan this limit is 80C by default, the idea being that adjusting the limit is a proxy for adjusting the performance of the card and the amount of noise it generates.
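Conceptually, the boost logic described above amounts to picking the highest clock bin that keeps the card under both its power limit and its temperature target. Here is a minimal sketch in Python; the clock bins and power figures are entirely hypothetical, and real boost behavior also factors in voltage, leakage, and gradual clock stepping rather than a hard cutoff:

```python
# Illustrative sketch of a GPU Boost 2.0-style selection: choose the highest
# boost bin whose estimated power stays under TDP, provided the GPU is still
# below its temperature target. All numbers below are made up for illustration.

TDP_W = 250          # board power limit (watts)
TEMP_TARGET_C = 80   # default temperature target, per the article

# Hypothetical boost bins: (clock in MHz, estimated power draw in watts).
BOOST_BINS = [
    (863, 195),   # base clock
    (900, 210),
    (937, 225),
    (980, 240),
    (1006, 250),  # top bin, right at the power limit
]

def select_boost_bin(gpu_temp_c, bins=BOOST_BINS,
                     tdp_w=TDP_W, temp_target_c=TEMP_TARGET_C):
    """Return the highest clock allowed by both limits.

    At or above the temperature target, hold the base clock (real hardware
    steps down gradually instead of snapping to base).
    """
    if gpu_temp_c >= temp_target_c:
        return bins[0][0]
    # Below the temperature target, take the fastest bin that fits in TDP.
    eligible = [clock for clock, power in bins if power <= tdp_w]
    return max(eligible) if eligible else bins[0][0]
```

This also illustrates why the temperature target acts as a proxy for performance and noise: raising it lets a warmer card keep selecting higher bins, at the cost of more fan speed to hold the new target.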


155 Comments


  • lukarak - Friday, May 24, 2013 - link

    1/3rd rate FP64 and 1/24th rate FP64 are nowhere near 10-15% apart. Gaming is not everything.
  • chizow - Friday, May 24, 2013 - link

    Yes, fine, cut gaming performance on the 780 and Titan down to 1/24th and see how many of these you sell at $650 and $1000.
  • Hrel - Friday, May 24, 2013 - link

    THANK YOU!!!! WHY this kind of thing isn't IN the review is beyond me. As much good work as Nvidia is doing, their pricing schemes, naming schemes, and general abuse of customers have turned me off of them forever. Which is convenient, because AMD is really getting their shit together quickly.
  • chizow - Saturday, May 25, 2013 - link

    Ryan has danced around this topic in the past, he's a pretty straight shooter overall but it goes without saying why he isn't harping on this in his review. He has to protect his (and AT's) relationship with Nvidia to keep the gravy train flowing. They have gotten in trouble with Nvidia in the past (sometime around the "not covering PhysX enough" fiasco, along with HardOCP) and as a result, their review allocation suffered.

    In the end, while it may be the truth, no one with a vested interest in these products and their future success contributing to their livelihoods wants to hear about it, I guess. It's just us, the consumers that suffer for it, so I do feel it's important to voice my opinion on the matter.
  • Ryan Smith - Sunday, May 26, 2013 - link

    While you are welcome to your opinion and I doubt I'll be able to change it, I would note that I take a dim view towards such unfounded nonsense.

    We have a very clear stance with NVIDIA: we write what we believe. If we like a product we praise it, if we don't like a product we'll say so, and if we see an issue we'll bring it up. We are the press and our role is clear; we are not any company's friend or foe, but a 3rd party who stakes our name and reputation (and livelihood!) on providing unbiased and fair analysis of technologies and products. NVIDIA certainly doesn't get a say in any of this, and the only thing our relationship is built upon is their trusting our methods and conclusions. We certainly don't require NVIDIA's blessing to do what we do, and publishing the truth has and always will come first, vendor relationships be damned. So if I do or do not mention something in an article, it's not about "protecting the gravy train", but about what I, the reviewer, find important and worth mentioning.

    On a side note, I would note that in the 4 years I have had this post, we have never had an issue with review allocation (and I've said some pretty harsh things about NVIDIA products at times). So I'm not sure where you're hearing otherwise.
  • chizow - Monday, May 27, 2013 - link

    Hi Ryan, I respect your take on it, and as I've said already, you generally comment on and understand the impact of pricing and the economy more than most other reviewers, which is a big part of the reason I appreciate AT reviews over others.

    That being said, much of this type of commentary about pricing/economics can be viewed as editorializing, so while I'm not in any way saying companies influence your actual review results and conclusions, the choice NOT to speak about topics that may be considered out of bounds for a review does not fall under the scope of your reputation or independence as a reviewer.

    If we're being honest here, we're all human and business is conducted between humans with varying degrees of interpersonal relationships. While you may consider yourself truthful and forthcoming always, the tendency to bite your tongue when friendships are at stake is only natural and human. Certainly, a "How's your family?" greeting is much warmer than a "Hey what's with all that crap you wrote about our GTX Titan pricing?" when you meet up at the latest trade show or press event. Similarly, it should be no surprise when Anand refers to various moves/hires at these companies as good/close friends, that he is going to protect those friendships where and when he can.

    In any case, the bit I wrote about allocation was about the same time ExtremeTech got in trouble with Nvidia and felt they were blacklisted for not writing enough about PhysX. HardOCP got in similar trouble for blowing off entire portions of Nvidia's press stack and you similarly glossed over a bunch of the stuff Nvidia wanted you to cover. Subsequently, I do recall you did not have product on launch day and maybe later it was clarified there was some shipping mistake. Was a minor release, maybe one of the later Fermi parts. I may be mistaken, but pretty sure that was the case.
  • Razorbak86 - Monday, May 27, 2013 - link

    Sounds like you've got an axe to grind, and a tin-foil hat for armor. ;)
  • ambientblue - Thursday, August 8, 2013 - link

    Well, you failed to note how the GTX 780 is essentially Kepler's version of a GTX 570. It's priced twice as high though. The Titan should have been a GTX 680 last year... it's only a prosumer card because of the price LOL. That's like saying the GTX 480 is a prosumer card!!!
  • cityuser - Thursday, May 23, 2013 - link

    Whatever Nvidia does, it never improves its 2D quality. I mean, look at what nVidia gives you in Blu-ray playback: the colors are still dead and dull, not really enjoyable.
    It's terrible to use nVidia for HD home cinema, whatever settings you try.
    Why can nVidia ignore this? Because it's spoiled.
  • Dribble - Thursday, May 23, 2013 - link

    What are you going on about?

    Blu-ray is digital and HDMI is digital - that means the signal is decoded and sent basically straight to the TV - there is no fiddling with colours, sharpening, or anything else required.
