Meet The GeForce GTX 780, Cont

With all of that said, GTX 780 does make one notable deviation from GTX Titan. NVIDIA has changed their stock fan programming for GTX 780, essentially slowing down the fan response time to even out fluctuations in fan speeds. NVIDIA tells us they've found that, next to outright loud fans, the second most important factor in fan noise becoming noticeable is rapidly changing fan speeds, with the shifting pitch and volume drawing attention to the card. Slowing down the response time should in theory keep the fan speed from spiking as much, or from quickly dropping (e.g. during a loading screen) only to have to immediately jump back up again.
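Purely as an illustration of the idea (none of this reflects NVIDIA's actual fan controller; the function name, smoothing constant, and RPM figures are our own), a minimal sketch of damped fan response using a simple low-pass filter on the target speed:

```python
# Hypothetical illustration of slowed fan response: rather than jumping
# straight to the temperature-derived target speed, step toward it gradually.
def smooth_fan_speed(current_rpm: float, target_rpm: float,
                     smoothing: float = 0.1) -> float:
    """Return the next fan speed, moving only a fraction of the way
    toward the target each update. Lower 'smoothing' = slower response."""
    return current_rpm + smoothing * (target_rpm - current_rpm)

# Example: a brief load spike asks for 2600 RPM, then the target drops back
# to 1500 RPM (e.g. a loading screen). The smoothed speed barely moves,
# so the change in pitch and volume is far less noticeable.
speed = 1500.0
for target in [2600, 2600, 1500, 1500]:
    speed = smooth_fan_speed(speed, target)
    print(round(speed))   # 1610, 1709, 1688, 1669
```

With a small smoothing factor the fan barely reacts to a short-lived change in the target, which is exactly the behavior NVIDIA is describing.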

In our experience fan response times haven't been an issue with Titan or past NVIDIA cards, and we'd be hard pressed to tell the difference between GTX 780 and Titan. With that said, there's nothing to lose from this change; GTX 780 doesn't seem to be any worse for it, so in our eyes there's no reason for NVIDIA not to go ahead with it.

On that note, since this is purely a software (BIOS) change, we asked NVIDIA whether it could be backported to the hardware-equivalent Titan. The answer is fundamentally yes, but because NVIDIA's cards don't have a backup BIOS system, they aren't keen on using BIOS flashing any more than necessary. So an official (or even unofficial) update from NVIDIA is unlikely, though given the user community's adept BIOS modding skills it's always possible a third party could accomplish this on their own.

Moving on, unlike Titan and GTX 690, NVIDIA will be allowing partners to customize GTX 780, making this the first line of GK110 cards to allow customization. Potential buyers who were for whatever reason put off by Titan's blower will find that NVIDIA's partners are already putting together more traditional open air coolers for GTX 780. We can't share any data about them yet – today is all about the reference card – but we already have one such card in hand with EVGA's GeForce GTX 780 ACX.

The reference GTX 780 sets a very high bar in terms of build quality and performance, so it will be interesting to see what NVIDIA's partners can come up with. With NVIDIA testing and approving all designs under their Greenlight program, custom cards have to meet or beat NVIDIA's reference card in factors such as noise and power delivery, which for GTX 780 will not be an easy feat. However, because of this requirement NVIDIA's partners can deviate from the reference design without buyers needing to worry that custom cards will be significantly worse than the reference card. This benefits NVIDIA's partners, who can attest to the quality of their products ("it got through Greenlight"), and it benefits buyers by letting them know they're getting something at least as good as the reference GTX 780, regardless of the specific make or model.

On that note, since we’re talking about card construction, let’s quickly dive into overclocking. Overclocking is essentially unchanged from GTX Titan, especially since everything so far is using the reference PCB. The maximum power target remains at 106% (265W) and the maximum temperature target remains at 95C. Buyers will be able to adjust these as they please through Precision X and other tools, but no more than they already could on Titan, which means overclocking is fairly locked down.
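For reference, the 265W figure is simply 106% of the card's 250W TDP (250W × 1.06 = 265W). Below is a hypothetical sketch of how a tuning utility might enforce these caps; the constant and function names are our own invention for illustration, not Precision X's actual API:

```python
# Hypothetical validation of GTX 780 overclocking limits (values from the
# article: 250W base TDP, 106% maximum power target, 95C maximum temp target).
BASE_TDP_W = 250
MAX_POWER_TARGET_PCT = 106   # 250W * 1.06 = 265W
MAX_TEMP_TARGET_C = 95

def clamp_targets(power_pct: int, temp_c: int) -> tuple[int, int]:
    """Clamp requested targets to the card's firmware-enforced maximums."""
    power_pct = min(power_pct, MAX_POWER_TARGET_PCT)
    temp_c = min(temp_c, MAX_TEMP_TARGET_C)
    return power_pct, temp_c

power, temp = clamp_targets(120, 100)         # asks for more than allowed
print(power, BASE_TDP_W * power / 100, temp)  # 106 265.0 95
```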

Overvolting is also supported in a Titan-like manner, and once again is at the discretion of the card’s partner. By default GTX 780 has a maximum voltage of 1.1625v, with approved overvolting allowing the card to be pushed to 1.2v. This comes in the form of higher boost bins, so enabling overvolting is equivalent to unlocking a +13MHz bin and a +26MHz bin and their requisite voltages. However this also means that overvolting typically has only a minimal effect, as most overclocking attempts are going to hit TDP limits before they ever reach the unlocked boost bins and their higher voltages.

GeForce Clockspeed Bins
Clockspeed   GTX Titan   GTX 780
1032MHz      N/A         1.2v
1019MHz      1.2v        1.175v
1006MHz      1.175v      1.1625v
992MHz       1.1625v     1.15v
979MHz       1.15v       1.137v
966MHz       1.137v      1.125v
953MHz       1.125v      1.112v
940MHz       1.112v      1.1v
927MHz       1.1v        1.087v
914MHz       1.087v      1.075v
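To make the relationship concrete, here is a small sketch that encodes the GTX 780 column of the table above and shows how raising the voltage ceiling from 1.1625v to 1.2v exposes exactly the two extra bins (+13MHz and +26MHz). The bin data comes straight from the table; the helper itself is purely illustrative:

```python
# GTX 780 boost bins from the table above: clockspeed (MHz) -> voltage (v).
# The top two bins (1019MHz and 1032MHz) need more than the default 1.1625v
# ceiling, so they only become reachable once overvolting raises it to 1.2v.
GTX_780_BINS = {
    914: 1.075, 927: 1.087, 940: 1.100, 953: 1.112, 966: 1.125,
    979: 1.137, 992: 1.150, 1006: 1.1625,   # default voltage limit
    1019: 1.175, 1032: 1.200,               # unlocked by overvolting
}
DEFAULT_VMAX = 1.1625
OVERVOLT_VMAX = 1.200

def max_boost_clock(overvolted: bool) -> int:
    """Highest bin whose voltage fits under the active voltage limit."""
    vmax = OVERVOLT_VMAX if overvolted else DEFAULT_VMAX
    return max(clk for clk, v in GTX_780_BINS.items() if v <= vmax)

print(max_boost_clock(False))  # 1006 -> stock ceiling
print(max_boost_clock(True))   # 1032 -> +26MHz over the stock top bin
```

In practice, as noted above, most overclocks will run into the TDP limit well before these final bins come into play.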

 

155 Comments

  • Stuka87 - Thursday, May 23, 2013 - link

    The video card does handle the decoding and rendering for the video. Anand has done several tests over the years comparing their video quality. There are definite differences between AMD/nVidia/Intel.
  • JDG1980 - Thursday, May 23, 2013 - link

    Yes, the signal is digital, but the drivers often have a bunch of post-processing options which can be applied to the video: deinterlacing, noise reduction, edge enhancement, etc.
    AMD and NVIDIA each have some advantages over the other in this area. Either is a decent choice for an HTPC. Of course, no one in their right mind would use a card as power-hungry and expensive as a GTX 780 for an HTPC.

    In the case of interlaced content, either the PC or the display device *has* to apply post-processing or else it will look like crap. The rest of the stuff is, IMO, best left turned off unless you are working with really subpar source material.
  • Dribble - Thursday, May 23, 2013 - link

    To both of you above, on DVD yes, not on bluray - there is no interlacing, noise or edges to reduce - bluray decodes to a perfect 1080p picture which you send straight to the TV.

    All the video card has to do is decode it, which is why a $20 bluray player with a $5 cable will give you exactly the same picture and sound quality as a $1000 bluray player with a $300 cable - as long as the TV can take the 1080p input and the hifi can handle the HD audio signal.
  • JDG1980 - Thursday, May 23, 2013 - link

    You can do any kind of post-processing you want on a signal, whether it comes from DVD, Blu-Ray, or anything else. A Blu-Ray is less likely to get subjective quality improvements from noise reduction, edge enhancement, etc., but you can still apply these processes in the video driver if you want to.

    The video quality of Blu-Ray is very good, but not "perfect". Like all modern video formats, it uses lossy encoding. A maximum bit rate of 40 Mbps makes artifacts far less common than with DVDs, but they can still happen in a fast-motion scene - especially if the encoders were trying to fit a lot of content on a single layer disc.

    Most Blu-Ray content is progressive scan at film rates (1080p23.976) but interlaced 1080i is a legal Blu-Ray resolution. I believe some variants of the "Planet Earth" box set use it. So Blu-Ray playback devices still need to know how to deinterlace (assuming they're not going to delegate that task to the display).
  • Dribble - Thursday, May 23, 2013 - link

    I admit it's possible to post-process, but you wouldn't - a real-time post-process is highly unlikely to add anything good to the picture - fancy bluray players don't post-process, they just pass on the signal. As for 1080i, that's a very unusual case for bluray, but as it's just the standard HD TV resolution, again pass it to the TV - it'll de-interlace it just like it does all the 1080i coming from your cable/satellite box.
  • Galidou - Sunday, May 26, 2013 - link

    ''All the video card has to do is decode it, which is why a $20 bluray player with a $5 cable will give you exactly the same picture and sound quality as a $1000 bluray player with a $300 cable - as long as the TV can take the 1080p input and the hifi can handle the HD audio signal.''

    I'm an audiophile and a professional when it comes to high-end home theater; I've built tons of HT systems around PCs and/or receivers, and I have to admit this is the funniest crap I've had to read. I'd just like to know how many blu-ray players you've personally compared, up to let's say the OPPO BDP-105 (I've dealt with pricier units than this mere $1200, but still awesome, Blu-ray player).

    While I can certainly say that image quality is not affected by much, the audio on the other hand sees DRASTIC improvements. Hardware not having an effect on sound would be like saying there's no difference between a $200 and a $5000 integrated amplifier/receiver - pure nonsense.

    ''the same picture and sound quality''

    The part speaking about sound quality should really be removed from your comment, as it really astounds me to think you can believe what you said is true.
  • eddman - Thursday, May 23, 2013 - link

    http://i.imgur.com/d7oOj7d.jpg
  • EzioAs - Thursday, May 23, 2013 - link

    If I were a Titan owner (and had actually purchased the card, not gotten it through some free giveaway or something), I would regret that purchase very, very badly. $650 is still a very high price for the normal GTX x80 cards, but it makes the Titan basically a product with incredibly bad pricing (not that we didn't already know that). Still, I'm no Titan owner, so what do I know...

    On the other hand, when I look at the graphs, I think the HD7970 looks better than ever despite being 1.5 years old. However, as Ryan pointed out, for previous GTX 500 users who plan on sticking with Nvidia and are considering high-end cards like this, it may not be a bad card at all, since there are situations (most of the time) where the performance improvement is about twice the GTX 580.
  • JeffFlanagan - Thursday, May 23, 2013 - link

    I think $350 is almost pocket change to someone who will drop $1000 on a video card. $1K is way out of line with what high-quality consumer video cards have gone for in recent years, so you have to be someone who spends to say they spent, or someone mining one of the bitcoin alternatives, in which case getting the card months earlier is a big benefit.
  • mlambert890 - Thursday, May 23, 2013 - link

    I have 3 Titans and don't regret them at all. While I wouldn't say $350 is "pocket change" (or in this case $1050, since it's x3), it's also a price I'm willing to pay for twice the VRAM and more perf. With performance at this level, "close" honestly doesn't count if you are looking for the *highest* performance possible. Gaming in 3D surround, even 3x Titan actually *still* isn't fast enough, so no way I'd have been happy with 3x 780s for $1000 less.
