Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

GeForce GTX 780 Series Voltages
GTX 780 Ti Boost Voltage: 1.187v
GTX 780 Boost Voltage: 1.1625v
GTX 780 Ti Base Voltage: 1.012v

Taking a quick look at voltages, we find that our GTX 780 Ti operates at a slightly higher voltage at its maximum boost bin than the original GTX 780 did. The difference is minor, but the additional voltage may be necessary to hit the slightly higher clockspeeds GTX 780 Ti operates at relative to GTX Titan and GTX 780.

GeForce GTX 780 Ti Average Clockspeeds
Max Boost Clock - 1020MHz
Metro: LL - 1000MHz
CoH2 - 997MHz
Bioshock - 954MHz
Battlefield 3 - 980MHz
Crysis 3 - 980MHz
Crysis: Warhead - 1000MHz
TW: Rome 2 - 950MHz
Hitman - 993MHz
GRID 2 - 967MHz
Furmark - 823MHz

Moving on to clockspeeds, we find that the GTX 780 Ti does very well when it comes to boosting. With a maximum boost clock of 1020MHz, we have 2 benchmarks averaging 1000MHz, and another 4 averaging 980MHz or better.
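For reference, the table's figures can also be restated as a share of the maximum boost bin. A quick sketch, using only the numbers from the table above (nothing measured beyond them):

```python
# Express the review's average clockspeeds as a percentage of the
# 1020MHz maximum boost bin. Figures are taken from the table above.
MAX_BOOST_MHZ = 1020

avg_clocks_mhz = {
    "Metro: LL": 1000, "CoH2": 997, "Bioshock": 954,
    "Battlefield 3": 980, "Crysis 3": 980, "Crysis: Warhead": 1000,
    "TW: Rome 2": 950, "Hitman": 993, "GRID 2": 967, "Furmark": 823,
}

for game, mhz in avg_clocks_mhz.items():
    print(f"{game}: {mhz}MHz ({mhz / MAX_BOOST_MHZ:.1%} of max boost)")
```

Even the worst gaming result (TW: Rome 2) sits above 93% of the maximum boost bin; only the FurMark power virus pulls the card well below that.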

With all of our GK110 cards sharing a common design, at idle there’s very little to differentiate them. Other than GTX Titan’s extra 3GB of VRAM, we’re essentially looking at identical cards when idling.

Moving on to load power, we can see the power/heat ramifications of the slight clockspeed increase coupled with the activation of the 15th SMX. Even with the further optimizations NVIDIA has put into the new revision of GK110, power consumption has gone up in accordance with the higher performance of the card, just as we’d expect. Since NVIDIA doesn’t notably alter their power efficiency here, that increased performance has to come at the cost of increased power consumption. Though in this benchmark it’s worth pointing out that we’re measuring from the wall and that GTX 780 Ti outperforms GTX Titan by 8%, so some of that 29W power difference will come from the higher CPU load caused by the increased framerates.

As for the GTX 780 Ti SLI, power levels off at 556W, 20W more than the GTX 780 SLI. Some (if not most) of that is going to be explained by the increased CPU power consumption from the GTX 780 Ti SLI’s higher framerates. Coupled with that is the fact that in SLI setups these cards get hotter, and hence have to downclock a bit more to maintain equilibrium, which helps to offset the increased power requirements of GTX 780 Ti and keep the SLI results so close to the GTX 780 SLI results.

Switching over to FurMark, we find that power consumption is also up, but only slightly. With GPU Boost 2.0 clamping down on power consumption, all of our GK110 cards should be held to 250W here, and with a difference of under 10W between the GTX 780 and GTX 780 Ti, that appears to be exactly what is happening.
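The clamping behavior described here can be sketched in miniature. This is an illustrative model only, not NVIDIA's actual GPU Boost 2.0 algorithm: the function name, the 13MHz bin step, and the simple control loop are assumptions, with only the 250W limit taken from the review.

```python
# Illustrative sketch of a power-limit governor in the spirit of the
# GPU Boost 2.0 behavior described above. Not NVIDIA's real algorithm;
# only the 250W FurMark clamp comes from the review.
POWER_LIMIT_W = 250   # GK110 boards' power limit under FurMark
BIN_STEP_MHZ = 13     # assumed size of one boost bin

def next_clock(current_mhz, measured_power_w):
    """Step down one bin while over the limit, back up when under it."""
    if measured_power_w > POWER_LIMIT_W:
        return current_mhz - BIN_STEP_MHZ
    return current_mhz + BIN_STEP_MHZ
```

Under a power virus like FurMark the loop settles at whatever clock keeps the board right at its limit, which is why all the GK110 cards converge on nearly identical FurMark power numbers despite their different gaming clocks.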

On a side note, it’s interesting that under FurMark the GTX 780 Ti draws more power than the Radeon R9 290X. Despite the 290X’s higher rated TDP, in its default quiet mode that card can’t actually dissipate as much heat (and thereby consume as much power) as the GTX 780 Ti can.
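As a back-of-the-envelope illustration of the wall-measurement caveat: assuming a PSU efficiency figure (an assumption on our part, not something the review measured), the DC-side share of the 29W gaming power gap between GTX 780 Ti and GTX Titan can be estimated as:

```python
# Why wall measurements overstate component power draw: the ~29W gap
# between GTX 780 Ti and GTX Titan includes PSU conversion losses as
# well as the extra CPU load from higher framerates.
PSU_EFFICIENCY = 0.88   # assumption: typical 80 Plus Gold PSU under load

wall_delta_w = 29                            # measured at the wall
dc_delta_w = wall_delta_w * PSU_EFFICIENCY   # power drawn inside the PC
print(f"~{dc_delta_w:.0f}W of the 29W gap is DC-side (GPU + extra CPU load)")
```

Even the DC-side figure still lumps the GPU together with the CPU, so the GPU-only increase is smaller again.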

For idle temperatures we’re once again looking at cards that are for all intents and purposes identical. At 30C the GTX 780 Ti easily stays nice and cool.

As we mentioned in our look at the GTX 780 Ti hardware, NVIDIA has increased their default temperature throttle point from 80C on the GTX Titan/780 to 83C on the GTX 780 Ti. The end result is that in all of our temperature limited tests the GTX 780 Ti will peak at 83C-84C, whereas the older GK110 cards will peak at 80C-81C.

FurMark reiterates what we saw with Crysis 3. The temps are up a bit across the board, while the GK110 cards are holding near their throttle points. The SLI setups meanwhile approach the upper-80s at 88C, reflecting the fact that even with blowers, there’s some impact on neighboring cards in high load situations.

In our last idle scenario we once again see all of our GK110 cards performing similarly, with idle noise levels in the 38dB-39dB range.

Moving on to our gaming load noise results, we can see the full repercussions of the GTX 780 Ti’s higher average power consumption coupled with the card’s higher temperature throttle point. Moving the throttle point along the same fan curve raises the equilibrium point, and with it the card’s operating noise levels. As the fastest single-GPU card on this chart, the GTX 780 Ti is still doing very well for itself and for a blower based design at 51.7dB, though at 1.5dB louder than GTX Titan and 4.2dB louder than GTX 780, the noise tradeoff for the card’s higher performance is very clear. Meanwhile the fact that it’s tied with the GTX 780 SLI comes with its own bit of irony.
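For readers wondering what those dB(A) deltas mean physically, sound power scales as 10^(ΔdB/10). A small sketch using the deltas above (the formula is standard acoustics, not something specific to this review; perceived loudness tracks these ratios only roughly):

```python
# Convert measured dB(A) deltas into sound-power ratios.
# ratio = 10^(delta_dB / 10); +3dB is roughly double the sound power.
def power_ratio(delta_db):
    return 10 ** (delta_db / 10)

# Deltas from the gaming noise results above
print(f"780 Ti vs Titan (+1.5dB): {power_ratio(1.5):.2f}x sound power")
print(f"780 Ti vs 780 (+4.2dB): {power_ratio(4.2):.2f}x sound power")
```

So the 4.2dB gap over the GTX 780 corresponds to more than 2.5x the radiated sound power, which is why the difference is clearly audible despite looking small on paper.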

Speaking of the GTX 780 SLI, we can see the noise impact of SLI configurations too. The GTX 780 Ti SLI levels out at 53.7dB, 2dB louder than our single-card configuration and 2dB louder than the GTX 780 SLI. At this point it’s just a bit louder than the 290X and quieter than a number of other 290 series setups.

Finally, with load noise levels under FurMark we can see where our various cards peak. The GTX 780 Ti creeps up to 52.3dB, essentially tying the GTX 780 and GTX Titan. Otherwise it comes in just behind the 290X, and at the front of the pack of our multi-GPU setups.

As for the GTX 780 Ti SLI, as with our single-card comparison points it’s up slightly compared to the GTX 780 SLI.

Overall, our look at power, temperatures, and noise has been a rather straightforward validation of our earlier suspicions. GTX 780 Ti’s higher performance leads to higher power consumption, and with all other factors held equal – including the cooler – power, temps, and noise levels all rise a bit as compared to GTX Titan and GTX 780. There’s no such thing as a free lunch here, and while GPU Boost 2.0 will keep the maximum levels suitably in check, on average GTX 780 Ti is going to be a bit worse than the other GK110 cards due to those factors. Though even with the increased noise levels in particular, GTX 780 Ti is still able to outperform 290X on noise while also delivering better gaming performance, which makes this another tidy victory for NVIDIA.

Comments

  • r13j13r13 - Friday, November 8, 2013 - link

    NVIDIA fans, don't worry: the $1000 version with 5% more performance will be out soon.
  • twtech - Friday, November 8, 2013 - link

    An interesting comparison would be a 780 Ti vs. crossfired 290s. In the previous generation of cards, I wouldn't have considered that fair, as any type of SLI/crossfire setup was definitely inferior to any single-card setup in a variety of ways. But that's changed in this latest generation.

    I bought two 290X on launch day (I knew they would go out of stock by day 3, and not come back into general availability for something like 2 months), but the experience compared to the last time I'd tried CF a few years back was completely different. There are no bridges to worry about - you just plug in the two cards and go. CF doesn't have any sync polarity issues, and the driver support for CF & multi-monitor is actually pretty fleshed out. I didn't notice any stuttering or texture corruption as I had the previous time I'd given CF a try.

    In fact the only reason I tried it now is because I have 3 2560x1600 monitors, and driving that is too much to ask out of a single card. The two 290X handle it easily though.
  • FuriousPop - Sunday, November 10, 2013 - link

    what frames are you getting?
    i currently have 2x7970's in CF and was looking to upgrade to handle my same setup.
  • DMCalloway - Friday, November 8, 2013 - link

    Wow !!! There sure are a lot of used 780's on Eb*y......meanwhile in a very luxurious board room in Santa Clara California ..... ' but sir.... do you really think they'll sell at that price point? '.....( while laughing ) ' of course they'll sell at that price point; our consumer research polls show that our customer base simply can't help themselves.'..... and throughout the world the rustling of wallets and swishing of credit cards could be heard as green team loyalist geared up to purchase their second almost $700 gtx 780 for 2013...... : )
  • polaco - Friday, November 8, 2013 - link

    This is what we are talking about:
    http://www.tomshardware.com/reviews/radeon-r9-290-...
    when the 290s get their non-reference coolers, NVidia's 780 Ti will have to pack its bags and go home, definitely. AMD's 290 and 290X series still have hopes of hitting even better performance numbers; NVidia's 780 Ti, however, is at its max.
  • EJS1980 - Friday, November 8, 2013 - link

    " AMD's 290 ans 290X series are full of hopes to hit even better performance numbers, however NVidia's 780Ti are at it's max..."

    OC'ing the 780ti will give you around 15-20% more performance, or higher, so what the hell are you talking about? I realize you're in love with all things AMD, but if you can take your goggles off for a second, you'd realize the 780ti is actually a really great card (much like the 290(x), obviously).
    I had NO IDEA how many AMD fanboys could be found mouth-breathing on the internet, at any given time. Which begs the question: if AMD has so many fanboys, why the f*ck are they doing so poorly in the discrete GPU market?
  • polaco - Saturday, November 9, 2013 - link

    Yes, I do prefer AMD due to its fair prices. What I'm talking about is that with non-reference coolers the 290/290X will be able to run pretty cool and have decent overclocking potential too, as shown in the tomshardware chart. Since they cost quite a bit less than NVidia's cards, and at that point the performance gap should be pretty well closed (in fact it already is), AMD's cards will be at an extremely nice price/performance point. What do you mean by poorly in discrete GPUs? Many APUs have been sold, APUs are replacing discrete GPUs, and the PS4 and Xbox One are basically discrete GPUs. And I do prefer AMD mainly for these reasons: they have always been trying to innovate, they have to compete with a giant like Intel, and they bring price balance to the GPU/CPU market. That doesn't mean I will buy whatever they take to market; I evaluate all options and buy what fits my needs best. In fact the 780 Ti is a great card, nobody says otherwise, just quite expensive from my point of view, and I don't want to get into how much NVidia has been abusing buyers' wallets during these months. I wonder if any NVidia fan who bought a 780 or Titan before the 290 entered the market could recognize that...
  • Owls - Saturday, November 9, 2013 - link

    OCing a Ti is not guaranteed. Why do people parrot this info around like every card is going to perform the way you describe?
  • EJS1980 - Saturday, November 9, 2013 - link

    Again, what the hell are you people talking about???

    Even though results can vary from card to card, EVERY 780ti can be overclocked to boost performance by a significant margin. These chips are the cream of the Kepler crop, and Nvidia is confident enough with their yields that a substantial OC is all but guaranteed with each card, as EVERY review so far has illustrated.
    I personally feel this card is about a $100 overpriced, and as such, I will NOT be upgrading at this time. I also believe that even with the significant problems inherent to the new Hawaii chips, they are powerful cards at an EXCELLENT price point.
    However, I'm not going to pretend that the 290(x) are faster than the 780ti, just because they're priced better. So many of you guys keep pointing out that once aftermarket solutions arrive, the 290(x) will take back the crown, and that simply isn't true. Performance will obviously improve, but only to levels comparable to a STOCK 780ti, and maybe not even that. That's where OC'ing comes into play, for if we're going to compare the 290(x) OC'd with a better cooling solution, then the same must be applied to the 780ti too. I expect the 780ti to maintain its 5-15% performance advantage over the 290(x) after they've BOTH released their aftermarket solutions, so the question ultimately returns to whether or not the consumer finds that performance advantage to be worthy of the price differential. Just because you don't, or I don't, does NOT mean that everyone else won't, or that there isn't even an advantage to begin with, which there undoubtedly will be...
  • Mondozai - Friday, December 13, 2013 - link

    EJS1980, the mouth-breathing Nvidia fanboy, you're talking about a card(GTX 780 Ti) which with an aftermarket cooler could have an advantage as low as 5% for 200-250 dollars more in price. Only a Nvidia buttboy would think that's a good deal, you've been raped by them through their pricing for so long, you've come to even enjoy it.

    Most sane, non-buttboys will opt for the best price/performance ratio. Including for high-end cards. A 290 in CF with aftermarket coolers will crush everything. Even a 290X on an aftermarket cooler is going to do a lot better, especially as we transition to 4K within the next 1-2 years.

    Stop being a buttboy for Nvidia.

    (P.S. I'm currently using an Nvidia card, but I always get embarrassed when I see buttboys for a specific company like yourself).
