Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

GeForce GTX 780 Series Voltages
GTX 780 Ti Boost Voltage – 1.187v
GTX 780 Boost Voltage – 1.1625v
GTX 780 Ti Base Voltage – 1.012v

Taking a quick look at voltages, we find that our GTX 780 Ti operates at a slightly higher voltage at its maximum boost bin than the original GTX 780 did. The difference is minor, but the additional voltage may be necessary to hit the slightly higher clockspeeds GTX 780 Ti operates at relative to GTX Titan and GTX 780.

GeForce GTX 780 Ti Average Clockspeeds
Max Boost Clock – 1020MHz
Metro: LL – 1000MHz
CoH2 – 997MHz
Bioshock – 954MHz
Battlefield 3 – 980MHz
Crysis 3 – 980MHz
Crysis: Warhead – 1000MHz
TW: Rome 2 – 950MHz
Hitman – 993MHz
GRID 2 – 967MHz
FurMark – 823MHz

Moving on to clockspeeds, we find that the GTX 780 Ti does very well when it comes to boosting. With a maximum boost clock of 1020MHz, we have two benchmarks averaging 1000MHz, and another four averaging 980MHz or better.
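
For those who want to see how consistent that boosting is in relative terms, the short Python sketch below simply restates the table as percentages of the maximum boost bin. The clockspeeds are our measured figures from above; FurMark is excluded since it's a worst-case power virus rather than a game.

```python
# Summarize the review's measured average clockspeeds relative to the
# 1020MHz maximum boost bin. The data is from the table above; the
# script is only a summary aid.

MAX_BOOST_MHZ = 1020

avg_clocks_mhz = {
    "Metro: LL": 1000, "CoH2": 997, "Bioshock": 954,
    "Battlefield 3": 980, "Crysis 3": 980, "Crysis: Warhead": 1000,
    "TW: Rome 2": 950, "Hitman": 993, "GRID 2": 967,
}

for game, mhz in sorted(avg_clocks_mhz.items(), key=lambda kv: -kv[1]):
    print(f"{game:<16} {mhz} MHz ({100 * mhz / MAX_BOOST_MHZ:.1f}% of max boost)")

mean_mhz = sum(avg_clocks_mhz.values()) / len(avg_clocks_mhz)
print(f"Average across all nine games: {mean_mhz:.0f} MHz")  # ~980 MHz
```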

With all of our GK110 cards sharing a common design, at idle there’s very little to differentiate them. Other than GTX Titan’s extra 3GB of VRAM, we’re essentially looking at identical cards when idling.

Moving on to load power, we can see the power/heat ramifications of the slight clockspeed increase coupled with the activation of the 15th SMX. Even with the further optimizations NVIDIA has put into the new revision of GK110, power consumption has gone up in step with the card's higher performance, just as we'd expect; since NVIDIA hasn't notably improved their power efficiency here, that increased performance has to come at the cost of increased power consumption. It's worth pointing out that we're measuring at the wall, and that the GTX 780 Ti outperforms GTX Titan by 8%, so some of that 29W difference will come from the higher CPU load caused by the increased framerates.
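
As a back-of-the-envelope illustration of why wall measurements overstate GPU-only differences: the PSU's conversion loss and the CPU's extra work both ride along in that 29W figure. The 88% efficiency used below is an assumed, typical value for a quality power supply at this load, not a measured one.

```python
# Back-of-the-envelope: wall-socket deltas conflate PSU losses and CPU
# power with the GPU itself. The 29W delta is the review's measurement;
# the 88% PSU efficiency is an assumed, typical figure.

wall_delta_w = 29           # GTX 780 Ti vs GTX Titan, measured at the wall
psu_efficiency = 0.88       # assumption; depends on the testbed's PSU

dc_delta_w = wall_delta_w * psu_efficiency
print(f"DC-side delta: ~{dc_delta_w:.0f}W")  # ~26W

# Some further share of that ~26W is extra CPU power from driving 8%
# higher framerates, so the GPU-only increase is smaller still.
```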

As for the GTX 780 Ti SLI, here we see power level off at 556W, 20W more than the GTX 780 SLI. Some (if not most) of that is going to be explained by the increased CPU power consumption from the GTX 780 Ti SLI’s higher framerates. Coupled with that is the fact that in SLI setups these cards get hotter, and hence have to downclock a bit more to maintain equilibrium, which helps to offset the increased power requirements of GTX 780 Ti and keep the SLI results so close to the GTX 780 SLI results.

Switching over to FurMark, we find that power consumption is up only slightly. With GPU Boost 2.0 clamping down on power consumption, all of our GK110 cards should be held to 250W here, and with less than 10W separating the GTX 780 and GTX 780 Ti, that's exactly what appears to be happening.
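
To illustrate the mechanism at work, the sketch below models a simple power-limit governor in the spirit of GPU Boost 2.0: clocks step down one bin at a time until board power falls under the 250W limit. To be clear, the power model, bin size, and loop are all invented for illustration – NVIDIA's actual algorithm is far more sophisticated – but with these toy numbers the clock settles in the low 800MHz range, the same neighborhood as the 823MHz FurMark average in our clockspeed table.

```python
# Illustrative-only sketch of a TDP governor in the spirit of GPU Boost
# 2.0: step the core clock down whenever board power exceeds the 250W
# limit, and step it back up when there's headroom. The power model and
# bin size are invented for the demo; the real algorithm also weighs
# voltage, temperature, and per-bin clock/voltage tables.

POWER_LIMIT_W = 250.0
MAX_BOOST_MHZ = 1020
BIN_MHZ = 13                # assumed size of one boost bin

def board_power(clock_mhz: float) -> float:
    """Toy model: ~310W if the card ran FurMark uncapped at max boost."""
    return 310.0 * clock_mhz / MAX_BOOST_MHZ

clock = MAX_BOOST_MHZ
for _ in range(50):         # governor ticks
    power = board_power(clock)
    if power > POWER_LIMIT_W:
        clock -= BIN_MHZ    # throttle one bin
    elif power < POWER_LIMIT_W - 5 and clock < MAX_BOOST_MHZ:
        clock += BIN_MHZ    # recover headroom

print(f"Settled at ~{clock}MHz drawing ~{board_power(clock):.0f}W")
```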

On a side note, it's interesting that under FurMark the GTX 780 Ti draws more power than the Radeon R9 290X. Despite the 290X's higher rated TDP, in its default quiet mode that card can't actually dissipate as much heat (and thereby consume as much power) as the GTX 780 Ti can.

For idle temperatures we’re once again looking at cards that are for all intents and purposes identical. At 30C the GTX 780 Ti easily stays nice and cool.

As we mentioned in our look at the GTX 780 Ti hardware, NVIDIA has increased their default temperature throttle point from 80C on the GTX Titan/780 to 83C on the GTX 780 Ti. The end result is that in all of our temperature limited tests the GTX 780 Ti will peak at 83C-84C, whereas the older GK110 cards will peak at 80C-81C.
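
The effect of that 3C bump is easiest to see with a steady-state thermal model. In the sketch below the shared reference cooler is treated as a fixed thermal resistance, so sustainable power is simply (target - ambient) / R. The ambient temperature and resistance values are assumptions chosen for illustration, under which the higher 83C target buys roughly 14W of additional sustained dissipation.

```python
# Minimal steady-state sketch of what the 80C -> 83C throttle-point
# change buys. At equilibrium, temperature = ambient + R * power, so
# sustainable power = (target - ambient) / R. The ambient and R values
# below are invented for illustration, not measured.

AMBIENT_C = 25.0
R_DEGC_PER_W = 0.22   # assumed thermal resistance of the reference cooler

def sustainable_power_w(target_c: float) -> float:
    return (target_c - AMBIENT_C) / R_DEGC_PER_W

for card, target_c in [("GTX Titan / GTX 780", 80.0), ("GTX 780 Ti", 83.0)]:
    print(f"{card}: {target_c:.0f}C target -> "
          f"~{sustainable_power_w(target_c):.0f}W sustained")
```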

FurMark reiterates what we saw with Crysis 3. The temps are up a bit across the board, while the GK110 cards are holding near their throttle points. The SLI setups meanwhile approach the upper-80s at 88C, reflecting the fact that even with blowers, there’s some impact on neighboring cards in high load situations.

In our last idle scenario, we once again see all of our GK110 cards performing similarly, with idle noise levels in the 38dB-39dB range.

Moving on to our gaming load noise results, we can see the full repercussions of the GTX 780 Ti's higher average power consumption coupled with the card's higher temperature throttle point. Moving the throttle point along the same fan curve raises the equilibrium point, and with it the card's operating noise levels. As the fastest single-GPU card on this chart, the GTX 780 Ti is still doing very well for itself and for a blower based design at 51.7dB, though at 1.5dB louder than GTX Titan and 4.2dB louder than GTX 780, the noise tradeoff for the card's higher performance is very clear. Meanwhile the fact that it's tied with the GTX 780 SLI comes with its own bit of irony.
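
To put those deltas in physical terms, recall that decibels are logarithmic: a difference of N dB corresponds to a 10^(N/10) ratio in radiated sound power. The sketch below applies that standard relation to our readings; the GTX 780 and GTX Titan figures are back-computed from the 4.2dB and 1.5dB deltas above.

```python
# Convert the review's noise deltas into ratios of radiated sound power
# using the decibel definition: ratio = 10 ** (delta_dB / 10). The GTX
# 780 and Titan readings are derived from the deltas quoted above.

READINGS_DB = {
    "GTX 780": 47.5,      # 4.2dB quieter than the 780 Ti
    "GTX Titan": 50.2,    # 1.5dB quieter
    "GTX 780 Ti": 51.7,
}

ref_db = READINGS_DB["GTX 780 Ti"]
for card, db in READINGS_DB.items():
    ratio = 10 ** ((db - ref_db) / 10)
    print(f"{card}: {db}dB -> {ratio:.2f}x the 780 Ti's sound power")
```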

Speaking of the GTX 780 SLI, we can see the noise impact of SLI configurations too. The GTX 780 Ti SLI levels out at 53.7dB, 2dB louder than our single-card configuration and 2dB louder than the GTX 780 SLI. At this point it’s just a bit louder than the 290X and quieter than a number of other 290 series setups.

Finally, load noise levels under FurMark show where our various cards peak. The GTX 780 Ti creeps up to 52.3dB, essentially tying the GTX 780 and GTX Titan. Otherwise it comes in just behind the 290X, and just ahead of where our multi-GPU setups begin.

As for the GTX 780 Ti SLI, like our single-card comparison points it's up slightly compared to the GTX 780 SLI.

Overall, our look at power, temperatures, and noise has been a rather straightforward validation of our earlier suspicions. GTX 780 Ti's higher performance leads to higher power consumption, and with all other factors held equal – including the cooler – power, temps, and noise levels all rise a bit compared to GTX Titan and GTX 780. There's no such thing as a free lunch here, and while GPU Boost 2.0 will keep the maximum levels suitably in check, on average GTX 780 Ti is going to be a bit worse than the other GK110 cards due to those factors. Even with the increased noise levels, however, GTX 780 Ti still beats the 290X on noise while also delivering better gaming performance, which makes this another tidy victory for NVIDIA.

Comments

  • HisDivineOrder - Thursday, November 7, 2013 - link

    Sure, you can save money by buying into the R9 290X, but save that money because you're going to need it in a few years for a hearing aid.
  • OverclockedCeleron - Thursday, November 7, 2013 - link

    As if there won't be any custom-cooled R290X-based GPUs. You make it sound like all GPU vendors and partners have abandoned AMD, and that AMD is going to be stuck with that fan forever. Well done for being short-sighted.
  • halo37253 - Thursday, November 7, 2013 - link

    Personally I find it kinda sad that, given that GK110 is a much bigger chip in general, it doesn't have a bigger lead at stock. Plus power usage while gaming goes back and forth with the Titan it's competing with. Nvidia just has more aggressive TDP throttling, while AMD's is mainly temp based.

    The 290X is only hot under the stock cooler; it actually runs pretty cool under water. The 290X is also doing more with fewer transistors compared to Nvidia. It sucks that Nvidia needs to scale Kepler up to such a high level just to compete with AMD's lower offerings. AMD would slaughter Nvidia with a die of equal size.

    Also, with the 290 at $399 Nvidia is boned unless they drop the GTX 780's price again. A 290 @ 1300MHz is about the same as a 780 @ 1400MHz.

    G-Sync only works with one monitor so far, and considering I already have a 120Hz monitor and already get a taste of what's to come, I couldn't care less. We won't see it in anything that isn't overpriced for years to come, and by then we'll probably have an open standard that both Intel and AMD can use. Plus I want a 1440p IPS with G-Sync and doubt that will happen any time soon. Mantle is by far the more interesting option if you ask me. I already play with vsync off and get no tearing, and games are as smooth as ever with AMD's current drivers. Smoother than my old Nvidia card, though I just made the switch to AMD and never really had the chance to use the old "crap" drivers.
  • Samus - Friday, November 8, 2013 - link

    The problem with this card is the 25% price premium over AMD's 290X for 11% more performance.

    The only real advantage it has over the 290X is lower noise. Other than that it lacks next-gen optimizations (Mantle, EA partnership, console ports, etc.)
  • ahlan - Friday, November 8, 2013 - link

    http://s11.postimg.org/odh7byx3n/amd_N.png

    Lol, most review sites are Nvidia's bitch....
  • tcube - Friday, November 8, 2013 - link

    Erm... the 290X in uber mode is edged out by what, 1-5%? I can't call this a win. GK110 has reached its full potential, and the 780 Ti is basically an excellent GK110 chip cut down and sold for $700 instead of the regular $4k (K6000). Plus they pushed GDDR5 to its maximum just to edge out the 290X... I don't know. This thing looks like vaporware; let's see some availability, because I doubt it makes sense for Nvidia to sell a perfectly good chip for what, 6 times less? Plus the recent tests of the 290 (without the X) show the 290X has lots of potential left.

    The way I see it, the 290X was rushed to market and suffers for it: a bad cooler, high temperatures, and slow memory. The 780 Ti is the best of the Kepler architecture, overclocked on both memory and GPU, with basically a pro-grade chip and 30% more die space, and it just barely edges out the 290X in uber mode.

    What AMD managed to do is make Nvidia divert perfect GK110s from the pro line to the mainstream and shift their focus, which is a bad move for Nvidia right now. And Nvidia reacted like a... fanboy, really, scrambling to bring a lab rat to market just to barely claim the crown back, instead of focusing on Maxwell and pro line improvements. They really behaved like little kids with ADHD on this one... but... oh well...
