Power, Temperature, & Noise

As always, last but not least is our look at power, temperature, and noise. Next to price and performance of course, these are some of the most important aspects of a GPU, due in large part to the impact of noise. All things considered, a loud card is undesirable unless there’s a sufficiently good reason – or sufficiently good performance – to ignore the noise.

As the GM200 flagship card, GTX Titan X gets the pick of the litter as far as GM200 GPUs go. GTX Titan X needed fully-functional GM200 GPUs, and even then needed GPUs good enough to meet NVIDIA’s power requirements. GTX 980 Ti on the other hand, as a cut-down/salvage card, gets second pick. So we expect these chips to be just a bit worse: to either have functional units that came out of the fab damaged, or functional units that have been turned off for power reasons.

GeForce GTX Titan X/980 Voltages
GTX Titan X Boost Voltage    1.162v
GTX 980 Ti Boost Voltage     1.187v
GTX 980 Boost Voltage        1.225v

Looking at voltages, we can see just that in our samples. GTX 980 Ti has a slightly higher boost voltage – 1.187v – than our GTX Titan X. NVIDIA sometimes bins their second-tier cards for lower voltage, but this isn’t something we’re seeing here. Nor is there necessarily a need to bin in such a manner since the 250W TDP is unchanged from GTX Titan X.

GeForce GTX 980 Ti Average Clockspeeds
Game                  GTX 980 Ti   GTX Titan X
Max Boost Clock       1202MHz      1215MHz
Battlefield 4         1139MHz      1088MHz
Crysis 3              1177MHz      1113MHz
Mordor                1151MHz      1126MHz
Civilization: BE      1101MHz      1088MHz
Dragon Age            1189MHz      1189MHz
Talos Principle       1177MHz      1126MHz
Far Cry 4             1139MHz      1101MHz
Total War: Attila     1139MHz      1088MHz
GRID Autosport        1164MHz      1151MHz
Grand Theft Auto V    1189MHz      1189MHz

The far more interesting story here is GTX 980 Ti’s clockspeeds. As we have pointed out time and time again, GTX 980 Ti’s gaming performance trails GTX Titan X by just a few percent, this despite the fact that GTX 980 Ti is down by 2 SMMs and is clocked identically. On paper there is a 9% performance difference that in the real world we’re not seeing. So what’s going on?

The answer to that is that what GTX 980 Ti lacks in SMMs it’s making up in clockspeeds. The card’s average clockspeeds are frequently two or more bins ahead of GTX Titan X, topping out at a 64MHz advantage under Crysis 3. All of this comes despite the fact that GTX 980 Ti has a lower maximum boost clock than GTX Titan X, topping out one bin lower at 1202MHz to GTX Titan X’s 1215MHz.
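To put the bin math in perspective, here is a minimal Python sketch – assuming GPU Boost's usual 13MHz bin granularity, which the table itself doesn't state – that converts the per-game clockspeed deltas from the table above into boost bins:

```python
# Average per-game clockspeeds from the table above, in MHz:
# (GTX 980 Ti, GTX Titan X)
clocks = {
    "Battlefield 4":      (1139, 1088),
    "Crysis 3":           (1177, 1113),
    "Mordor":             (1151, 1126),
    "Civilization: BE":   (1101, 1088),
    "Dragon Age":         (1189, 1189),
    "Talos Principle":    (1177, 1126),
    "Far Cry 4":          (1139, 1101),
    "Total War: Attila":  (1139, 1088),
    "GRID Autosport":     (1164, 1151),
    "Grand Theft Auto V": (1189, 1189),
}

BIN_MHZ = 13  # assumed GPU Boost bin size

for game, (ti, titan_x) in clocks.items():
    delta = ti - titan_x
    print(f"{game:<20} +{delta:>2}MHz  (~{delta / BIN_MHZ:.0f} bins)")
```

Run against the table, Crysis 3 works out to roughly five bins in GTX 980 Ti's favor, while Dragon Age and Grand Theft Auto V show no advantage at all.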

Ultimately the higher clockspeeds are a result of the increased power and thermal headroom the GTX 980 Ti picks up from halving the number of VRAM chips along with disabling two SMMs. With those components no longer consuming power or generating heat, and the TDP staying at 250W, GTX 980 Ti can spend its power savings to boost just a bit higher. This in turn compresses the performance gap between the two cards despite what the specs say. Coupled with the fact that performance doesn't scale linearly with SMM count or clockspeed – you rarely lose the full theoretical amount when shedding frequency or functional units – this leaves the GTX 980 Ti trailing the GTX Titan X by an average of just 3%.
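For those who want to see the arithmetic, a quick back-of-the-envelope sketch illustrates why the gap compresses. It assumes Maxwell's 128 CUDA cores per SMM, and takes average clocks as the mean of the ten games in the table above; the ~3% real-world figure comes from our benchmarks, not from this calculation:

```python
# GM200 carries 24 SMMs of 128 CUDA cores each; GTX 980 Ti disables 2 SMMs
CORES_PER_SMM   = 128
titan_x_cores   = 24 * CORES_PER_SMM  # 3072
gtx_980ti_cores = 22 * CORES_PER_SMM  # 2816

# Mean of the per-game average clockspeeds in the table above (MHz)
titan_x_clock   = 1126
gtx_980ti_clock = 1157

# On paper, at identical clocks, GTX Titan X holds a ~9% shader advantage
paper_gap = titan_x_cores / gtx_980ti_cores - 1
print(f"SMM deficit alone:    {paper_gap:.1%}")       # ~9.1%

# The 980 Ti's higher average boost clocks claw back a chunk of that
throughput_gap = (titan_x_cores * titan_x_clock) / (gtx_980ti_cores * gtx_980ti_clock) - 1
print(f"With measured clocks: {throughput_gap:.1%}")  # ~6.2%

# Real-world performance scales sub-linearly with shader throughput,
# which is why the measured gap averages closer to 3%.
```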

Idle Power Consumption

Starting off with idle power consumption, there's nothing new to report here. GTX 980 Ti performs just like the GTX Titan X, which at 74W is second only to the GTX 980 by a single watt.

Load Power Consumption - Crysis 3

Load Power Consumption - FurMark

Meanwhile load power consumption is also practically identical to the GTX Titan X. With the same GPU on the same board operating at the same TDP, GTX 980 Ti ends up right where we expect it, next to GTX Titan X. GTX Titan X did very well as far as energy efficiency is concerned – setting a new bar for 250W cards – and GTX 980 Ti in turn does just as well.

Idle GPU Temperature

Load GPU Temperature - Crysis 3

Load GPU Temperature - FurMark

As was the case with power consumption, video card temperatures are similarly unchanged. NVIDIA’s metal cooler does a great job here, keeping temperatures low at idle while NVIDIA’s GPU Boost mechanism keeps temperatures from exceeding 83C under full load.

Idle Noise Levels

Load Noise Levels - Crysis 3

Load Noise Levels - FurMark

Finally for noise, the situation is much the same. Though perhaps unexpected, it's not all that surprising that the GTX 980 Ti ends up doing a hair worse than the GTX Titan X here. NVIDIA has not changed the fan curves or TDP, so this ultimately comes down to manufacturing variability in NVIDIA’s metal cooler, with our GTX 980 Ti faring ever so slightly worse than the Titan. Which is to say that it's still right at the sweet spot for noise versus power consumption, dissipating 250W at no more than 53dB, and once again proving the mettle of NVIDIA's metal cooler.

Comments

  • Laststop311 - Monday, June 1, 2015

    How is 6GB the minimum RAM needed until FinFET GPUs? Even at 1440p with max settings, no game requires 6GB of RAM. Even if a game can use 6GB, the way some games are programmed they just use up extra RAM if it is available, but that RAM isn't crucial to the operation of the game. So it will show high RAM usage when in reality it can use way less and be fine.

    You are overly paranoid. 4GB of RAM should be just fine to hold you off at 1440p for a year or two until FinFET GPUs come out. If you are smart you will skip these and just wait for 2H 2016, when 14/16nm FinFET GPUs are going to make a large leap in performance. That generation of GPUs should be able to be kept long term with good results. This is when you would want an 8GB card to keep it running smooth for a good 3-4 years, since you should get a good lifespan out of the first FinFET GPUs.
  • chizow - Monday, June 1, 2015

    Again, spoken from the perspective of someone who doesn't have the requisite hardware to test or know the difference. I've had both a 980 and a Titan X, and there are, without a doubt, games that run sluggishly, as if you are moving through molasses, as soon as you turn up bandwidth-intensive settings like MSAA, texture quality, and stereo 3D and hit your VRAM limits, even with the FRAPS meter saying you should be getting smooth frame rates.

    With the Titan X, none of these problems occur, and of course VRAM usage shoots over the 4GB ceiling I was hitting before.

    And why would I bother to keep running old cards that aren't good enough now and wait for FinFET cards that MIGHT be able to run for 3-4 years after that? I'll just upgrade to 14/16nm next year if the difference is big enough; it'll be a similar 18-24 month timeframe, which is when I usually make my upgrades anyway. What am I supposed to do this year while I wait for good enough GPUs? Not play any games? Deal with 2-3GB slow cards at 1440p? No thanks.
  • Refuge - Monday, June 1, 2015

    So you are saying I shouldn't be asking questions about something I'm spending my hard-earned money on? And not a small sum, at that?

    You sir should buy my car, it is a great deal, just don't ask me about it. Because that would be stupid!
  • Yojimbo - Monday, June 1, 2015

    He's not questioning your concern, he's questioning your criteria.
  • Peichen - Sunday, May 31, 2015

    Why is the most popular mid-high card, the GTX 970, not on the comparison list? It is exactly half the price of the 980 Ti, and it would be great to see if it is exactly 50% of the speed and uses half the power as well.
  • dragonsqrrl - Sunday, May 31, 2015

    It's definitely more than 50% of the performance and power consumption, but yes, it would've been nice to include in the charts.
  • PEJUman - Monday, June 1, 2015

    Ryan's selection is not random. It seems he selects the likely upgrade candidates & nearest competitors. It's the same reason there is no R9 290 here. Most 970 and R9 290 owners probably know how to infer their card's performance from the un-harvested versions (980 and 290X).

    Granted, it's odd to see the 580 here when the 970 would be more valuable technically.
  • mapesdhs - Wednesday, June 3, 2015

    Plus, most requests I've seen on forums have been for 970 SLI results rather than a 970 on its own, as 970 SLI is the more likely config to come anywhere near a 980 Ti, assuming VRAM isn't an issue. Data for 970 SLI would thus show where in the various resolution/detail space performance tails off because it needs more than 4GB.
  • bloodypulp - Monday, June 1, 2015

    The 295X2 still crushes it. But blind Nvidia fanboys will claim it doesn't matter because it is either a) not a single GPU or b) AMD (and therefore sucks).
  • PEJUman - Monday, June 1, 2015

    I own 290 Crossfire currently, previously a single 780 Ti. Witcher 3 still sucks for my 290 CF, as well as the 295X2. So... it depends on your game selection. I also have to spend more time customizing most of my games to get optimal settings on my 290 CF than on my 780 Ti.
