Temperature, Power, & Noise: Hot and Loud, but Not in the Good Way

For all of the gaming and compute performance data we've seen so far, we've still only seen half of the story. With a 500mm²+ die and a TDP over 200W, there's a second story to be told about the power, temperature, and noise characteristics of the GTX 400 series.

Idle GPU Temperature

Starting with idle temperatures, we can quickly see some distinct groupings among our cards. The top of the chart is occupied solely by AMD's Radeon 5000 series, whose small dies and low idle power usage let these cards idle at very cool temperatures. It's not until halfway down the chart that we find our first GTX 400 card, with the 470 at 46C. Truth be told, we were expecting something a bit better out of it, given that its 33W idle power draw is only a few watts over the 5870's and it has a fairly large cooler to work with. Farther down the chart is the GTX 480, which joins the over-50 club at 51C idle. This is where NVIDIA has to pay the piper on their die size: even the amazingly low idle clocks of 50MHz core, 101MHz shader, and 67.5MHz RAM aren't enough to drop it any further.

Load GPU Temperature - Crysis

Load GPU Temperature - Furmark

For our load temperatures, we have gone ahead and added Crysis to our temperature testing so that we can see both the worst-case temperatures of FurMark and a more normal gameplay temperature.

At this point the GTX 400 series is in a pretty exclusive club of hot cards: under Crysis the only other single-GPU card above 90C is the 3870, and the GTX 480 SLI is the hottest of any configuration we have tested. Even the dual-GPU cards don't get quite this hot. It's also interesting that, unlike FurMark, there's a much larger spread among card temperatures here, which only makes the GTX 400 series stand out more.

While we’re on the subject of temperatures, we should note that NVIDIA has changed the fan ramp-up behavior from the GTX 200 series. Rather than reacting immediately, the GTX 400 series fans have a ramp-up delay of a few seconds when responding to high temperatures, meaning you’ll actually see those cards get hotter than our sustained temperatures. This won’t have any significant impact on the card, but if you’re like us your eyes will pop out of your head at least once when you see a GTX 480 hitting 98C on FurMark.
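
To visualize that behavior, here is a minimal sketch of a fan controller with a ramp-up delay. The threshold, delay, and step sizes are assumptions chosen purely for illustration; NVIDIA does not publish the GTX 400 series fan curve, so this is not their actual algorithm.

```python
# Hypothetical sketch of a delayed fan ramp; all numbers are assumed, not NVIDIA's.
RAMP_DELAY_S = 3.0        # assumed delay before the fan reacts to a hot reading
HOT_THRESHOLD_C = 90.0    # assumed temperature at which ramping begins

def next_fan_speed(temp_c, current_speed, seconds_over_threshold):
    """Return a new fan duty cycle (0-100) for the current GPU temperature."""
    if temp_c < HOT_THRESHOLD_C:
        return max(current_speed - 2, 30)   # cool enough: relax toward the idle speed
    if seconds_over_threshold < RAMP_DELAY_S:
        return current_speed                # inside the delay window: no reaction yet
    return min(current_speed + 5, 100)      # delay expired: ramp up
```

The delay window is the reason a card can briefly overshoot its sustained temperature before the fan catches up.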

Idle Power Consumption

Up next is power consumption. As we've already discussed, the GTX 480 and GTX 470 have an idle power consumption of 47W and 33W respectively, putting them out of the running for the title of least power-hungry high-end card. Furthermore, the 1200W PSU we switched to for this review has driven up our idle power load a bit, which serves to suppress some of the differences in idle power draw between cards.
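
To put rough numbers on that effect, here is a back-of-the-envelope sketch. The system baseline and the PSU's efficiency at light load are assumed values for illustration, not measurements from our testbed.

```python
# Rough illustration of why wall-socket readings compress idle differences.
# SYSTEM_BASE_W and PSU_EFFICIENCY are assumed values, not our testbed's numbers.
SYSTEM_BASE_W = 130.0     # assumed idle draw of the CPU, board, and drives (DC side)
PSU_EFFICIENCY = 0.72     # assumed efficiency of a 1200W unit at a ~15% load

def wall_draw(gpu_idle_w):
    """Approximate wall reading, in watts, for a given GPU idle draw."""
    return (SYSTEM_BASE_W + gpu_idle_w) / PSU_EFFICIENCY

print(f"GTX 470 (33W idle): ~{wall_draw(33):.0f}W at the wall")
print(f"GTX 480 (47W idle): ~{wall_draw(47):.0f}W at the wall")
```

Against a total system reading north of 220W, the 14W gap between the two cards amounts to less than 10 percent of the wall figure, which is why the idle chart looks tighter than the GPU-only numbers would suggest.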

With that said, the GTX 400 series either does decently or poorly, depending on your point of view. The GTX 480 is below our poorly-idling Radeon 4000 series cards, but well above the 5000 series. Meanwhile the GTX 470 is in the middle of the pack, sharing space with most of the GTX 200 series. The lone outlier here is the GTX 480 SLI: with no counterpart to the power-saving mode AMD uses for secondary CrossFire cards, it stands all alone at a total idle power draw of 260W.

Load Power Consumption - Crysis

Load Power Consumption - Furmark

For load power we have Crysis and FurMark, and the results are quite interesting. Under Crysis, not only is the GTX 480 SLI the most demanding card setup, as we would expect, but the GTX 480 itself isn't too far behind. As a single-GPU card it draws more power than either the GTX 295 or the Radeon 5970, both of which are dual-GPU cards. Farther up the chart is the GTX 470, which is the second most power-hungry of our single-GPU cards.

Under FurMark our results change ever so slightly. The GTX 480 manages to get under the GTX 295, while the GTX 470 falls in the middle of the GTX 200 series pack. A special mention goes out to the GTX 480 SLI here, which at 851W under load is the greatest power draw we have ever seen for a pair of GPUs.

Idle Noise Levels

Idle noise doesn’t contain any particular surprises since virtually every card can reduce its fan speed to near-silent levels and still stay cool enough. The GTX 400 series is within a few dB of our noise floor here.

Load Noise Levels

Hot, power-hungry things are often loud things, and there are no disappointments here. At 70dB the GTX 480 SLI is the loudest card configuration we have ever tested, while at 64.1dB the GTX 480 is the loudest single-GPU card, beating out even our unreasonably loud 4890. Meanwhile the GTX 470 is in the middle of the pack at 61.5dB, coming in amidst some of our louder single-GPU cards and our dual-GPU cards.

Finally, with this data in hand we went to NVIDIA to ask about the longevity of their cards at these temperatures, as seeing the GTX 480 hit 94C sustained in a game left us worried. In response, NVIDIA told us that they have done significant testing of the cards at high temperatures to validate their longevity, and that their models predict a lifetime of years even at temperatures approaching 105C (the throttle point for GF100). Furthermore, as they note, they have shipped other cards that run roughly this hot, such as the GTX 295, and those cards have held up just fine.

At this point we don't have any reason to doubt NVIDIA's word on the matter, but that said, it wouldn't discourage us from taking the appropriate precautions. Heat does impact longevity to some degree, so we would strongly consider getting a GTX 480 with a lifetime warranty to hedge our bets.
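
For readers who want a feel for how much sustained temperature can matter, the Arrhenius relation is the standard rule of thumb for thermally accelerated wear in silicon. To be clear, this is a generic textbook estimate with an assumed activation energy, not NVIDIA's reliability model.

```python
import math

BOLTZMANN_EV_PER_K = 8.617e-5   # Boltzmann constant in eV/K
ACTIVATION_ENERGY_EV = 0.7      # assumed typical activation energy for silicon wear-out

def acceleration_factor(cool_c, hot_c):
    """How much faster thermally activated wear proceeds at hot_c than at cool_c."""
    cool_k, hot_k = cool_c + 273.15, hot_c + 273.15
    return math.exp((ACTIVATION_ENERGY_EV / BOLTZMANN_EV_PER_K) * (1 / cool_k - 1 / hot_k))

# e.g. a card sustaining 94C in a game versus one holding 75C in the same game
print(f"~{acceleration_factor(75, 94):.1f}x faster wear at 94C than at 75C")
```

Under those assumptions the hotter card ages roughly three times as fast, which is exactly the kind of margin a longer warranty is meant to absorb.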

Comments

  • henrikfm - Tuesday, March 30, 2010 - link

    Now it would be easier to believe only idiots buy ultra-high end PC hardware parts.
  • ryta1203 - Tuesday, March 30, 2010 - link

    Is it irresponsible to use benchmarks designed for one card to measure the performance of another card?

    Sadly, the "community" tries to hold the belief that all GPU architectures are the same, which is of course not true.

    The N-queen solver is poorly coded for ATI GPUs, so of course, you can post benchmarks that say whatever you want them to say if they are coded that way.

    Personally, I find this fact invalidates the entire article, or at least the "compute" section of this article.
  • Ryan Smith - Wednesday, March 31, 2010 - link

    One of the things we absolutely wanted to do starting with Fermi is to include compute benchmarks. It's going to be a big deal if AMD and NVIDIA have anything to say about it, and in the case of Fermi it's a big part of the design decision.

    Our hope was that we'd have some proper OpenCL/DirectCompute apps by the time of the Fermi launch, but this hasn't happened. So our decision was to go ahead with what we had, and to try to make it clear that our OpenCL benchmarks were to explore the state of GPGPU rather than to make any significant claims about the compute capabilities of NVIDIA or AMD's GPUs. We would rather do this than to ignore compute entirely.

    It sounds like we didn't make this clear enough for your liking, and if so I apologize. But it doesn't make the results invalid - these are OpenCL programs and this is what we got. It just doesn't mean that these results will carry over to what a commercial OpenCL program may perform like. In fact if anything it adds fuel to the notion that OpenCL/DirectCompute will not be the great unifier we had hoped for them to be if it means developers are going to have to basically write paths optimized around NVIDIA and AMD's different shader structure.
  • ryta1203 - Tuesday, March 30, 2010 - link

    The compute section of this article is just nonsense. Is this guy a journalist? What does he know about programming GPUs?
  • Firen - Tuesday, March 30, 2010 - link

    Thanks for this comprehensive review, it covers some very interesting topics between Team Green and Team Red.

    Yet, I agree with one of the comments here: you missed how easily the ATI 5850 and 5870 can be overclocked thanks to their leaner design; a 5870 can easily deliver more or less the same performance as a 480 card while still running cooler and consuming less power.

    Some people might point out that our new 'champion' card can be overclocked as well... that's true... however, doesn't it feel terrifying to have a graphics card running hotter than boiling water!
  • Fulle - Tuesday, March 30, 2010 - link

    I wonder what kind of overclocking headroom the 470 has.... since someone with a 5850 can easily bump the voltage up a smidge, and get about a 30% overclock with minimal effort... people who tinker can usually safely reach about 1GHz core, for about a 37% overclock.

    Unless the 470 has a bit of overclocking headroom, someone with a 5850 could easily overclock to have superior performance, lower heat, lower noise, and lower power consumption.

    After all these months and months of waiting, Nvidia has basically released a few products that ATI can defeat by just binning their current GPUs and bumping up the clockspeed? *sigh* I really don't know who would buy these cards.
  • Shadowmaster625 - Tuesday, March 30, 2010 - link

    You're being way too kind to Nvidia. Up to 50% more power consumption for a very slight (at best) price/performance advantage? This isn't a repeat of the AMD/Intel thing. This is a massive difference in power consumption. We're talking about approximately $1 a year per hour a week of gaming. If you game for 20 hours a week, expect to pay $20 a year more for using the GTX 470 vs a 5850. May as well add that right to the price of the card.

    But the real issue is what happens to these cards when they get even a modest coating of dust in them? They're going to detonate...

    Even if the 470 outperformed the 5850 by 30%, I don't think it would be worth it. I can't stand loud video cards. It is totally unacceptable to me. I again have to ask the question I find myself asking quite often: what kind of world are you guys living in? nVidia should get nothing more than a poop-in-a-box award for this.
  • jujumedia - Wednesday, March 31, 2010 - link

    With those power draws and the temps it reaches in daily operation, I see GPU failure rates being high on the GTX 480 and 470, as they are already faulty from the fab. I'll stick with ATI for 10 fps less.
  • njs72 - Wednesday, March 31, 2010 - link

    I've been holding out for months to see what Fermi would bring to the world of GPUs. After reading countless reviews of this card I don't think it's a justifiable upgrade for my GTX 260. I mean, yeah, the performance is much higher, but in most benchmark reviews of games like Crysis this card barely wins against the 5870, and to buy it I would need to upgrade my PSU and possibly get a new case for ventilation. I keep loading up Novatech's website and almost adding a 5870 to the basket instead of pre-ordering the GTX 480 like I was intending. What puts me off more than anything with the new NVIDIA card is its noise and temps. I can't see this card living for very long.

    I've been an NVIDIA fan ever since the first GeForce card came out, which I still have tucked away in a drawer somewhere. I find myself thinking of switching to ATI, but I've read too many horror stories about their driver implementation, and that puts me off. Maybe I should just wait for NVIDIA to refresh its new card and keep hold of my 260 for a bit longer. I really don't know :-(
  • Zaitsev - Wednesday, March 31, 2010 - link

    There is an error with the Bad Company 2 image mouseovers for the GTX 480. I think the images for 2xAA and 4xAA have been mixed up. The 2xAA image clearly has more AA than the 4xAA image.

    Compare GTX 480 2x with GTX 285 4x and they look very similar. Also compare 480 4x with 285 2x.

    Very nice article, Ryan! I really enjoyed the tessellation tests. Keep up the good work.
