Gaming Performance

Diving into our performance benchmarks, we'll keep the commentary light here, as there's really not much to say about the gaming performance of the 290 Tri-X OC. Sapphire's 6% core overclock and 4% memory overclock translate into a real-world performance difference of 3% on average. This makes the 290 Tri-X OC a bit faster than a reference 290, but it doesn't otherwise change the relative rankings of the various cards. At most it slightly extends the lead over the GTX 780 to 9% and wipes out the 290X quiet mode's marginal lead over the 290.
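
As a rough illustration of what those percentages mean relative to one another, here's the arithmetic using only the averages quoted above. Treat it as a sketch: per-game results vary, and the 9% figure is the best case rather than the average.

```python
# Relative standings implied by the averages quoted above, normalized to a
# reference 290 = 1.00. Illustrative arithmetic only; per-game results vary.
reference_290 = 1.00
tri_x_oc = reference_290 * 1.03        # ~3% average uplift from the factory OC

# In its best case the Tri-X OC leads the GTX 780 by ~9%, which would put the
# reference 290's lead in that game at roughly:
gtx_780 = tri_x_oc / 1.09
print(f"Tri-X OC over reference 290: {tri_x_oc / reference_290 - 1:+.0%}")
print(f"Reference 290 over GTX 780 (best case): {reference_290 / gtx_780 - 1:+.1%}")
```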

In the end the difference is slight enough that the bulk of the interest in this card should rightfully be on the card’s cooler, and ultimately whether that cooler justifies the $50 premium.

Metro: Last Light - 2560x1440 - High Quality

On a quick note, looking at Rome: as one of the games in which the 290X throttles the most, this is also the game where the Sapphire 290 Tri-X OC takes its largest lead over the 290X. The 6% performance lead here reflects the fact that the 290X's quiet mode is throttling in this game while the fully cooled 290 Tri-X OC is not.

 

Comments

  • ShieTar - Tuesday, December 24, 2013 - link

    "Curiously, the [idle] power consumption of the 290 Tri-X OC is notably lower than the reference 290."

    Well, it runs about 10°C cooler, and silicon does have a negative temperature coefficient of electrical resistance. That 10°C should lead to a resistance increase of a few %, and thus to a lower current of a few % (rough numbers below). Here's a nice article about the same phenomenon, observed going from a stock 480 to a Zotac AMP! 480:

    http://www.techpowerup.com/reviews/Zotac/GeForce_G...

    The author over there was also initially very surprised. Apparently kids these days just don't pay attention in physics class anymore ...
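
To put rough numbers on ShieTar's argument, here's a minimal sketch assuming a simple linear temperature coefficient of resistance; the coefficient value is purely illustrative, not a measured figure for Hawaii.

```python
# Back-of-the-envelope version of the resistance argument above.
# ASSUMPTION: a simple linear model R(T) = R0 * (1 + alpha * dT), with an
# illustrative coefficient; real doped silicon under load behaves differently.
alpha = -0.004        # per °C, hypothetical negative temperature coefficient
delta_t = -10.0       # the Tri-X OC idles roughly 10°C cooler than a reference 290

r_ratio = 1 + alpha * delta_t     # R(cooler) / R(reference) -> resistance goes up
i_ratio = 1 / r_ratio             # at a fixed voltage, I = U / R -> current goes down
p_ratio = 1 / r_ratio             # and P = U^2 / R falls by the same few percent

print(f"Resistance change: {r_ratio - 1:+.1%}")
print(f"Current change:    {i_ratio - 1:+.1%}")
print(f"Power change:      {p_ratio - 1:+.1%}")
```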
  • EarthwormJim - Tuesday, December 24, 2013 - link

    It's mainly the leakage current, which decreases as temperature decreases, that leads to the reduction in power consumption.
  • Ryan Smith - Tuesday, December 24, 2013 - link

    I had considered leakage, but that doesn't explain such a (relatively) massive difference. Hawaii is not a leaky chip; meanwhile, if we take the difference at the wall to be entirely due to the GPU (after accounting for PSU efficiency), it's hard to buy that 10°C worth of leakage alone is increasing idle power consumption by one-third.
  • The Von Matrices - Wednesday, December 25, 2013 - link

    In your 290 review you said that the release drivers had a power leak. Could this have been fixed and account for the difference?
  • Samus - Wednesday, December 25, 2013 - link

    Quality VRMs and circuitry optimizations will have an impact on power consumption too. Lots of factors here...
  • madwolfa - Wednesday, December 25, 2013 - link

    This card is based on the reference design.
  • RazberyBandit - Friday, December 27, 2013 - link

    And based does not mean an exact copy -- it means similar. Some components (caps, chokes, resistors, etc.) could be upgraded and still fill the bill for the base design. Some components could even be downgraded, yet the card would still fit the definition of "based on AMD reference design."
  • Khenglish - Wednesday, December 25, 2013 - link

    Yes, power draw does decrease with temperature, but not because resistance drops. Resistance dropping has zero effect on power draw. Why? Because processors are all about pushing current to charge and discharge wire and gate capacitance; lower resistance just means that happens faster.

    The real reason power draw drops is due to lower leakage. Leakage current is completely unnecessary and is just wasted power.

    Also, an added tidbit: the reason performance improves as temperature drops is mainly the wire resistance falling, not an improvement in the transistor itself. Lower temperature decreases the number of carriers in a semiconductor but improves carrier mobility. There is a small net benefit to how much current the transistor can pass from temperature's effect on the silicon, but the main improvement comes from the resistance of the copper interconnects dropping as temperature drops.
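
A minimal sketch of the distinction Khenglish is drawing, with made-up constants (none of these values are Hawaii measurements): switching power has no resistance term at all, while leakage is commonly modeled as strongly temperature-dependent.

```python
def dynamic_power(activity, c_switched, voltage, freq):
    """Classic CMOS switching power, P = a * C * V^2 * f -- note: no R term."""
    return activity * c_switched * voltage ** 2 * freq

def leakage_power(p_ref, temp_c, t_ref_c, doubling_c=20.0):
    """ASSUMED rule of thumb: leakage roughly doubles every `doubling_c` °C."""
    return p_ref * 2.0 ** ((temp_c - t_ref_c) / doubling_c)

# Hypothetical idle-ish figures, chosen only to show the shape of the effect.
p_dyn = dynamic_power(activity=0.05, c_switched=100e-9, voltage=1.0, freq=300e6)

for temp in (40.0, 30.0):   # e.g. an idle chip running 10°C cooler
    p_leak = leakage_power(p_ref=8.0, temp_c=temp, t_ref_c=40.0)
    print(f"{temp:.0f}°C: dynamic {p_dyn:.2f} W (temperature-independent), "
          f"leakage {p_leak:.2f} W")
```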
  • Totally - Wednesday, December 25, 2013 - link

    Resistance increases with temperature -> Power draw increases P=(I^2)*R.
  • ShieTar - Thursday, December 26, 2013 - link

    The current isn't what's stabilized though; generally the voltage is, so P = U^2/R (rough numbers in the sketch below).

    " Because processors are all about pushing current to charge and discharge wire and gate capacitance. Lower resistance just means that happens faster."

    Basically correct; nevertheless, capacitor charging is asymptotic, and any IC optimised for speed will not wait for a "full" charge. The design baseline is probably the lowest charge level required for operation at the highest qualified temperature. Since decreasing temperature increases charging speed, as you pointed out, you will get to a higher charging ratio, and thus use more power.

    On top of that, the GPU is not exclusively transistors. There is power electronics, there are interconnects, there are caches, and who knows what else (not me). Now if the transistors pull a little more charge at the higher temperature, and the interconnects delivering that current have a higher resistance, then you get additional transmission losses. And that's on top of higher leakage rates.

    Of course the equation gets even more fun if you start considering the time constants of the interconnects themselves, which have become quite relevant since we got to 32nm structures, hence the high-K materials. Though I honestly have no clue how that contribution is linked to temperature.

    But hey, here's hoping that Ryan will go and investigate the power drop with his equipment and provide us with a full explanation. As I personally don't own a GPU that gets hot at idle (I can't force the fan below 30% by software and won't stop it by hand), I cannot test idle power behavior on my own. But I can and did repeat the FurMark test described in the link above, and I also see a power saving of about 0.5W per °C with my GTX 660. And that's based on internal power monitoring, so the mainboard/PCIe slot and the PSU should add a bit more to that:

    https://www.dropbox.com/s/javq0dg75u40357/Screensh...
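
For anyone who wants to put rough numbers on the two effects described in this comment, here's a minimal sketch; the voltage, resistances, capacitance, and timing below are arbitrary illustrative values, not measurements of any GPU.

```python
import math

# 1) With the supply voltage held constant, P = U^2 / R, so a few percent more
#    resistance means a few percent less power (the opposite of the P = I^2 * R
#    intuition, which assumes a fixed current).
u = 1.0                    # volts, arbitrary
for r in (1.00, 1.04):     # ohms; +4% resistance
    print(f"R = {r:.2f} ohm -> P = {u ** 2 / r:.3f} W")

# 2) RC charging is asymptotic: V(t) = U * (1 - exp(-t / (R * C))). With a fixed
#    time budget per cycle, the fraction of "full" charge reached depends on R,
#    which is the charging-ratio effect described above.
c = 1.0e-15                # farads, an arbitrary gate-sized capacitance
t = 1.0e-12                # seconds available to charge, arbitrary
for r in (1000.0, 950.0):  # ohms; interconnect resistance at two temperatures
    frac = 1 - math.exp(-t / (r * c))
    print(f"R = {r:4.0f} ohm -> charged to {frac:.1%} of full in {t * 1e12:.0f} ps")
```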
