Meet The EVGA GeForce GTX 670 Superclocked

Our second card of the day is EVGA’s GeForce GTX 670 Superclocked, which in EVGA’s hierarchy is their first tier of factory overclocked cards. EVGA bins GTX 670s and promotes some of them to this tier, which means GTX 670 Superclocked cards are equipped with generally better-performing chips than the average reference card.

GeForce GTX 670 Partner Card Specification Comparison

                         EVGA GeForce GTX 670 Superclocked   GeForce GTX 670 (Ref)
CUDA Cores               1344                                1344
Texture Units            112                                 112
ROPs                     32                                  32
Base Clock               967MHz                              915MHz
Boost Clock              1046MHz                             980MHz
Memory Clock             6210MHz                             6008MHz
Memory Bus Width         256-bit                             256-bit
Frame Buffer             2GB                                 2GB
TDP                      170W                                170W
Manufacturing Process    TSMC 28nm                           TSMC 28nm
Width                    Double Slot                         Double Slot
Length                   9.5"                                9.5"
Warranty                 3 Years                             N/A
Price Point              $419                                $399

For the GTX 670 SC, EVGA has given both the core clock and the memory clock a moderate boost. The base clock has been increased by 52MHz (6%) to 967MHz and the boost clock by 66MHz (7%) to 1046MHz, while the memory clock has been raised by 202MHz (3%) to 6210MHz.
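For readers who want to sanity-check those percentages, the math is straightforward. Below is a minimal, purely illustrative Python sketch that derives the deltas and relative gains; the clock figures come from the specification table above, and nothing here is measured data.

    # Clock deltas for the GTX 670 SC vs. the reference GTX 670.
    # Figures are from the specification table above, in MHz.
    clocks = {
        "base":   (915, 967),    # (reference, Superclocked)
        "boost":  (980, 1046),
        "memory": (6008, 6210),
    }

    for name, (ref, sc) in clocks.items():
        delta = sc - ref
        pct = 100.0 * delta / ref
        print(f"{name:>6}: +{delta}MHz ({pct:.1f}%)")

    # Output:
    #   base: +52MHz (5.7%)
    #  boost: +66MHz (6.7%)
    # memory: +202MHz (3.4%)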

Other than the clockspeed changes, the GTX 670 SC is an almost-reference card, utilizing the reference PCB with a slightly modified cooler. EVGA fabricates their own shroud, but they’ve copied NVIDIA’s reference shroud down to almost the last detail. The only functional difference is that the diameter of the fan intake is about 5mm smaller; beyond that, EVGA has simply detailed the shroud differently than NVIDIA and used rounded corners in place of square ones.

The only other change you’ll notice is that EVGA is using their own high flow bracket in place of NVIDIA’s bracket. The high flow bracket cuts away as much metal as possible, maximizing the area of the vents. Though based on our power and temperature readings, this doesn’t seem to have notably impacted the GTX 670 SC.

While we’re on the matter of customized cards and factory overclocks, it’s worth reiterating NVIDIA’s position on factory overclocked cards. Reference and semi-custom cards (that is, cards using the reference PCB) must adhere to NVIDIA’s power target limits. For GTX 670 this is a 141W power target, with a maximum power target of 122% (170W). Fully custom cards with better power delivery circuitry can go higher, but not semi-custom cards. As a result the flexibility in building semi-custom cards comes down to binning. EVGA can bin better chips and use them in cards such as the Superclocked – such as our sample which can go 17 boost bins over the base clock versus 13 bins for our reference GTX 670 – but at the end of the day for stock performance they’re at the mercy of what can be accomplished within 141W/170W.
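To illustrate what those bin counts mean in clockspeed terms, here is a minimal sketch. It assumes the roughly 13MHz-per-bin granularity commonly cited for Kepler’s GPU Boost; that bin size is our assumption for illustration, not an NVIDIA-published figure.

    # Rough illustration of Kepler GPU Boost bins.
    # ASSUMPTION: ~13MHz per boost bin (commonly cited for Kepler,
    # not an official NVIDIA specification).
    BIN_MHZ = 13

    def max_boost(base_mhz: int, bins: int) -> int:
        """Highest boost state reachable above the base clock, in MHz."""
        return base_mhz + bins * BIN_MHZ

    print(max_boost(967, 17))  # our GTX 670 SC sample: ~1188MHz
    print(max_boost(915, 13))  # our reference GTX 670: ~1084MHz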

In any case, as the card is otherwise a reference GTX 670, EVGA is relying on the combination of their factory overclock, their toolset, and their strong reputation for support to carry the card. EVGA has priced the card at $419, $20 over the GTX 670 MSRP and in line with other factory overclocked cards.

On the subject of pricing and warranties, as this is the first EVGA card we’ve reviewed since April 1st, now is a good time to go over the recent warranty changes EVGA has made.

On April 1st, EVGA implemented what they’re calling their new Global Warranty Policy. The policy is backdated to July 1st, 2011, so all EVGA cards sold since then carry at least a 3 year warranty. For the GTX 600 series specifically, EVGA has so far only offered models with a 3 year warranty in North America, which simplifies their product lineup.

To complement the 3 year warranty and make up for the lack of longer factory warranties, EVGA is now directly selling 2 and 7 year warranty extensions, bringing the total to 5 or 10 years respectively. So instead of choosing between a card with a 3 year warranty and one with a longer warranty, you’ll simply buy the 3 year card and then buy a warranty extension to go with it. The catch is that the card must be registered and the extended warranty purchased within 30 days.

The second change is that the base 3 year warranty no longer requires product registration. EVGA has other ways to entice buyers into registering, and they’ll now honor all applicable cards for 3 years regardless of registration status. At the same time, the base 3 year warranty is now a per-product (i.e. transferable) warranty rather than a per-user warranty, so it will transfer to second-hand buyers. The extended warranties, however, will not.

The third change is how EVGA will actually handle the warranty process. First and foremost, EVGA now allows cards to be sent to the nearest EVGA RMA office rather than the office for the region where the card was purchased. For example, a buyer moving from Europe to North America can send their card to EVGA’s North American offices rather than shipping it overseas.

Finally, alongside their existing Advanced RMA program, EVGA is now offering free cross shipping: EVGA will cross-ship a replacement card to the buyer at no charge, while the buyer remains responsible for paying to ship the faulty card back and for putting up collateral on the new card until EVGA receives the old one.

There’s also one quick change to the step-up program that will impact some customers. With the move to purchased warranty extensions, the step-up program is only available to customers who either purchase an extended warranty or buy an older generation card that comes with a lifetime warranty; it is not available for cards with only the base 3 year warranty.

Moving on, along with the new warranty EVGA is bundling the latest versions of their GPU utilities, Precision X and OC Scanner X.

Precision X, as we touched upon briefly in our GTX 680 review, is the latest iteration of EVGA’s Precision overclocking & monitoring utility. It’s still based on RivaTuner, and along with adding support for GTX 600 series features (power targets, framerate caps, etc.) it also introduces a new UI. Functionality-wise it’s still at the top of the pack, alongside the similarly RivaTuner-powered MSI Afterburner. Personally I’m not a fan of the new UI – circular UIs and sliders aren’t particularly easy to read – but it gets the job done.
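Precision X itself is closed source and built on RivaTuner, but to give a sense of the class of telemetry such tools poll, here is a minimal monitoring sketch using NVIDIA’s NVML library via the pynvml bindings. To be clear, this is an illustrative approximation of the monitoring side only, not how Precision X actually works.

    # Minimal GPU telemetry poller using NVML (pynvml bindings).
    # Illustrates the kind of data an overclocking/monitoring tool
    # surfaces; Precision X itself is RivaTuner-based, not NVML-based.
    import time
    import pynvml

    pynvml.nvmlInit()
    handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

    try:
        for _ in range(5):
            core = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_GRAPHICS)
            mem = pynvml.nvmlDeviceGetClockInfo(handle, pynvml.NVML_CLOCK_MEM)
            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            print(f"core: {core}MHz  mem: {mem}MHz  temp: {temp}C")
            time.sleep(1.0)
    finally:
        pynvml.nvmlShutdown()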

Gallery: EVGA X Tools

OC Scanner X has also received a facelift and functionality upgrade of its own. Along with its basic FurMark-ish stress testing and error checking, it now also offers a basic CPU stress test and GPU benchmark.

Comments (414)

  • chizow - Thursday, May 10, 2012 - link

    Hehe ya exactly.

    It seems as if many of the apologists willing to give AMD and Nvidia a pass on 28nm pricing are new to the GPU game, or the tech toy game for that matter. They just have no historical perspective at all, which I'm sure thrills the marketing/finance guys over in Silicon Valley... they can't sink their meathooks into these guys fast enough.

    But yeah, it's not about being able to afford it, it's about being able to buy them and actually feel good about the purchase looking back a week, a month, a year from now. Most people only need to be burned once to learn their lesson; hopefully those early adopters who bought 7970/7950 and GTX 680/690 have learned theirs.
  • CeriseCogburn - Friday, May 11, 2012 - link

    Hehe, exactly. Read above; you're wrong. Prove otherwise or shut up. Calling everyone else stupid when you have presented ZERO EVIDENCE doesn't cut it.
  • chizow - Friday, May 11, 2012 - link

    Zero evidence? Try 10 years of GPU benchmarks. Seriously, try looking at some before commenting, because it's obvious you haven't paid close enough attention in the past....
  • SlyNine - Saturday, May 12, 2012 - link

    You never provide any valid evidence. But this topic has been debated and historical data is all the proof you need.
  • SlyNine - Saturday, May 12, 2012 - link

    I didn't want to spend $500, but I did want something 2x as fast as my 5870. So the GTX 680 fit the bill.

    But honestly, I wouldn't expect cards to keep evolving at the same rate. Cards used more slots and more power to keep doubling and tripling in performance, and that trend cannot go on for long because there aren't enough slots and power to sustain it.

    I fully expect all performance increases now to be from architecture improvements and node changes.
  • CeriseCogburn - Friday, May 11, 2012 - link

    You guys can claim anything you want with your bland, data-absent talking point, so let's examine just how far out of sane bounds you two are (you and chizow) - and BTW I'd appreciate the reviewer's talking point as well. A full quote will be fine.

    Let's skip the insane retentiveness of the fancy specific wording you've used as a ruse, taken absolutely literally in the hope that anyone not noticing a perfectly literal and strictly limited reading would be fooled by the idea presented, and do a cursory examination:

    We can start with the G80 - it morphed into the G92 and G92b, which all you slam artists screamed were rebranded absolute clones.

    So we'll take the 9800GTX+ vs. the next released card, the GTX280.
    GTX280 morphed into GTX285.
    We can move from the GTX285 to the GTX480 - the GTX480 morphed into the GTX580.
    So we move from GTX580 to GTX680.

    Although I have not gone strictly literal on the insane talking point ruse and used the sort of CHEAT you people espouse with your lousy new-nm-plus-new-die talking point, what I have is what people actually EXPERIENCED AS CARD RELEASES - so we'll have to go with those.

    9800GTX+ to GTX280 (wow, that gigantic upgrade)

    GTX285 to GTX480 (wow, that gigantic upgrade)

    GTX580 to GTX680 (wow, that gigantic upgrade)

    Yes, you people are full of it. That's why you keep AVOIDING any real information, figuring that if you could spew just the talking point, no one would have to notice what lying crap it is.
  • chizow - Friday, May 11, 2012 - link

    Once again, your arguments are full of fail, or you simply don't know how to read simple benchmarks. Using your own flawed comparisons, you would see:

    9800GTX+ to GTX280 (wow, that gigantic upgrade): +70% OR MORE

    GTX285 to GTX480 (wow, that gigantic upgrade): +60% OR MORE

    GTX580 to GTX680 (wow, that gigantic upgrade): +30%......

    The reason your comparison is flawed, however, is that you are comparing half-generations: when you compare a refresh to a new generation, the gap in both time and performance is diminished, which decreases the value for your $$$.

    Correct comparisons are as follows, and when you look at it that way, the GTX 680 and all the other 28nm parts look EVEN worse in retrospect:

    8800GTX to GTX 280: +75% OR MORE
    GTX 280 to GTX 480: +80% OR MORE
    GTX 480 to GTX 680: +40%.....

    or, if you prefer refresh to refresh with a full generation between them:

    9800GTX+ to GTX 285: +75% or MORE
    GTX 285 to GTX 580: +80% or MORE
    GTX 580 to GTX 685???: ???

    Seriously, just read some benchmarks and then come back, because it seems you're the only one who doesn't get it.
  • CeriseCogburn - Saturday, May 12, 2012 - link

    For shame, for shame - more lies... no wonder you're yelling, and you NEVER used benchmarks....
    Let's use Anand's historical data!

    And let's do it correctly. We go from the card we have now to the card they release. People now have the GTX580 - and that's what they see in the charts as you whine, bitch and moan and spread your Charlie D. FUD. Likewise with former tier jumps/releases.
    So we will use the TRUTH, not some anal retentive abject cheat worded just so, as you two fools suggest, to spin your lies ever more "in your favor".

    9800GTX+ to GTX280, Crysis, 25fps to 34fps

    http://www.anandtech.com/show/2549/11

    There it is, chizow, and it ain't no 75%! NOT EVEN CLOSE.

    GTX285 to GTX480, Crysis, 30fps to 44fps

    http://www.anandtech.com/show/2977/nvidia-s-geforc...

    Guess you lost on that 80% lie there, too.

    GTX580 to GTX680, 41fps to 48fps

    http://www.anandtech.com/show/5699/nvidia-geforce-...

    NOPE. Certainly not half of the former two moves, with NONE at any 80%, let alone 75%, not even 50%; you can't even say 33% increase, EVER.

    Sorry chizow, your lies, and big ones at that, won't do.
  • SlyNine - Saturday, May 12, 2012 - link

    You're cherry-picking. A huge fallacy. Some benchmarks do show 75%+.

    Plus we are talking about the 8800GTX to the GTX280. We are not talking about rebadged products with very minor changes.
  • CeriseCogburn - Sunday, May 13, 2012 - link

    ROFL - I used what was first in line, I provided the information - I BROKE THE GIANT LIE that you, the AMD fanboy, have used with ZERO EVIDENCE.

    Let's face it, I'm 100% more accurate than you ever attempted to be, since you merely spewed your talking point in as big a fat-fib fashion as you could muster.

    Of course that's the usual crap from the liars.
    Now you'll just whine that the facts I presented, versus the no facts you ever presented or even linked to, "don't count".

    ROFL - loser (what else do you expect me to do, man - you're making it very difficult to support your opinion, guy)
