A little more than a year ago NVIDIA introduced the GF110 GPU, the power-optimized version of their Fermi patriarch, GF100. The first product was their flagship GTX 580, followed shortly thereafter by the GTX 570. Traditionally NVIDIA would follow this up with a 3rd product: the GTX 200 series had the GTX 285/275/260, and the GTX 400 series had the GTX 480/470/465. In the past year, however, we never saw a 3rd tier GF110 card… until now.

Today NVIDIA will be launching the GeForce GTX 560 Ti With 448 Cores (and yes, that’s the complete name), a limited edition product that will serve as the 3rd tier product, at least for a time. And while NVIDIA won't win any fans with the name, the performance is another matter entirely. If you've ever wanted a GTX 570 but didn't want to pay the $300+ price tag, as we'll see NVIDIA has made a very convincing argument that this is the card for you.

| | GTX 580 | GTX 570 | GTX 560 Ti w/448 Cores | GTX 560 Ti |
|---|---|---|---|---|
| Stream Processors | 512 | 480 | 448 | 384 |
| Texture Address / Filtering | 64/64 | 60/60 | 56/56 | 64/64 |
| ROPs | 48 | 40 | 40 | 32 |
| Core Clock | 772MHz | 732MHz | 732MHz | 822MHz |
| Shader Clock | 1544MHz | 1464MHz | 1464MHz | 1644MHz |
| Memory Clock | 1002MHz (4008MHz data rate) GDDR5 | 950MHz (3800MHz data rate) GDDR5 | 900MHz (3600MHz data rate) GDDR5 | 1002MHz (4008MHz data rate) GDDR5 |
| Memory Bus Width | 384-bit | 320-bit | 320-bit | 256-bit |
| Frame Buffer | 1.5GB | 1.25GB | 1.25GB | 1GB |
| FP64 | 1/8 FP32 | 1/8 FP32 | 1/8 FP32 | 1/12 FP32 |
| Transistor Count | 3B | 3B | 3B | 1.95B |
| Manufacturing Process | TSMC 40nm | TSMC 40nm | TSMC 40nm | TSMC 40nm |
| Price Point | $489 | $329 | $289 | $229 |

The GTX 560 Ti With 448 Cores is based on the same GF110 GPU as the GTX 580 and GTX 570. Where GTX 580 is a fully enabled GF110 product and GTX 570 is a partially binned part, the GTX 560 Ti With 448 Cores – which we’ll refer to as the GTX 560-448 for simplicity’s sake – is a further binned GF110 intended to take the position of the traditional 3rd tier product, putting it below the GTX 570.

Looking at the organization of the GF110 being used in the GTX 560-448, the difference between the GTX 570 and GTX 560-448 is that NVIDIA has disabled a further SM unit, cutting the compute/shading, texturing, and geometry performance by 7%. ROP performance remains untouched, as does the number of memory controllers. The core clock is the same as the GTX 570's at 732MHz, while the memory clock has been reduced slightly from 950MHz (3800MHz data rate) to 900MHz (3600MHz data rate). Altogether, compared to the GTX 570 the GTX 560-448 has 93% of the compute/shader performance, 100% of the ROP performance, and 95% of the memory bandwidth. In practice this puts it much closer to the GTX 570 than the larger product spacing we're used to seeing.
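Those percentages fall straight out of the spec table; as a quick sanity check, here is a back-of-the-envelope sketch (unit counts × clocks, taken from the table above):

```python
# Back-of-the-envelope throughput ratios for the GTX 560-448 vs. the GTX 570,
# using unit counts and clock speeds from the spec table above.

def throughput_ratio(units_a: int, clock_a: int, units_b: int, clock_b: int) -> float:
    """Relative throughput of configuration A vs. configuration B."""
    return (units_a * clock_a) / (units_b * clock_b)

# Shader throughput: 448 vs. 480 CUDA cores, both at a 1464MHz shader clock
shader = throughput_ratio(448, 1464, 480, 1464)

# ROP throughput: 40 ROPs at a 732MHz core clock on both cards
rop = throughput_ratio(40, 732, 40, 732)

# Memory bandwidth: 320-bit bus on both cards, 3600MHz vs. 3800MHz data rate
bandwidth = throughput_ratio(320, 3600, 320, 3800)

print(f"shader: {shader:.0%}, ROP: {rop:.0%}, bandwidth: {bandwidth:.0%}")
# shader: 93%, ROP: 100%, bandwidth: 95%
```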

Power and cooling are also very similar to the GTX 570. NVIDIA has put the TDP at 210W, versus 219W for the GTX 570. As always NVIDIA does not supply an idle TDP, but it should be practically identical to the GTX 570. The end result is that the GTX 560-448 should have slightly lower performance than the GTX 570 with similar power consumption.

Now if we’re making all of these comparisons to the GTX 570, why is the GTX 560-448 a GTX 560? That’s a good question, and not one that we’ll get a completely satisfactory answer to. NVIDIA is well aware of what they’ve done, and they’ve already prepared a response:

Question: Why is the “GTX 560 Ti” designation used for this product instead of “565” or “570 LE”?
Answer: The designation is meant to reflect the fact that this is not an addition to our 500 series line-up, but rather a limited edition product.

This is a completely truthful answer – and we’ll get to the limited edition aspect in a moment – but it’s not a real answer to the question. Ultimately NVIDIA has to balance OEM, consumer, and regional concerns since not every market will be getting this product, but more practically the GTX 560 Ti is a well-received and well-selling card whose success NVIDIA wants to extend. The result is that NVIDIA can (and will) call it whatever they want, and this time they’re calling it a GTX 560 Ti. Thus we have a GF110 product launching as a GTX 560 Ti even though it has more in common with a GTX 570 than anything else. It’s that kind of a launch.

As far as being a limited edition product, that’s not particularly complex.  NVIDIA bins GF110 GPUs for a number of products, not just GeForce but for Tesla and Quadro too. The best chips go into the most expensive products, while chips with several bad SMs go into products like low-end Quadros and NVIDIA’s 4th tier OEM only card – which is also the GTX 560 Ti. In the past year of production NVIDIA has built up a supply of mid-tier chips: chips that aren’t good enough to be in a GTX 570, but better than what the lower end markets need. Rather than taking a revenue hit by shipping these chips in those lower end products, NVIDIA has decided to mint a new GeForce product instead, and that’s the GTX 560-448.

The reason the GTX 560-448 is a limited edition product is that NVIDIA does not accumulate suitably dysfunctional chips at the same rapid pace as it does chips for its other product lines. As a result they only have a small, largely fixed number of chips to produce GTX 560-448s with. With this limited supply NVIDIA will only be chasing particularly affluent markets with a limited number of cards: the US and Canada, the UK, France, Germany, the Nordic countries, and Russia. South America and the Asia-Pacific region (APAC) are notably absent. Furthermore, for those markets that will be getting the GTX 560-448 it’s essentially a seasonal product for Christmas: NVIDIA only expects the supply of cards to last 1-2 months, after which NVIDIA’s product lineup reverts to the 580/570/560 stack we’re already accustomed to. So while a limited edition product is nothing new, we haven’t seen a coordinated launch for an LE product quite like this in recent years.

Given the hardware similarities to the GTX 570, it should come as no surprise that NVIDIA is forgoing a reference design while their partners will be launching cards based on their existing GTX 570 designs. At this point all of them have custom GTX 570 designs, and as such the GTX 560-448 cards will be using those custom designs. The card we’ve been sampled with, Zotac’s GeForce GTX 560 Ti 448 Cores Limited Edition, is one such card, based on their custom GTX 570 design. Furthermore as was the case with many proper GTX 560 Ti cards, the GTX 560-448 will be launching in overclocked designs, such as Zotac’s which ships at 765MHz instead of 732MHz. So the performance of individual GTX 560-448 products can vary by upwards of several percent.
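That spread is easy to put a number on; for instance, Zotac's 765MHz factory clock works out to roughly a 4.5% core overclock over the 732MHz reference clock:

```python
# Core overclock of Zotac's GTX 560-448 relative to NVIDIA's reference clock.
reference_clock_mhz = 732  # stock GTX 560-448 core clock
zotac_clock_mhz = 765      # Zotac's factory overclock

uplift = zotac_clock_mhz / reference_clock_mhz - 1
print(f"Factory overclock uplift: {uplift:.1%}")
# Factory overclock uplift: 4.5%
```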

The MSRP on the GTX 560-448 will be $289, however launch partners will be free to price it higher to match any factory overclocks they do. At $289 the GTX 560-448 is priced extremely close to the cheapest GTX 570s, and depending on clockspeeds and sales a GTX 570 could end up being the same price or cheaper, so it will be prudent to check prices. Meanwhile the GTX 560-448’s closest competition from AMD will be the Radeon HD 6950, which trends around $250 after rebate while the Radeon HD 6970 is still closer to $340. Overall NVIDIA’s pricing may be a bit high compared to their other products, but compared to AMD’s products it’s consistent with the performance.

Meet The Zotac GeForce GTX 560 Ti 448 Cores Limited Edition
Comments

  • Transmitthis14 - Tuesday, November 29, 2011 - link

    Many would ignore "Value" like that.
    I use Premiere and Pshop (CUDA acceleration), and also like the option of PhysX with my games when coded for.
    So a 6950 then becomes a non-option
  • Exodite - Wednesday, November 30, 2011 - link

    Fringe benefits like those swing both ways though.

    I need three simultaneous display outputs, which renders Nvidia options redundant in much the same way.

    Looking strictly at price/performance, which is the metric most consumers would use in this case, the 2GB 6950 is indeed an excellent option.
  • grammaton.feather - Wednesday, November 30, 2011 - link

    Oh really?

    http://www.overclock.net/t/909086/6950-shaders-unl...

    I also agree with the CUDA argument. I have used Nvidia for years because of stereoscopic 3d support, more recently CUDA and PhysX and because I don't have driver crashes with Nvidia.

    ATI may be better value for money at the cheap end of the spectrum, so long as you don't mind driver crashes.

    A friend who used his high end PC for business had 170 plus ATI driver crashes logged by windows. I chose an Nvidia GPU for him and the crashing stopped.
  • rum - Wednesday, November 30, 2011 - link

    I have had nothing but ATI cards since the 3870, and have NOT experienced these driver crashes you are speaking of.

    I use my computers for gaming and computer programming, and sometimes I do actually stress my system. :)

    I am in the process of evaluating my video needs, as I do need a new video card for a new PC. I won't blindly go with ATI, I will EVALUATE all aspects and get the best card for the MONEY.

    I don't need cutting edge, neither do most people, and we do evaluate things on most bang for the buck.
  • VoraciousGorak - Wednesday, November 30, 2011 - link

    Funny, since I Fold on a GTX275, GTX460, and 9800GT on two different systems, and pretty often have NVIDIA's drivers lose their minds when finishing a WU, shutting down the system, or while simply processing a WU. I've tried multiple WHQL and beta drivers and settled with these being the least of my problems (the worst being total system unresponsiveness while Folding and a hard reset after a client stopped.) However, when I folded on my 6950 (granted, different client, but I Folded on it with the Beta client too) it never had a driver freakout, screen blank, or even a flicker when turning on or shutting off WUs.

    It seems to me that people that have major driver issues with AMD (not ATI anymore, by the way) cards are outliers, relative novices, and/or are using very outdated drivers or are recounting stories back from when the ATI 9700 series were new. I'm probably an outlier as well with my NVIDIA gripes, but characterizing a major modern GPU manufacturer with constant widespread driver issues or stating that any GPU is guaranteed to have problems is just silly.
  • silverblue - Thursday, December 01, 2011 - link

    Oddly, those people who slate AMD are the ones who never ever seem to have an NVIDIA crash, or at least won't admit to one.
  • Gorghor - Thursday, December 01, 2011 - link

    I know this has no technical relevance, but I thought I'd share since my first reaction when I saw the article title, was "did I make a mistake buying an ATI6950?"

    For some reason I have been sticking to nVidia cards since the days of the Geforce 2. I made an exception with the ATI 9800 and although the performance was good back then, I had many problems with the drivers and really didn't approve of the catalyst control center.

    When I upgraded my mainboard to the Sandy-bridge generation, I left my "old" GTX285 in the system and had several daily crashes related to the nVidia drivers (even after several months and driver updates).

    About a month ago I decided it was time for me to upgrade the graphics. I wanted to stick with nVidia despite the problems with my GTX285, but the price/performance ratio was just unacceptable when compared to AMD cards. I chose the ATI6950 2GB considering the same money would barely get me a GTX560.

    I have to say I'm impressed. The performance is truly exceptional, even with no shader unlock, the catalyst interface has really matured over the years and my crashes are completely gone. So as far as I'm concerned, blindly going for nVidia is not an option anymore.
  • Golgatha - Tuesday, November 29, 2011 - link

    What I take away from this review is that the 560-448 is pretty much on par with a stock 570, but it will only be manufactured for roughly 2 months and then be discontinued. Why would any gamer buy this card when they can get a superior card within $20 of this one and pretty much be guaranteed availability if they want to run SLI? Also, what's with the power connectors on the rear of the card? All of these long cards should situate those connectors on the top of the card.
  • jigglywiggly - Tuesday, November 29, 2011 - link

    I'd avoid it; you will probably be screwed with drivers in the long run.
  • Samus - Tuesday, November 29, 2011 - link

    As far as drivers and modern games go, people with ATI hardware have been having substantially more problems in Battlefield 3. Then there's the issue of microstutter, which barely affects nVidia hardware, for what can only be presumed to be an ATI driver problem. Lastly, many people have experienced PowerPlay problems with Radeon 4000-5000 series cards running newer drivers due to clockspeed changes in-game, sometimes causing BSODs, but always causing performance problems.

    nVidia has been, and always will be, the king of drivers. The days of Mach64 still plague ATI's driver development team. nVidia invented unified drivers and changed the game for the industry. They update their SLI profiles more than twice as often as ATI does for Crossfire, which alone shows their loyalty to their SLI users.

    ATI has the hardware engineering team. That's clear. They can produce faster cards that use less power at lower prices. But you will NOT get superior drivers by a longshot.
