POST A COMMENT

80 Comments

Back to Article

  • Jamahl - Tuesday, November 29, 2011 - link

    In your 560 Ti review you said that it was "a bit faster" than the 6950. What's changed? Maybe AMD's drivers are helping to pull the card away because it's clearly ahead here with the same games being tested.

    http://www.anandtech.com/show/4135/nvidias-geforce...

    "The GTX 560 Ti ultimately has the edge: it’s a bit faster and it’s quieter than the 6950"

    Perhaps you should do an article on that one? You know you were one of the very few sites on the web that actually found the 560 Ti to be faster than the 6950 in the first place.

    I wonder why that was.
    Reply
  • Ryan Smith - Tuesday, November 29, 2011 - link

    If you haven't already, I'd invite you to take a look at Bench, our online benchmark database. The video card numbers are periodically revised for newer articles, which is what you're seeing here.

    The latest data we have for the 6950 vs. the GTX 560 Ti: http://www.anandtech.com/bench/Product/293?vs=330
    Reply
  • Jamahl - Wednesday, November 30, 2011 - link

    Glad to see you are keeping those updated and thanks for the reply.

    My point was, what happened to the 560 Ti's lead from your initial review? Looking at that bench now the 6950 is a good bit ahead.

    Drivers?
    Reply
  • Ryan Smith - Wednesday, November 30, 2011 - link

    Yes, I'd say that's a fair assessment. Looking at 1920 between January and November:

    Crysis: 48.6->51.4
    BF: 58.3->68.9
    HAWX: 108->119
    CivV: 34.8->40.1
    BC2: 61.8->69.2
    Etc.

    Note that the 560 Ti launched only a month after the 6900 series, so AMD had only a short amount of time to optimize their 6900 drivers between the 6900 launch and then, whereas they've had another 10 months since to work on their drivers further. Given the similarities between VLIW4 and VLIW5, if you had asked me for my expectations 10 months ago, this is actually more than I thought AMD would get out of optimizations.

    Meanwhile the 560 Ti has shifted very little in comparison, which is not surprising since the Fermi-lite architecture had been around for over half a year by that point.

    The 560 Ti and 6950 still trade blows depending on the game in a very obvious way, but the 6950 is now winning more games and on a pure numerical average is clearly doing better.
    Reply
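For what it's worth, the per-game improvements Ryan quotes work out to gains of roughly 6% to 18%; a quick sketch of the arithmetic (FPS figures copied from the comment above, game labels as abbreviated there):

```python
# HD 6950 average FPS at 1920, January vs. November drivers,
# as quoted in the comment above.
gains = {
    "Crysis": (48.6, 51.4),
    "BF": (58.3, 68.9),
    "HAWX": (108, 119),
    "CivV": (34.8, 40.1),
    "BC2": (61.8, 69.2),
}

for game, (jan, nov) in gains.items():
    pct = (nov - jan) / jan * 100
    print(f"{game}: {jan} -> {nov} FPS (+{pct:.1f}%)")
# Gains range from about +5.8% (Crysis) to about +18.2% (BF).
```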
  • Jamahl - Friday, December 02, 2011 - link

    Yep, that looks like a pretty fair assessment. I was surprised to see the gap open up so clearly. Reply
  • pixelstuff - Tuesday, November 29, 2011 - link

    Are these all of the crappy GF110 processors that had manufacturing defects? Reply
  • Ryan Smith - Tuesday, November 29, 2011 - link

    Correct. Technically speaking NVIDIA could take perfectly good GF110 GPUs and still make products like this, but it wouldn't make any sense for them to do so. All of these cards would be using GF110 GPUs with 2 defective SMs. Reply
  • Duwelon - Wednesday, November 30, 2011 - link

    Your image shows 3 defective SMs. At least I'm assuming it's supposed to be the "new" chip. Reply
  • Ryan Smith - Wednesday, November 30, 2011 - link

    Indeed, that was a diagram error on my part. It's been fixed. Reply
  • piroroadkill - Tuesday, November 29, 2011 - link

    I'd still regard the 6950 2GB as the best value proposition card, and it has been ever since the launch of the card almost a year ago.

    Even though I only bought one recently and had heard the extra shaders had been lasered off, that thankfully proved wrong: one BIOS update later, I have a 6970.

    You can't ignore value like that.
    Reply
  • Transmitthis14 - Tuesday, November 29, 2011 - link

    Many would ignore "Value" like that.
    I use Premiere and Pshop (CUDA acceleration), and I also like the option of PhysX with my games when it's coded for.
    So a 6950 then becomes a non-option
    Reply
  • Exodite - Wednesday, November 30, 2011 - link

    Fringe benefits like those swing both ways though.

    I need three simultaneous display outputs, which renders Nvidia options redundant in much the same way.

    Looking strictly at price/performance, which is the metric most consumers would use in this case, the 2GB 6950 is indeed an excellent option.
    Reply
  • grammaton.feather - Wednesday, November 30, 2011 - link

    Oh really?

    http://www.overclock.net/t/909086/6950-shaders-unl...

    I also agree with the CUDA argument. I have used Nvidia for years because of stereoscopic 3d support, more recently CUDA and PhysX and because I don't have driver crashes with Nvidia.

    ATI may be better value for money at the cheap end of the spectrum, so long as you don't mind driver crashes.

    A friend who used his high end PC for business had 170 plus ATI driver crashes logged by windows. I chose an Nvidia GPU for him and the crashing stopped.
    Reply
  • rum - Wednesday, November 30, 2011 - link

    I have had nothing but ATI cards since the 3870, and have NOT experienced these driver crashes you are speaking of.

    I use my computers for gaming and computer programming, and sometimes I do actually stress my system. :)

    I am in the process of evaluating my video needs, as I do need a new video card for a new PC. I won't blindly go with ATI; I will EVALUATE all aspects and get the best card for the MONEY.

    I don't need cutting edge, neither do most people, and we do evaluate things on most bang for the buck.
    Reply
  • VoraciousGorak - Wednesday, November 30, 2011 - link

    Funny, since I Fold on a GTX275, GTX460, and 9800GT on two different systems, and pretty often have NVIDIA's drivers lose their minds when finishing a WU, shutting down the system, or while simply processing a WU. I've tried multiple WHQL and beta drivers and settled with these being the least of my problems (the worst being total system unresponsiveness while Folding and a hard reset after a client stopped.) However, when I folded on my 6950 (granted, different client, but I Folded on it with the Beta client too) it never had a driver freakout, screen blank, or even a flicker when turning on or shutting off WUs.

    It seems to me that people that have major driver issues with AMD (not ATI anymore, by the way) cards are outliers, relative novices, and/or are using very outdated drivers or are recounting stories back from when the ATI 9700 series were new. I'm probably an outlier as well with my NVIDIA gripes, but characterizing a major modern GPU manufacturer with constant widespread driver issues or stating that any GPU is guaranteed to have problems is just silly.
    Reply
  • silverblue - Thursday, December 01, 2011 - link

    Oddly, those people who slate AMD are the ones who never ever seem to have an NVIDIA crash, or at least won't admit to one. Reply
  • Gorghor - Thursday, December 01, 2011 - link

    I know this has no technical relevance, but I thought I'd share, since my first reaction when I saw the article title was "did I make a mistake buying an ATI6950?"

    For some reason I have been sticking to nVidia cards since the days of the Geforce 2. I made an exception with the ATI 9800 and although the performance was good back then, I had many problems with the drivers and really didn't approve of the catalyst control center.

    When I upgraded my mainboard to the Sandy-bridge generation, I left my "old" GTX285 in the system and had several daily crashes related to the nVidia drivers (even after several months and driver updates).

    About a month ago I decided it was time for me to upgrade the graphics. I wanted to stick with nVidia despite the problems with my GTX285, but the price/performance ratio was just unacceptable when compared to AMD cards. I chose the ATI 6950 2GB, considering the same money would barely get me a GTX560.

    I have to say I'm impressed. The performance is truly exceptional, even with no shader unlock, the catalyst interface has really matured over the years and my crashes are completely gone. So as far as I'm concerned, blindly going for nVidia is not an option anymore.
    Reply
  • Golgatha - Tuesday, November 29, 2011 - link

    What I take away from this review is that the 560-448 is pretty much on par with a stock 570, but it will only be manufactured for roughly 2 months and then be discontinued. Why would any gamer buy this card when they can get a superior card within $20 of this one, and pretty much be guaranteed availability if they want to run SLI? Also, what's with the power connectors on the rear of the card? All of these long cards should situate those connectors on the top of the card. Reply
  • jigglywiggly - Tuesday, November 29, 2011 - link

    I'd avoid it; you will probably be screwed with drivers in the long run. Reply
  • Samus - Tuesday, November 29, 2011 - link

    As far as drivers and modern games go, people with ATI hardware have been having substantially more problems in Battlefield 3. Then there's the issue of microstutter, which barely affects nVidia hardware, for what can only be presumed to be an ATI driver problem. Lastly, many people have experienced PowerPlay problems with Radeon 4000-5000 series cards running newer drivers, due to clockspeed changes in-game sometimes causing BSODs, but always causing performance problems.

    nVidia has been, and always will be, the king of drivers. The days of the Mach64 still plague ATI's driver development team. nVidia invented unified drivers and changed the game for the industry. They update their SLI profiles more than twice as often as ATI does for Crossfire, which alone shows their loyalty to their SLI users.

    ATI has the hardware engineering team. That's clear. They can produce faster cards that use less power at lower prices. But you will NOT get superior drivers by a longshot.
    Reply
  • Marlin1975 - Tuesday, November 29, 2011 - link

    hahahha.... spoken like a true nvidia employee/spokesman/shill. Reply
  • Mathieu Bourgie - Tuesday, November 29, 2011 - link

    "Then the issue of microstutter arises which barely affects nVidia hardware, for what can only be presumed to be a ATI driver problem."

    That's rubbish, Tech Report and Tom's Hardware articles covering micro-stuttering clearly proved that micro-stuttering is an issue on both AMD and Nvidia video cards.

    While I'll agree that on average, Nvidia drivers tend to be less problematic, the quality of AMD drivers has improved over the years, aren't as problematic as they used to be and aren't nearly as far behind Nvidia drivers as you paint it.
    Reply
  • greylica - Tuesday, November 29, 2011 - link

    It's another card based on Ferm(ented) hardware. I will pass on all of the GTX 4XX and 5XX cards, waiting for cards that respect their own specifications in OpenGL, without failures like the well-known Nvidia problem with GLReadPixels... Reply
  • Alka - Tuesday, November 29, 2011 - link

    Hey, I've seen your picture before, Mathieu. Aren't you the guy who steals Tom's Hardware's best-graphics-cards-for-the-month format and content? You pass it off as your own on your personal blog, right? Reply
  • Mathieu Bourgie - Tuesday, November 29, 2011 - link

    Stealing from their content?

    If you paid attention to the content of my article and compared it to the content of Tom's Hardware's article, you'd notice that my recommendations tend to be quite different from theirs and that they are backed by factual performance numbers coming from various sources, including AnandTech. I also tend to publish my monthly updates before Tom's Hardware does.

    Stealing "their" format?

    It's like saying that PC makers are copying Apple's ideas because "Apple did it first," when Apple didn't do it first. Apple took an existing idea and improved on it, and then, when some PC makers decided to have their own take on the idea, Apple fanboys cried that PC makers were copying "Apple's idea".

    In the past, many other websites covering various topics did "value comparison" articles, and did so before Tom's Hardware did theirs. Guess what? Such "value comparison" articles existed before the Internet was mainstream.

    The notion that I'm stealing "their" format is as ridiculous as saying that website B, which reviewed the latest CPUs, stole website A's format because website A was the first to review the latest CPUs.
    Reply
  • Alka - Tuesday, November 29, 2011 - link

    Dude, you've directly copied and pasted sections of their article without giving credit. That's beyond your own take on a value comparison.

    Do you really want me to post specific examples of your plagiarism?
    Reply
  • Mathieu Bourgie - Tuesday, November 29, 2011 - link

    Alka,

    While I wouldn't be surprised if our articles share some similarities, including similar recommendations (which is to be expected, since our articles cover the same products on the market), accusing me of directly copying and pasting sections of their article without giving credit takes your comment to a whole other level of disrespect, and makes me wonder what your goal here is, if not simply trolling...

    "Do you really want me to post specific examples of your plagiarism? "
    While I'd have no problem listening to you, and perhaps making some correction(s) to my article if it looks as though someone might think I copied content from Tom's Hardware's article, I would rather not do this in the comments section at AnandTech, out of respect for Ryan, Anand and the rest of AnandTech's team, whose articles I truly enjoy; I wouldn't want to add a bunch of unrelated comments to their great articles.

    I invite you to contact me via:
    - The contact form on my website
    - Facebook
    - Twitter
    - Google my name
    - Etc.

    You have plenty of ways to reach me if you want to further discuss this, without filling this comment section with comments unrelated to this article and without bothering everyone else here.

    Thanks,
    Mathieu
    Reply
  • Alka - Wednesday, November 30, 2011 - link

    No point, you've clearly convinced yourself that shameless plagiarism is acceptable without giving proper credit. Reply
  • JonnyDough - Thursday, December 01, 2011 - link

    Regardless of any fanboyism here, or whether or not you steal text,

    "That's rubbish, Tech Report and Tom's Hardware articles covering micro-stuttering clearly proved that micro-stuttering is an issue on both AMD and Nvidia video cards."

    That is truth.
    Reply
  • formulav8 - Tuesday, November 29, 2011 - link

    One of the most useless and fannish posts on here. Reply
  • Assimilator87 - Tuesday, November 29, 2011 - link

    Bad drivers are the reason I'm switching back to ATi next round. Six months to fix a very annoying F@H issue is not acceptable in my book. Reply
  • Zed03 - Tuesday, November 29, 2011 - link

    Just keep in mind this review is from the same people who claimed the GTX 560 Ti is faster than the 6950.

    All of these charts were selected to make the GTX 560 448 look good.

    On the Crysis, BattleForge, and Metro pages, when the 6970 has a 30% lead, they call it:

    "With Metro 2033 we see AMD and NVIDIA swap positions again, this time leaving AMD’s lineup with the very slight edge."

    On the remainder of the review, when the GTX 560 448 performs 5% faster in a certain configuration, the performance is "dominating".

    Wait for a real benchmark.
    Reply
  • Mstngs351 - Wednesday, November 30, 2011 - link

    You may want to drop your biased view of Anandtech and pay closer attention to the article.

    "With Metro 2033 we see AMD and NVIDIA swap positions again, this time leaving AMD’s lineup with the very slight edge." You'll notice that it states "lineup" not 560ti 448.

    You would also do well to notice that when it came to Metro 2033, the 580 was top dog at all resolutions, so the statement that AMD has a "slight edge" is, in my opinion, generous.
    Reply
  • DaveLessnau - Tuesday, November 29, 2011 - link

    This is purely a comment about the editing of the article. I don't know if it's a function of your word processing software or if it's something you did on purpose, but ordinal numbers like 1st, 2nd, 3rd, etc., don't superscript the alpha part. Every time I ran across your superscripted ordinals, my eyes just locked onto the word and I lost track of the article flow. If at all possible, please fix that. Reply
  • DaveLessnau - Tuesday, November 29, 2011 - link

    Oh, and I know that Word superscripts ordinals, too. But, it doesn't put the baseline of the superscript above the top line of the regular characters. Reply
  • Ryan Smith - Wednesday, November 30, 2011 - link

    You're correct, I am using Word to write the text for these articles. Our CMS converts Word superscript to <sup> tags, which is why you're seeing the final result, since HTML4 says that superscript "appears half a character above the baseline". I do like having superscript, but you make a good point in that being above the baseline is annoying, so I'll go take a look at it. Reply
  • siberian3 - Tuesday, November 29, 2011 - link

    Who cares about such a product this late in the game, when we are waiting for Radeon HD7xxx and NV GTX6xx cards in a few months?
    Reply
  • Performance Fanboi - Tuesday, November 29, 2011 - link

    If they were going with something that clumsy they should have just gone with GTX560-ClearingSiliconinAdvanceofKepler-448

    Seriously though, GTO560 for a so-called limited card at the end of the product cycle would have fit better, e.g. the 7900GTO when they were clearing out G70.
    Reply
  • nevertell - Tuesday, November 29, 2011 - link

    Have you tried unlocking the locked off cores ? This was the only incentive to buy a 465 back in the day. That's what I was hoping for anyway, when I first heard about nvidia releasing another GF100 based product. Reply
  • Leyawiin - Tuesday, November 29, 2011 - link

    It's a very good card for a decent price if you can't wait for the next gen from both companies. Kind of amusing, all the withering comments and hair-splitting over review game choices. It's a better card than the HD 6950. Period. Give it the due it deserves. Reply
  • Marlin1975 - Tuesday, November 29, 2011 - link

    I can get a 6950 for $200 AR right now. The 560-448 is going for the low $300 range right now.

    Unless it gets 50% more frames/performance it is not better than a 6950.

    http://www.newegg.com/Product/Product.aspx?Item=N8...
    Reply
  • Finally - Tuesday, November 29, 2011 - link

    For several months now the 2 leading cards in any P/P comparison have been the HD6870 and its HD6850 twin. I just picked one up this month and I'm delighted. As long as consoles won't learn how to upgrade their GPUs, I don't see a necessity for anything above that range of graphics power...

    Of course, this is an *throws up* enthusiast website, so anyone who's not willing to build a system with at least 2 GPUs, a 1200W power supply and a triple monitor setup, leave the room, we are not interested in you.
    Reply
  • Leyawiin - Tuesday, November 29, 2011 - link

    The cheapest 1GB HD 6950 on Newegg (which generally has the lowest prices on video cards) is $240. The cheapest HD 6950 2GB is $255. You pay that upfront; rebates (if they go well) are months down the road. The GTX 560 448 is about 10% faster. Yes, it's the better card. Reply
  • Marlin1975 - Tuesday, November 29, 2011 - link

    The cheapest 560-448 I have seen so far is in the low $300 range. I can get a 6950 in the low $200 range.

    10% more performance for 50% more price is not better.
    Reply
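The value argument in this sub-thread reduces to simple performance-per-dollar arithmetic; a sketch using the commenters' own figures (a ~$200 6950, a low-$300s 560-448, and the ~10% performance gap claimed above; all three numbers are their estimates, not verified prices):

```python
# Performance per dollar, normalizing the 6950's performance to 1.0.
def perf_per_dollar(relative_perf, price_usd):
    return relative_perf / price_usd

hd6950 = perf_per_dollar(1.00, 200)      # ~$200 after rebate (commenter's figure)
gtx560_448 = perf_per_dollar(1.10, 300)  # low-$300s, assumed ~10% faster

# At these prices the 6950 delivers roughly 36% more performance per dollar.
print(f"6950 advantage: {hd6950 / gtx560_448:.2f}x")  # prints 1.36x
```

With Newegg's upfront prices from the reply above ($255 vs. ~$290) the gap narrows considerably, which is exactly why the two commenters reach opposite conclusions.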
  • Ushio01 - Tuesday, November 29, 2011 - link

    The image of the GF110 on the first page is wrong it shows 3 deactivated SM units which would make this card a 416 shader part. It should show 2 deactivated SM units for a 448 shader part. Reply
  • Ryan Smith - Tuesday, November 29, 2011 - link

    Thanks. You are correct. That will be fixed later today. Reply
  • Per Hansson - Tuesday, November 29, 2011 - link

    Hi, what about overclocking?
    Is this GPU poor at it since it's been binned so hard, or is it just that another SM unit was bad while not hindering the clockspeed of the chip?
    Reply
  • Ryan Smith - Wednesday, November 30, 2011 - link

    I did not have a chance to test overclocking (I only had a single day to test the card). However since NVIDIA is binning chips based on defective SMs, I have no reason to believe that overclocking should be significantly different from the 570. Reply
  • MrSpadge - Tuesday, November 29, 2011 - link

    "NVIDIA is purposely introducing namespace collisions, and while they have their reasons I don’t believe them to be good enough."

    That nails it. Not sure if I should laugh or cry about the current name. They introduced a 3-letter prefix and 3-digit numbers to get rid of the obscure suffixes... only to reintroduce the "Ti" (why wasn't that one the GTX 565?!) and now this addition to "Ti". Hilarious, if this were a comedy show.
    And mind you, just because nVidia did much worse in the past doesn't make this any better...

    MrS
    Reply
  • Belard - Tuesday, November 29, 2011 - link

    That is why I DO NOT BUY or SELL nvidia products. This new name proves they are getting dumber by the month. This should be a 570-LE, simple.

    All these names have been stupid since the end of the GeForce 9000 series, while AMD has been mostly good with their names... mostly. AMD HD 6870: easy.

    GTX vs. GT is stupid since they don't make both a 550 GT and a 550 GTX. "Ti" is totally worthless as a name. If this only attracts dumb customers, they can keep them.
    Reply
  • ericore - Tuesday, November 29, 2011 - link

    This card is the most perfect example of a corporation trying to milk the consumer.
    The new GeForce cards arrive just after Christmas, so what does Nvidia do? Release a limited-edition crap product, up against what's around the corner, and with a crappy name. The limited namespace is ingenious, but I must wholeheartedly agree with Anand on the namespace issue.

    Intelligent people will forget this card and wait till after Christmas. Nvidia will have no choice but to release graphics cards in Q1, because AMD is going to deliver a serious can of whoop-ass thanks to their ingenious decision to go with a low-power process silicon versus high-performance. You see, they've managed to keep the performance but at half the power; then add that it is 28nm vs. 40nm, and what a nerdy orgasm that is. Nvidia will be on their knees, and we may finally see them offer much lower-priced cards; so do you buy from the beggar or from the provider? That's a rhetorical question, haha.
    Reply
  • Revdarian - Tuesday, November 29, 2011 - link

    Actually, what you can expect after Christmas is a 7800 from AMD (that is the midrange of the new generation; think around or better than the current 6900), then one month later, with luck, the high-end AMD part, and you shouldn't expect the green camp to have a counter until March at the earliest.

    Now, that was said on a Hard website by the owner directly, so I would take it as being very accurate, all in all.
    Reply
  • ericore - Tuesday, November 29, 2011 - link

    Haha, so same performance at half the power, plus 28nm vs. 40nm, plus potential Rambus memory which is twice as fast; all in all we are looking at, at least, double frame rates. Nvidia was an uber fail with their Fermi hype. AMD has not hyped this product at all, but rest assured it will be a bombshell; in fact it is the exact opposite story to Fermi. Clever AMD, you do me justice with your intelligent business decisions, worthy of my purchase. Reply
  • HStanford1 - Wednesday, December 07, 2011 - link

    Can't say the same about their CPU lineup

    Roflmao
    Reply
  • granulated - Tuesday, November 29, 2011 - link

    The ad placement under the headline is for the old 384 pipe card !
    If that isn't an accident I will be seriously annoyed.
    Reply
  • DanNeely - Tuesday, November 29, 2011 - link

    "It’s quite interesting to find that idle system power consumption is several watts lower than it is with the GTX 570. Truth be told we don’t have a great explanation for this; there’s the obvious difference in coolers, but it’s rare to see a single fan have this kind of an impact."

    I think it's more likely that Zotac used marginally more efficient power circuitry than on the 570 you're comparing against. 1W there is a 0.6% efficiency edge; 1W on a fan at idle speed is probably at least a 30% difference.
    Reply
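DanNeely's percentages can be reconstructed with two assumed baselines; a sketch (the ~165W idle system draw and ~3W idle fan draw are my assumptions, picked to reproduce his numbers, not measurements from the review):

```python
# A 1W saving matters little against total system idle draw, but a lot
# against what a single fan draws at idle. Baseline figures are assumptions.
idle_system_w = 165  # assumed total system idle draw
fan_idle_w = 3       # assumed fan draw at idle speed

print(f"power circuitry: {1 / idle_system_w:.1%} of system draw")  # 0.6%
print(f"fan:             {1 / fan_idle_w:.0%} of fan draw")        # 33%
```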
  • LordSojar - Tuesday, November 29, 2011 - link

    Look at all the angry anti-nVidia comments, particularly those about them releasing this card before the GTX 600 series.

    nVidia is a company. They are here to make money. If you're an uninformed consumer, then you are a company's (no matter what type they are) bread and butter, PERIOD. You people seem to forget companies aren't in the charity business...

    As for this card, it's an admirable performer, and a good alternative to the GTX 570. That's all it is.

    As for AMD... driver issues or not aside, their control panel is absolutely god awful (and I utilize a system with a fully updated CCC daily). CCC is a totally hilarious joke and should be gutted and redone completely; it's clunky, filled with overlapping/redundant options and ad-ridden. Total garbage... if you even attempt to defend that, you are the very definition of a fanboy.

    As for microstutter, AMD's Crossfire is generally worse at first simply because of the lack of frequent CFX profile updates. Once those updates are in place, it's a non issue between the two companies, they both have it in some capacity using dual/tri/quad GPU solutions. Stop jumping around with your red or green pompoms like children.

    AMD has fewer overall features at a lower overall price. nVidia has more overall features at a higher overall price. Gee... who saw that coming...? Both companies make respectable GPUs and both have decent drivers, but it's a fact that nVidia tend to have the edge in the driver category while AMD have an edge in the actual hardware design category. One is focused on very streamlined, gaming centric graphics cards while the other is focused on more robust, computing centric graphics cards. Get a clue...

    ...and let's not even discuss CUDA vs Stream... Stream is total rubbish, and if you don't program, you have no say in countering that point, so please don't even attempt to. Any programmer worth their weight will tell you, quite simply, that for massively parallel workloads where GPU computing has an advantage that CUDA is vastly superior to ANYTHING AMD offers by several orders of magnitude and that nVidia offers far better support in the professional market when compared to AMD.

    I'm a user of both products, and personally, I do prefer nVidia, but I try not to condemn people for using AMD products until the moment they try to assert that they got a better deal or condemn me for slightly preferring nVidia due to feature sets. People will choose what they want; power users generally go with nVidia, which does carry a price premium for the premium feature sets. Mainstream and gaming enthusiasts go with AMD, because they are more affordable for every fps you get. Welcome to Graphics 101. Class dismissed.
    Reply
  • marklahn - Wednesday, November 30, 2011 - link

    Simply put, nvidia has cuda and physx, amd has higher ALU performance which can be beneficial in some scenarios - gogo OpenCL for not being vendor specific though! Reply
  • marklahn - Wednesday, November 30, 2011 - link

    Oh and Close to the Metal, brook and stream are all mainly things of the past, so don't bring that up please. ;) Reply
  • Revdarian - Wednesday, November 30, 2011 - link

    Such a long post does not make you right, in the part of "CUDA vs Stream" you actually mean "CUDA vs OpenCL and DirectCompute" for example, as those are the two vendor agnostic standards, so that just shows that what is really "rubbish" is your attempt to pose as an authority on the subject. Reply
  • ericore - Wednesday, November 30, 2011 - link

    It's fine that they need to make money, but they insult my intelligence, which is why I am putting them down. There is no justification for buying this reviewed card; any statement in contradiction to this is a folly. It is true that Nvidia has superior drivers, superior professional support, and superior architecture for professionals. But most people fall outside that branch, and therefore AMD is the better contender for sheer gaming performance, and Eyefinity is far superior to what Nvidia offers.

    AMD's control panel can use some work; you're right about that. But the "total garbage" aspect reveals that you are in fact an Nvidia fanboy; you betrayed yourself. I don't care for the microstutter argument. As for the "AMD has fewer features" argument, it is absolute garbage; I gagged at your narrow-mindedness, as you seem only able to present the professional perspective rather than being objective. AMD in fact offers consumers all the relevant features that Nvidia offers, plus more, minus 3D, which is still irrelevant at this point; we (the people) don't have 3D TVs. CUDA is superior, but AMD can still rape (you heard me right) Nvidia in software like Elcomsoft Wireless Auditor; conversely, ditto for Nvidia regarding video rendering. Ha, you Nvidia fanboy, blessing and protecting each feature Nvidia has to offer; isn't that cute. Power users, lol; let's get one thing straight: power user does not mean professional; only professional means that. Power user just means a user who can and does use a wide variety of software, can extend beyond that software, and has knowledge of programming; check marks to all, I have. You naughty Nvidia fanboy.
    Reply
  • cactusdog - Monday, December 05, 2011 - link

    It's funny when people complain about AMD drivers when it's obvious they have not used them, or are very new to them.

    CCC isn't "ad-ridden". The AMD home page can be completely disabled (unticked) so that it doesn't show any web content. Only someone unfamiliar with AMD software would not know that.

    CCC has built-in overclocking control, manual fan control, and all the settings one would need. If you can't cope with them, you can set CCC to Basic mode.

    The people who complain about AMD drivers are, 99% of the time, first-time users with little to no experience with them.

    Crossfire/SLI is a different matter and both companies have issues. I've been recommending against a multi-gpu setup for years. If you choose a multi-gpu setup be prepared for driver issues, stuttering, waiting for profiles, and some games that will never have multi-gpu support.

    I've used both and never had driver issues with either, but I prefer AMD's image quality to Nvidia's. That's the most important thing for me. I don't use anything that can make use of CUDA, and PhysX is mostly a marketing ploy.
    Reply
  • HStanford1 - Wednesday, December 07, 2011 - link

    I've got two 460s and never had to bother with botched drivers or microstuttering.
    Maybe I'm just lucky, but I dread the day it happens
    Reply
  • bill4 - Wednesday, November 30, 2011 - link

    There's a lot to like about your reviews, but why the same old dated games you've been benchmarking forever? Why no BF3, Crysis 2, Witcher 2, etc. benchmarks, i.e. the latest and greatest, most demanding games? Heck, you guys even still use HAWX; I have NO idea why. That game has a sequel and is 500 years old! I don't care what these video cards do in old games where they get 140 FPS, which I see in so many reviews! I look at the results in the most demanding games.

    Well, the reason you mentioned in one review for using HAWX is "it's the only flight game with a built-in benchmark," or something like that. As if you just want to press a "benchmark" button and not do any actual work. Seems lazy; just use FRAPS or something for a bench, and update your games, please!
    Reply
  • Ryan Smith - Wednesday, November 30, 2011 - link

    We update our benchmark suite every 6-12 months as necessary. As you've noted, the current suite is rather long in the tooth, and we'll be updating it next month (December) when we switch the testbed to SNB-E. In the meantime we're using the current suite to keep the tests consistent for this generation of cards. Reply
  • Alexo - Thursday, December 01, 2011 - link

    Switching to SNB-E will be a disservice to most of your readers (who don't use that platform), as it will give skewed results. Reply
  • carage - Wednesday, November 30, 2011 - link

    Does anyone know how this card handles HDMI Audio Bitstreaming?
    I assume it would inherit the same half-baked feature set as the old 570.
    So HTPC users should steer away unless proven otherwise.
    Reply
  • Ryan Smith - Wednesday, November 30, 2011 - link

    As HDMI audio bitstreaming is a function of the GPU (rather than drivers or otherwise), it will be the same as GTX 570/580. Reply
  • Per Hansson - Wednesday, November 30, 2011 - link

    carage: what's wrong with it on the 570 & 580? (I own neither) Reply
  • jweller - Wednesday, November 30, 2011 - link

    How is $280 considered "budget"? Reply
  • venomblade - Wednesday, November 30, 2011 - link

    Is it a typo that the 560 Ti 448 has a memory clock of 900MHz and yet you say it has an effective speed of 3800MHz? Shouldn't it be 3600MHz? Reply
  • Ryan Smith - Wednesday, November 30, 2011 - link

    Heh, it turns out doing math on a plane is harder than I thought. Thanks for that. Fixed. Reply
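    (For reference, the correction above follows from GDDR5 signaling: GDDR5 is quad-pumped, transferring data four times per memory clock, so the effective rate is 4x the base clock. A minimal sketch of that arithmetic; the helper name is ours, not from the article:)

    ```python
    def effective_rate_mhz(mem_clock_mhz: int, transfers_per_clock: int = 4) -> int:
        # GDDR5 transfers data four times per memory clock cycle (quad-pumped),
        # so a 900MHz memory clock yields a 3600MHz effective data rate.
        return mem_clock_mhz * transfers_per_clock

    print(effective_rate_mhz(900))  # prints 3600
    ```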
  • JonnyDough - Thursday, December 01, 2011 - link

    A good strategy for getting rid of bad chips while appearing to dominate a bit more of the high-end market, even if it is the same product. Still, they are not necessarily better cards than what AMD offers, and remember how late Fermi was coming to the game. AMD will soon have the 7000 series, which everyone but the NVIDIA camp and its followers is awaiting. =) Reply
  • Finally - Thursday, December 01, 2011 - link

    Anand doesn't offer any price/performance comparisons. If they did, the HD6850 and HD6870 would have been clogging the first two places for several months now... Reply
  • JonnyDough - Thursday, December 01, 2011 - link

    "NVIDIA is purposely introducing namespace collisions, and while they have their reasons I don’t believe them to be good enough"

    It's really no surprise; Nvidia has been f'n up names since Matrox was around. Whoever keeps naming their cards should have been fired in 1997.

    If you were to post just one comment here that said "Nvidia sucks at naming their video cards", with the ability to rate it up to 1000, it would be rated 1001. Their confusing naming schemes are one big reason I buy AMD's video cards. At least with AMD I can keep track of what I am buying. I don't even bother keeping up with Nvidia cards for this very reason. You would think that a simple numbering system based on performance would suffice, with additional letters for things that specialty cards support. For instance, if only specific rare cards could do CUDA processing, then add a "C" moniker. This isn't hard, Nvidia. Fix it with your next generation, or keep losing customers because of your stupidity.
    Reply
  • silverblue - Thursday, December 01, 2011 - link

    Yes, but AMD aren't perfect either; the 6770 and 5770 spring to mind.

    NVIDIA reminds me of Intel, albeit not as bad; with Intel you have to research whether the chip you're looking at even supports VT-d.
    Reply
  • Finally - Thursday, December 01, 2011 - link

    I don't care much for names, although I found their re-re-re-branding of the 8800GT atrocious... Well, I just tend to go for the GPU maker that offers me the most bang for the buck - and none of these cards has cost me more than 150€. My last 3 cards were a 7600GT, an HD4850, and now an HD6870. A German hardware review page always offers a Performance/€ comparison table, and that's where I look before I go shopping. Reply
  • Finally - Thursday, December 01, 2011 - link

    ...and their name is computerbase dot de Reply
  • Burticus - Thursday, December 01, 2011 - link

    Good for Nvidia to be able to utilize slightly flawed chips... but what's the consumer value here? The $300 price point puts it next to or very close to the GTX 570. Why not just get one of those? The 560 Tis are finally getting down to the $200 range... maybe if the 448-core version were $250 instead of $300. Reply
  • Matrices - Thursday, December 01, 2011 - link

    Agreed. The pricing is off the mark. 570s are $330 to $350 and 560 Tis can be had for $200 to $220 with minor rebates.

    The $290-$300 price point of this thing is skewed too far in the wrong direction.
    Reply
