
79 Comments


  • Marlin1975 - Tuesday, March 15, 2011 - link

    But overpriced.

    If this was in the $100 range it would be a much better buy. As it stands, the cheaper 460 is the better card right now.

    Also, you have the 450 in your graph with a 256-bit memory bus, not the 128-bit bus it actually has.
  • vol7ron - Tuesday, March 15, 2011 - link

    I'm not usually an advocate of OCing GPUs, but I'm curious how much more performance could be achieved. We know there's some room in the memory; how much more can the GPU/shaders really extract? While Zotac OCs, they normally don't max it out on air cooling, so a little testing would be nice :)
  • slickr - Tuesday, March 15, 2011 - link

    It's a crap card.

    It's about $50 overpriced, and it's worse in power consumption, noise and temperatures than Nvidia's own year-old GTS 450.

    So how can this be a good card? For the same price I can get a GTX 460 768MB that performs 20% faster, or for $50 less I can get the 5770, a cooler, quieter card with lower power draw and the same performance.

    If you ask me, this card is a rip-off for consumers who don't know anything about graphics cards.
  • Aircraft123 - Tuesday, March 15, 2011 - link

    I really don't understand why nVidia is so concerned with these lower-performing cards.

    Their own cards from TWO GENERATIONS AGO perform better than this "new" card.

    I have a GTX 275; it performs equivalent to or better than this new card, and you can find it on a particular auction site for ~$100.

    The only thing missing from my 275 is DirectX 11 - and unless you get a 470 or greater, the card isn't powerful enough to run any DX11 stuff anyway.

    I could say the same for AMD, considering the 4870 performs better than the 5770.

    I am interested in the dual-Fermi card due out soon, though. It will be interesting to see if/how they can beat the 6990 with lower power consumption/noise.

    Anyhow, good article.
  • Marlin1975 - Tuesday, March 15, 2011 - link

    The reason, I bet, is that the re-worked chip has a better yield number, not to mention more performance from the same chip under a new series number.
    So people think they are getting new cutting edge when it's just a re-worked 4xx-series chip.
  • Taft12 - Tuesday, March 15, 2011 - link

    In fairness, if we are considering dollars and cents, your GTX 275 has a TDP twice that of the 550 Ti, and probably eats up double or more at idle as well.
  • jiffylube1024 - Tuesday, March 15, 2011 - link

    The GTX 275 costs a lot more to make than the Ti 550 (they couldn't mass-manufacture the GTX 275 at a $100 or even $150 price point and hope to make a profit), and any GTX 275 you could find for $100 today would either be old stock (meaning they're just clearing inventory) or a used card, meaning no profit for Nvidia whatsoever.

    It's pretty obvious that companies come out with these cards to occupy lower price points... The problem is that, as you point out, they are often too cut-down, and previous-generation cards throttle them. It's a balance: when they hit the right price/performance, magic happens (GTX 460), but they often miss the mark with other cards (GTS 450, Ti 550).

    Even if you examine the Ti 550 on paper, it stands no chance vs. the GTX 460 -- it has 43% less shader power than the GTX 460 (192 shaders vs. 336, with a similar drop in texturing power) and not enough of a clockspeed advantage (900 MHz vs. 675 MHz) to make up for that. At $150, the Ti 550 is a total waste since you can find GTX 460's for $130 or less these days. It's going to take a fall below $100 for these cards to become worthwhile.
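    As a rough sanity check on that shader/clock math (raw ALU throughput only; it ignores memory bandwidth, ROPs and any architectural differences, and uses only the counts and clocks quoted above):

```python
# Relative shader throughput = shader count x shader clock.
# Figures are the ones quoted above; this is a paper comparison only.
def alu_throughput(shaders, clock_mhz):
    return shaders * clock_mhz

ratio = alu_throughput(192, 900) / alu_throughput(336, 675)  # 550 Ti vs GTX 460
print(f"GTX 550 Ti: ~{ratio:.0%} of the GTX 460's shader throughput")  # ~76%
```

    So even with the 900 MHz clock, the 550 Ti only claws back part of the shader deficit on paper.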

    Nevertheless, if GTX 460 stock dries up, then without a Ti 550, Nvidia has a gaping hole below $250.

    I think this has a lot to do with manufacturing costs -- it's not economical to keep making GTX 460's and selling them for ~$100. The Ti 550 has 66.7% of the transistor count of the GTX 460, meaning a much cheaper die to manufacture.
  • Kiji - Tuesday, March 15, 2011 - link

    "Indeed the GTX 550 Ti is faster than the 5770 - by around 7% - but then the 5770 costs 36% more." - I think you mean "..36% less."

    Good review, and it's disappointing NVidia doesn't want to change their mentality.
  • Kiji - Tuesday, March 15, 2011 - link

    Never mind, already fixed :)
  • passive - Tuesday, March 15, 2011 - link

    At the end of page 5 you say that the 550 is ahead of the 5770 by 50%, and at 90% of the 6850. According to your graphs, both of these are very wrong (even when using the Zotac's numbers):

    23.1 (550) / 19.8 (5770) = 1.166 → 16.6% ahead
    23.1 (550) / 29.8 (6850) = 0.775 → 77.5%
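    (For anyone re-checking the arithmetic, those two figures fall straight out of the quoted frame rates:)

```python
# Relative performance from the average frame rates quoted above.
def percent_ahead(a, b):
    return (a / b - 1) * 100

print(f"550 vs 5770: {percent_ahead(23.1, 19.8):.1f}% ahead")   # ~16.7%, not 50%
print(f"550 vs 6850: {23.1 / 29.8:.1%} of its performance")     # ~77.5%, not 90%
```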

    What's weird is that immediately after you say how the 550 is beating the 450 by 30%, which is accurate, but further paints a pro-Nvidia picture.

    I know we live in an era of community-edited content, but in order to prevent accusations of bias, you should really catch this sort of thing before you publish.
  • vedye - Thursday, March 17, 2011 - link

    Thanks for pointing that out!! But the author will not respond to you. AnandTech is already proven pro-Nvidia. If I were them, I would ignore your post as well. Just keep pretending.
  • Demon-Xanth - Tuesday, March 15, 2011 - link

    From the consumer standpoint, why would I get a 550 over a 460? I read through this and can't come up with a single reason.
  • Gami - Tuesday, March 15, 2011 - link

    There's no point in getting it. They need to clear the stock of the 400 series from all outlets first, so that this thing would actually have a chance of even being considered.
  • Taft12 - Tuesday, March 15, 2011 - link

    Soon that reason will be "the GTX 460 768MB is not available", but that is not yet true, and indeed there is no reason to buy this card.
  • qwertymac93 - Tuesday, March 15, 2011 - link

    How did they get 1GB of memory with a 192-bit bus? Are you sure it's not 768MB?
  • Demon-Xanth - Tuesday, March 15, 2011 - link

    There's a whole page on that. Plus many comments on other pages.
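    (For anyone who skipped that page: a 192-bit bus is six 32-bit channels, and 1GB is reached by mixing GDDR5 chip densities across them. As a sketch - the exact chip mix here is an assumption; see the review for the real layout:)

```python
# Sketch: 192-bit bus = six 32-bit GDDR5 channels; mixing chip densities
# (assumed here as four 128MB + two 256MB) yields 1GB on a 192-bit bus.
channels_mb = [128, 128, 128, 128, 256, 256]  # MB behind each 32-bit channel
assert 32 * len(channels_mb) == 192           # bus width adds up
print(sum(channels_mb), "MB")                 # -> 1024 MB
```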
  • z3nny - Tuesday, March 15, 2011 - link

    Yeah, RTFA.
  • Lolimaster - Tuesday, March 15, 2011 - link

    So after nearly 1.5 years the HD 5770 is still the better buy for mid-range value customers. Right now in many places the HD 6850 costs less than the 460 1GB and performs better (even more so with the new 11.4 preview and future "mejolnir" driver updates).
  • qwertymac93 - Tuesday, March 15, 2011 - link

    Never mind, read the next page... And now I wish you guys had an edit function...
  • z3nny - Tuesday, March 15, 2011 - link

    It would have helped if you RTFA first before posting like a fool.
  • silverblue - Wednesday, March 16, 2011 - link

    And yet you leapt on him not once but twice about the same thing, despite the OP admitting his/her mistake.

    Really not constructive.
  • therealnickdanger - Tuesday, March 15, 2011 - link

    Perhaps I missed it, but does this carry all the A/V features of other 5xx cards or of 4xx cards?
  • mosox - Tuesday, March 15, 2011 - link

    That card is competing with an ATI card that was released in...2009.

    In this review, 6 out of the 10 games tested are TWIMTBP games favoring Nvidia. I guess there will never be transparent criteria for selecting the test games here. Looking forward to seeing 110% of the games tested on Anand being TWIMTBP games.
  • medi01 - Tuesday, March 15, 2011 - link

    What's TWIMTBP?
  • HangFire - Tuesday, March 15, 2011 - link

    The Way It (was) Meant To Be Played - Nvidia's program to encourage game developers to optimize for their video cards.
  • Ryan Smith - Tuesday, March 15, 2011 - link

    Our criteria for picking new games are rather straightforward, based on several factors: does the game make significant use of hardware features, is it challenging to high-end GPUs, is it possible to get consistent test results, is it popular enough that people play it/know what it is, and does it cover a suitable genre (we don't want all FPSs). We also take reader suggestions into account - and indeed, if you read the article, at one point we were soliciting suggestions for a new UE3 game for the next refresh.

    At this point I honestly couldn't tell you what games in our lineup are TWIMTBP games. It's not something we factor in one way or another. The fact that NV invests as much money as it does in the program is naturally going to make it hard to avoid such games though, if that's what we intended to do.
  • JarredWalton - Tuesday, March 15, 2011 - link

    As a funny side note, DiRT 2 is an "AMD/ATI" game judging by the loading screens, yet it still favors NVIDIA in general. Ultimately, you buy cards for the performance, price, and power requirements. I'm not sure why you'd even suggest that we're trying to run all TWIMTBP games when our final recommendations are so heavily in favor of the AMD cards this round.
  • nitrousoxide - Tuesday, March 15, 2011 - link

    I've been wondering that since the first time I saw AnandTech's graphics tests. You display so much data no matter what card you are testing. Is it even relevant to show the 5970 or GTX 580? It makes the graphs less readable.
  • fullback100 - Tuesday, March 15, 2011 - link

    Yeah, I would rather see old video cards like the 3850 and 8800GT than the 5970 or GTX 580. Really, how many people have top-end cards? There are a lot more people with video cards from two generations ago.
  • Taft12 - Tuesday, March 15, 2011 - link

    For starters, you can't compare older cards on new games that use DX11. Next, most people are surprised to find out just how uncompetitive two-generation-old cards are. Those two are probably in line with the current GT 440 or 5670 - many miles behind the slowest cards in these comparisons.
  • HangFire - Tuesday, March 15, 2011 - link

    For a while, AT listened and included the 8800GT with most tests. This was a great baseline as most people understood where their card fell in, compared to the 8800GT.

    AT has since decided (again) that all of us play nothing but the latest Dx11 games in Dx11 mode with all the goodies turned on, and the only folks upgrading already own Dx11 cards anyway.
  • mapesdhs - Tuesday, March 15, 2011 - link


    Very true!

    I've been collating performance results to compare older cards to newer models as and when I can. Google "Ian PC Benchmarks"; it's the first link that comes back (Blinkenlights site), then select "PC Benchmarks, Advice and Information". Note though that Blinkenlights is a mirror; my main site at sgidepot is always updated first and more often.

    I've included lots of 8800GT, 4890 and GTX 460 1GB data so far, I've just obtained a 9800GT and a 3850 AGP (should be a giggle!), and I intend to obtain various other older cards, including a GTX 275/285. I also have an X1950 Pro AGP (don't giggle, I got better results than reviews of the PCIe version).

    Platform-wise, I'm trying to put together a Socket 775 build and also an AM2/AM3 setup (I've already obtained a Core 2 Duo E6850 and a Q6600 Core 2 Quad, though no motherboard yet). And I'm adding further P55 examples, e.g. I've obtained an i5 670 and will be including an i5 760 as well. All this is done on a stupidly small budget btw (eBay madness), so if anyone has S775 or AM2 parts they don't want, feel free to contact me. eBay is not exactly bargain central anymore. :\ If you're after the highest price though, eBay is best. Or of course free donations are welcome! 8) (I'll cover the postage; I'm in the UK.) I want to create a spread of data that will be genuinely useful to people.

    I don't have Crysis or Metro 2033 to test with, but I've used a good range of freely available tests (recommendations welcome; I'm not going to use AvP though - I had a look and thought it was rather poor). When I have the time I'll also add real game tests using my own tests, focusing more on older titles as that's a common issue people have (I'll be testing with Oblivion, the 1st Stalker game, CoD WaW and a few others).

    I'm also including pro apps as and when I can, since I do have a number of borrowed Quadro FX cards to test as well (580, 1500, 1700, 4500, 5500, 5600, etc.) which will all be for sale once the tests are done. So far I've done some tests on the 1500 and 5500, but until I sort out a proper certified X58 setup (for SLI) the results won't be fully fleshed out (Dell T7500 barebones on its way, need parts). It's interesting to compare gamer & pro cards.

    Note that I'm not massively familiar with older gfx cards, so suggestions are welcome as to what I should include and/or look for. Feel free to email with ideas (the contact page is on my site; just email my Yahoo account). Don't post them here though as that'll only clog up the thread.

    Lastly, I'm also putting together a standard X58 setup in a little while, but first I want to sort out the older systems.

    Oh, for those commenting about DX11 on older cards, that's absolutely true, which is why whenever possible I run each test in all three modes, i.e. DX9, DX10 and DX11.

    Ian.

    PS. If there happens to be anybody in the Edinburgh area who has a card they'd be willing to lend me so I can add results for it, please let me know. You can visit and see for yourself. I'm in the Corstorphine/Clermiston area.
  • medi01 - Tuesday, March 15, 2011 - link

    At least this time it doesn't make you color-blind, and the bar colors make sense (on most charts), unlike in the AMD notebook review.
  • Samus - Tuesday, March 15, 2011 - link

    ...but so does the GTX 460. The 550 comes close to the 'stock' 460 when it is radically overclocked, just as the 460 can beat $200+ cards when it is radically overclocked.

    I appreciate the overclocking 'potential' and coverage, but ever since the eVGA GTX 460 FTW review, AT has been diluting the true nature of these products, with overclocked cards carrying heavier weight in the charts than they should.

    Your older reviews (<2009) always had an overclock section, omitting the overclocked results from the rest of the charts. I liked that.

    I just don't like seeing overclocked cards reviewed. They are limited runs and YMMV; the eVGA 460 FTW was available for less than a month after you reviewed it, and has since been replaced twice - by the Superclocked, and now the Superclocked Extreme Edition - all of which have had varying GPU/bus/memory clocks at prices in excess of $80 over the stock cards. That's BS.
  • mapesdhs - Tuesday, March 15, 2011 - link


    Actually the FTW is still easily available; I bought another two last week for a PC I'm building for a friend.

    Ian.
  • Ryan Smith - Tuesday, March 15, 2011 - link

    A lot has changed since 2009. The biggest of which is that NV and AMD have both given manufacturers more freedom in their designs, and simultaneously manufacturers have been looking to further differentiate their products beyond the cooler and price. Factory overclocks are how they're doing it - it allows them to build a card with a higher performance level for little extra cost, increasing their gross margin while filling small holes in the market.

    Truth be told, it creates a bit of a hassle for us, as this results in a different card/clock combo every $10, but clearly it's an effective strategy for the manufacturers. At the same time, I get why it frustrates you guys, which is why we don't include these cards on our long-run charts. But when it comes to reviewing custom cards, it's going to be rare to see cards without a factory overclock - most enthusiast cards now have one, and that's what the manufacturers are willing to sample.

    On the plus side, as mapesdhs has already noted, manufacturers are getting better about availability. These cards will never have the kind of long term availability that reference clocked cards do (largely due to the fact that it's a single supplier versus many), but many of them are available through the primary market lifetime of the card (which is to say until it's replaced by a newer GPU).
  • mapesdhs - Tuesday, March 15, 2011 - link


    Has to be said though, I didn't expect the FTW to still be that easily available, but it is.

    However, the earlier poster is also correct that there are slightly lower-clocked alternatives from EVGA that cost less; in one case the core/shader clocks are the same, just with slightly slower RAM (the SSC version). Shop around, and note that sometimes minor differences in price can be negated by varying shipping costs between suppliers. I know one company that keeps offering 'special' deals, but their shipping costs are so high that they're usually more expensive overall than alternative sources.

    I bought the FTWs because that's what my friend wanted - basically a replica of the system I built for myself.

    Ian.
  • DrPop - Tuesday, March 15, 2011 - link

    I love this site and all the reviews are generally very good.
    However, I am at a loss as to why this site and all other GPU reviewers still use aged compute benchmarks such as Folding@home, etc.

    Could you PLEASE start running some BOINC tests with QUALITY, optimized, MODERN code for the latest GPUs, so that the world can see the real "number crunching power" each GPU possesses?

    Examples of this would be DNETC on BOINC (highly optimized), or Collatz, etc.

    I am quite sure you will be surprised at how the computing bar graph will look - it will be very different than the graphs that come out of your current, aged compute code suite.

    Thank you!
  • Ryan Smith - Tuesday, March 15, 2011 - link

    It's true Dnetc is highly optimized (and I use it as a torture test because of that), but it's actually kind of a poor benchmark. It's purely compute-bound, to the point where cache, memory, etc. have no impact. For our compute tests we want benchmarks that stress all aspects of the GPU, so that means it not only needs to be compute-intensive, but memory-intensive, cache-sensitive, etc. Otherwise it's just a proxy for GFLOPs and a best-case scenario for the VLIW5 architecture.

    With that said, I am completely open to suggestions. If you know of other programs that offer a decent benchmark and work on AMD and NVIDIA GPUs, I'd like to hear about it. We'll be refreshing the suite in the next couple of months, so now is the time to make suggestions.
  • HangFire - Tuesday, March 15, 2011 - link

    At least the new 550 is marginally faster and lower-power at idle than the 450. Someone buying up from one to the other will get a small boost, and even if power demands are slightly higher, the difference is so small that they shouldn't have to buy a new power supply. If they complain they didn't get a big boost, well, buy something with a larger second digit.

    After all the naming shenanigans Nvidia has played in the past, they should be commended for (at least) getting the name of the card right.

    Memory bandwidth is a very important buying comparison for me. I only buy mid-range cards with more than 50GB/s of bandwidth, and high-end cards with more than 100GB/s. This is a form of future-proofing: I know I can always turn down detail and still get the frame rates (unless it is a very poorly written game). I would settle for 98GB/s. I would not settle for 32GB/s, or a some-fast/some-slow mix.
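    For reference, peak bandwidth falls straight out of bus width and effective memory clock; the figures below are the reference specs for the two cards as commonly quoted (treat them as illustrative):

```python
# Peak memory bandwidth in GB/s = (bus width in bytes) x (transfers/sec).
def peak_bandwidth_gbps(bus_width_bits, effective_mts):
    return (bus_width_bits / 8) * effective_mts / 1000

print(peak_bandwidth_gbps(192, 4104))  # GTX 550 Ti reference: ~98.5 GB/s
print(peak_bandwidth_gbps(128, 3608))  # GTS 450 reference:    ~57.7 GB/s
```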

    Oh, yeah, still no comment from AT on intro-time Linux driver support. Why not at least ask, why give Nvidia shelter on this point?
  • Ryan Smith - Tuesday, March 15, 2011 - link

    Our experience with desktop Linux articles in the past couple of years is that there's little interest from a readership perspective. The kind of video cards we normally review are for gaming purposes, which is lacking to say the least on Linux. We could certainly try to integrate Linux into primary GPU reviews, but would it be worth the time, and what would we have to give up in return? Probably not. But if you think otherwise, I'm all ears.
  • HangFire - Tuesday, March 15, 2011 - link

    All I'm asking for is current and projected CUDA/OpenCL level support, and which OS distros and revisions are supported.

    You may not realize it, but all this GPGPU stuff is really used in science, government and defense work. Developers often get the latest and greatest gaming card, and when it is time for deployment, mid-range cards (like this one) are purchased en masse.

    Nvidia and AMD have been crowing about CUDA and OpenCL, and then deliver spotty driver coverage for new and previous-generation cards. If they are going to market it heavily, they should cough up the support information with each card release; we shouldn't have to call the corporate rep and harangue them each and every time.
  • Belard - Wednesday, March 16, 2011 - link

    Someone who already has a GF450 would be a sucker to spend $150 for a "small-boost" upgrade card.

    When upgrading, a person should get a video card that's at least 50% better. A phrase that never applies to a video card is "invest", since they ALL devalue to almost nothing. Today's $400~500 cards are tomorrow's $150 cards and next week's $50 ones.

    So a current GF450 owner should look at a GF570 or an ATI 6900-series card for a good, noticeable bump.
  • mapesdhs - Wednesday, March 16, 2011 - link


    Or, as I've posted before, a 2nd card for SLI/CF, assuming their motherboard and the card support it. Whether or not this is worthwhile, and the issues which affect the outcome, is what I've been researching in recent weeks. Sorry I can't post links due to forum policy, but see my earlier, longer post for refs.

    Ian.
  • HangFire - Friday, March 18, 2011 - link

    I wasn't really suggesting such an upgrade (sidegrade). I was just saying that each generation's card at a given price point and naming convention (450 -> 550) should have at least a little better performance than the card it replaces.
  • Calabros - Tuesday, March 15, 2011 - link

    Tell me a reason NOT to prefer the 6850 over this.
  • 7Enigma - Tuesday, March 15, 2011 - link

    So basically this is my 4870 in a slightly lower power envelope with DX11 features. I'm shocked the performance is so low, honestly. Thanks for including the older cards in the review, because it's always nice to see I'm still chugging along just fine at my gaming resolution (1280x1024, 19").
  • 7Enigma - Tuesday, March 15, 2011 - link

    Forgot to add: I bought it in Jan 2009 for $180 (Sapphire Toxic 512MB Vapor-X, so not a reference design).
  • mapesdhs - Tuesday, March 15, 2011 - link


    You're the target audience for the work I've been doing, comparing cards at that kind of resolution, old vs. new, and especially where one is playing older games, etc. Google for "Ian PC Benchmarks", click the 1st result, then select "PC Benchmarks, Advice and Information". I hope to be able to obtain a couple of 4870s or 4890s soon, though there are already a lot of 4890 results included.

    Ian.
  • morphologia - Tuesday, March 15, 2011 - link

    Why in the name of all that's graphical would you use this Noah's Ark menagerie of cards but leave out the 4890? It doesn't make sense. If you're going to include 4000-series cards, you must include the top-of-the-line single-GPU card. It's proven to be quite competitive even now against the lower-level new cards.
  • HangFire - Tuesday, March 15, 2011 - link

    Advertisers hate it when you see how competitive older offerings are with new stuff. So little-used features, like those in DirectX 11, are used to force out comparisons with older cards that still deliver great frame rates and value, and would cause users not to upgrade for a while.

    We won this battle for a while, and AT had a few older cards it included in its benchmarks. Now it's back to nothing but the latest.
  • morphologia - Tuesday, March 15, 2011 - link

    "Nothing but the latest?" The 4870 and 4870X2 shown in this comparison are hardly current. I suppose the 4870 is less likely to outperform than the 4890 is, but the X2 makes an even stronger showing. It still does not make sense.

    Also, on an unrelated note, it looks like the 6990 drops out of various comparison charts without explanation. BattleForge 1680x1050, for example: the 6990 dominated at 1920x1200 but was inexplicably absent at 1680x1050, where the 580 topped the chart instead. What's up with that??
  • Ryan Smith - Tuesday, March 15, 2011 - link

    The 6990 is not on any of the 1680 benchmarks. It's already CPU limited at 1920; at 1680 it's useless data since no one is going to use it at that resolution.
  • Ryan Smith - Tuesday, March 15, 2011 - link

    Due to the amount of time it takes to benchmark (and rebenchmark) GPUs, it's necessary to keep a truncated list, and from there not every card actually makes it into the article (and this is why we have GPU Bench). As such I focus on the current and previous generation of GPUs, while throwing in a sampling of 3rd & 4th generation GPUs as a baseline.

    I specifically pick these older GPUs based on architecture and relative performance - the idea being that while we don't have every GPU in the system, if it's a few years old it's well established how other minor variations of that GPU perform relative to the one in our results database. So in this case the 4870 is in there both because it's easy to visualize where the 4850/4890 would be relative to it, and because it was easily the most popular 4800 card.
  • morphologia - Tuesday, March 15, 2011 - link

    Seems like the 4870X2 was a bit of a spoiler, seeing as how it trumped a few of even the current generation, though it too was dropping in and out of the bar charts with no explanation. If you are going to include it at all, there should be more consistency; otherwise it looks like ranking/stat doctoring.
  • 7Enigma - Thursday, March 17, 2011 - link

    Ryan's already mentioned why. It's a dual-GPU card that, at the time, was likely not tested at the low resolutions this particular article used for these lower-end cards. Likely 1920x1200 (or 1080) was the lowest resolution this card was benchmarked at. I applaud Anandtech for including the data they have, and as mentioned, you can use Bench to compare to your heart's desire. Bottom line: it is unlikely someone is gaming below 24" resolutions with a 4870X2, and if they are, they can use Bench for that particular purpose.

    These guys have enough to do without going back and retesting cards from years ago. I'm just glad the data is in there.
  • nwarawa - Tuesday, March 15, 2011 - link

    I didn't hear much complaining about the GTX 460 768MB all this time: all the reviews were heralding its value. Now we have an even less powerful GPU, and 768MB suddenly becomes an issue? The heck with that. 768MB should be the standard configuration for this card, with an MSRP of $129. If you want high resolutions with AA, you should be getting a more powerful GPU as well. Nvidia should use a 768MB model of the GTX 550 to phase out the 768MB GTX 460, keep the 1GB GTX 460 for a while, and encourage more brands to bin their GTX 560 Ti's and make some 2GB models (I know Palit/Gainward does one, but there's no availability where I live). An overclocked 2GB GTX 560 Ti would be handy in a handful of games (GTA4 immediately comes to mind) and would compete well with a 6950... leaving the GTX 570 to dance with the 6970, and the GTX 580 to maintain its single-chip lead.
  • HangFire - Wednesday, March 16, 2011 - link

    768MB did not suddenly become an issue. Previous AT articles on the two 460s repeatedly warned that 768MB would soon not be enough memory.

    Agreed that if you lower the price enough, 768MB becomes "enough" as you are unlikely to be driving high resolutions with the corresponding large numbers of in-memory high resolution textures with a low-end card. At moderate resolutions, 768MB is enough.
  • Belard - Tuesday, March 15, 2011 - link

    (AMD - you still suck for naming the SLOWER 6870 cards to replace the 5870s etc)

    LOL - this would be FUNNY if it wasn't so sad.

    1 - The "state of the art" 550 Ti (Total Idiot?) card is 0~5% faster than the 1+ year old ATI 5770. Really, other than for reference, 1280x1024 scores are useless for today's monitors. $120~140 buys a 20" 1920x1080 monitor; $160~200 is a 21~22" model. I miss 1920x1200, since it's not so bloody narrow. I'd love to see a 26~27" that does 2560x1600 on the market.

    So comparing the results at 1920x1080, which is the standard for today, the 550 is sometimes 0~3 fps faster, sometimes slower.

    2 - Price!? The 5770 is easily a third cheaper, going for $100~125 vs. $150~160.

    3 - Stupid model names!? GeForce was given to the series, so WTF is GTX good for? If the 550 is almost the bottom end... why not GTS, like the GTS 450 or GTS 250? There is no consistency; it doesn't denote feature sets.
    "Ti", okay... What is the difference between a Ti and a non-Ti card? Oh yeah, the letters on the box and in the BIOS, nothing else. Why bother?

    We know Nvidia will most likely skip the 600 series (what happened to the 300s?) so they too can be "7s" with ATI. So we'll see:
    Nvidia GeForce GT 720 mx
    Nvidia GeForce GTS 740 Pro
    Nvidia GeForce GTX 780 Ultra

    The GeForce 550, or "GF550", is ALL we need to know what the product is.

    4 - ATI 6850... it should have been included in this benchmark since it's in the same price range. Newegg has them for $150~180 ($160 avg). It would really show what people are paying for. The 6850 is about 10 fps faster than the 5770/GF550.

    5 - The GF 460-768's price is $130~160... again, about 10 fps faster than the GF550. But oh yeah, the 550 replaces the older and faster card. Hate it when that happens!

    Think I'll hold on to my ATI 4670 until the AMD 7670/7770 comes out... I want the performance of a 5870/6950 with the heat/noise and power profile of a 5770, at a price under $180.
  • phoible_123 - Tuesday, March 15, 2011 - link

    I find it kind of interesting that nVidia's price points for their mid-range cards are higher this time around.

    The GTX 460 started at $220 IIRC (and had its price cut pretty quickly), while the GTX 560 is $259. The price for the 460 was pretty killer, but by the time the 560 came out the pricing was pretty ho-hum. If they had launched it at the same price level as the 460, AMD wouldn't have been able to compete. Granted, I'm sure they priced it that way to keep margins high, but this is a process improvement rather than an all-new chip (basically it's the third-gen GF100)...

    The GTS450 was $130, while the GTX550 "TI" is $150. And when the GTS450 came out, the value prop wasn't that good (when compared to the 460). It's like half the card for only a little less money.

    I recently picked up a GTX460 768MB for $90 after rebates.

    It's kind of like the Radeon 48xx vs. 57xx comparison (less card for more money).
  • dmans - Tuesday, March 15, 2011 - link

    My 8800 GT is better than this thing.
  • mapesdhs - Tuesday, March 15, 2011 - link


    Google for "Ian PC tests"; it's the 1st link that comes back. Scroll down the page for the full list of results pages (I've done a whole bunch). Voila, a mountain of 8800GT data for you to chew on. 8-) And much more to add!

    Ian.
  • HangFire - Wednesday, March 16, 2011 - link

    "lan PC tests". Hmm. I get a reviews.cnet.com link for a WiFi antenna.

    And, can you please stop spamming the comments?
  • mapesdhs - Wednesday, March 16, 2011 - link


    I'm not spamming the comments, I'm providing real info to help people
    out. Re the Google search, it could be because being in the UK I'm forced
    to use google.co.uk, which may give different results to google.com
    (it probably does). Alas, nothing I can do about that (hmm, try "Ian SGI
    UK" instead, that should bring up the right link). If you want to know
    what I'm talking about though, send me a PM and I'll send you the refs
    so you can see what I mean. People keep asking upgrade questions
    which review articles do not or cannot answer, e.g. those playing
    older games, at lesser resolutions, with systems that don't have uber
    CPUs, etc. I've been working to provide the info that answers such
    questions (have you?). That isn't spamming.

    Ian.
    Reply
  • HangFire - Wednesday, March 16, 2011 - link

    >my 8800 gt is better than this thing.

    That would make it faster than the GTX260 as well. That's some 8800GT!

    I love the value that my 8800GT provided, but it is sitting on the shelf now for a reason.
    Reply
  • sheh - Tuesday, March 15, 2011 - link

    I'm not one to comment on this sort of thing in general, but I must in this case. Each instance of "in to" in the graphics hardware articles comes with a mental dissonance I have to resolve before reading can be resumed.

    http://www.wsu.edu/~brians/errors/into.html
    http://data.grammarbook.com/blog/definitions/into-...

    Other than that, keep up the good work. :)
    Reply
  • gammaray - Tuesday, March 15, 2011 - link

    I don't understand the logic behind the pricing of video cards nowadays.

    Low-end video cards like this new 550 Ti should be below $100,
    mid-range video cards $150ish, and
    high-end $200-250 MAXIMUM.
    Reply
  • mapesdhs - Tuesday, March 15, 2011 - link


    An item is only ever worth what someone is willing to pay.

    There are those with big budgets who are happy to pay $600+, hence products
    to match such affordability exist and always will.

    If you had something to sell, would you let someone buy it for $200 if you had
    a different customer who was happy to pay you $400? ;)

    Such is the law of supply & demand. I deal with this every day with respect to
    buying/selling used SGI items. Hobbyists assume old items should be cheap
    because they're old and they don't want to pay much, but in reality commercial
    demand for certain items is extremely strong, so the real value is sometimes very
    high. The same basic concept applies to anything, really. A brand of chocolate
    cookies my gf & I particularly like has gone up in price recently by quite a lot,
    and I'm sure it's because they are popular. Demand rise = price rise.

    In some parts of the world, the market for high-end consumer GPUs is quite strong.

    Ian.
    Reply
  • Will Robinson - Wednesday, March 16, 2011 - link

    What a shame to soil the good reputation of past and present Ti cards on this dud. Reply
  • Belard - Wednesday, March 16, 2011 - link

    "TI" is meaningless. Might as well mean "Total Idiot".

    If they took out the "TI", it would still be the same product. It's all marketing to get people to remember the old $200 kick-ass 4200~4600 cards... before the GF 5800 debacle.

    TI was originally about its manufacturing (so they say), but look back: there was no plain 4200 alongside a 4200 TI, right? They divided the GF2-tech cards into the 4x0 MX line and the state of the art into the 4x00 TI line.

    We'll soon see the return of MX, PRO and Ultras, I think... hell, maybe even the "Geforce GTX ti 785 Ultra TNT" in 2012.
    Reply
  • Soldier1969 - Wednesday, March 16, 2011 - link

    Poor mans card, come back when you get at least a 580 or better... Reply
  • valenti - Thursday, March 17, 2011 - link

    Ryan, can you explain where the nodes per day numbers come from?

    I spend a fair amount of time hanging around folding sites, and I can't think of anybody else that uses the nodes-per-day metric. Most people use PPD (points per day), easily gathered from an F@H statistics program such as HFM.net.

    I'm unsure how to convert nodes/day to PPD, if that is even possible. In actual practice, I find that a 450 card nets almost 10,000 PPD, while a 460 gets about 12,000. I prefer the 450 after taking into account price and power needs.

    You might want to search on "capturing a WU" to read about how to copy a protein's work files, allowing you to run the same protein for each card.
    Reply
  • suddenone - Thursday, March 17, 2011 - link

    How good this card is depends on what monitor you plan on using. Anyone that has a small monitor and plans to keep it for at least three years might be happy with this card. I am puzzled to see fps over 60 on a standard 60Hz monitor. My GTX 460 ran fine until I upgraded to a 24-inch 1080p monitor. I sold the card on eBay and bought the GTX 570 (big gun). The GTX 570 can run any of my games at over 30fps min with all effects at maximum. Peace out. Reply
  • ol1bit - Thursday, March 17, 2011 - link

    I bought 2 460's 6 months ago or so, for $149 each, and they beat the 550 in every category.

    This is a bad launch by Nvidia. The old product is faster at the same price or lower.
    Reply
  • meatfestival - Monday, March 21, 2011 - link

    Not sure if it's already been posted, but although it does indeed use id Tech 4, Raven wrote a custom DX9 renderer for it.

    The multiplayer component (based on the Quake Wars fork of id Tech 4) is on the old OpenGL renderer.
    Reply
  • ClagMaster - Friday, March 25, 2011 - link

    The GF116 offers much-optimized and improved performance over that of the GTS 450 it is destined to replace.

    However, the GTX 550 Ti is a low-end replacement for the GTS 450 and is priced too high. I wish nVidia had called it the GT 550 and reserved the Ti designation for something truly high-performance.

    For $20 more, I can get an ATI 6850 that gives me 25% more performance for 10% less power consumption.
    Reply
  • chrcoluk - Friday, October 21, 2011 - link

    I think people are being very harsh on the card.

    It's slower than the older 460 on these benchmarks, but the gap isn't a gulf, and in addition people are completely discounting the power factor. The major problem with graphics cards today is that they use too much power; Intel has managed to reduce power load whilst increasing performance, yet nVidia can't do the same.

    Also, no consideration is taken of what people are upgrading from. I currently have an 8800GT and am only now considering upgrading; a 550 Ti would double performance whilst using a little more power on load than the 8800GT, and only a third on idle. Buying a 460 would give me a bigger performance boost, but I'd need to connect a 2nd power cable (wtf???) and it uses significantly more power than the 550, about 50% more. So for watts vs performance the 550 beats the 460. But it seems people on here and most other sites treat power consumption as of zero importance; maybe they don't pay their own power bills?

    The 550 Ti will have many sales because nVidia knows most people don't upgrade every year but more likely every 3+ years, so the 550 Ti doesn't have to beat the 460; it just has to be significantly better than DX9 and DX8 cards at a good price point, whilst also not forcing someone to maybe buy a fatter PSU as well (not considered here when comparing prices). The 550 Ti will double my current performance, and I guess the 460 1GB would be about an extra 20% or so on top of that. The amount of video RAM is also important; texture-heavy games will saturate 768MB, so in those scenarios the 550 is a better choice than the 768MB 460 model.
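    The watts-vs-performance claim above can be sanity-checked with a quick back-of-envelope calculation. The frame rates and load wattages below are rough illustrative assumptions (a 460 roughly 20% faster, drawing roughly 50% more power, as the comment suggests), not measured figures from the review:

    ```python
    # Rough performance-per-watt comparison for the two cards discussed above.
    # All figures are illustrative assumptions, not measured review numbers.
    cards = {
        "GTX 550 Ti":    {"avg_fps": 40, "load_watts": 110},
        "GTX 460 768MB": {"avg_fps": 48, "load_watts": 160},
    }

    # fps per watt at load for each card
    fps_per_watt = {name: d["avg_fps"] / d["load_watts"] for name, d in cards.items()}

    for name, value in fps_per_watt.items():
        print(f"{name}: {value:.3f} fps per watt")
    ```

    Under those assumed numbers the 550 Ti does come out ahead on fps per watt, even though the 460 wins on raw fps.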
    Reply
  • xKerberos - Tuesday, January 17, 2012 - link

    Oh, finally someone who makes some sense! In my country, GTX 460s are more expensive than GTX 550 Tis, and my overclocked Gigabyte 9800GT was struggling to run Battlefield 3, so I picked up an MSI GTX 550 Ti with 1GB of RAM, factory overclocked. It runs BF3 like a dream at high. And seeing how it munches on VRAM, I'm surprised how people with 768MB cards run that game.

    Granted, I have a tiny 1280x1024 Dell UltraSharp 19" monitor, so I don't need a high-end card to max games out. My 9800GT was doing a fine job until BF3 came out. I also had to fork out for 8GB of DDR2 RAM to run that beast properly.

    Anyway, what I like about my new card is that it was cheap (half the price of a GTX 560, although at half the performance), it runs very cool (36C on idle in a warm tropical country! max 60C on load), very quiet, and it does the job. I won't have to upgrade for a while yet!
    Reply
  • UpStateMike - Wednesday, February 22, 2012 - link

    I have to agree with this. About a month ago I began the project build to replace an ancient dinosaur that was maxed out and tired.

    I kept my good 630W PSU, a 24" LED 1080p ViewSonic monitor, and a 9800 GTX card that I figured I could use in the new build.

    I started with a new case (Rosewill Challenger) and built around an i5 Sandy Bridge 2500K in an ASUS P8Z68-V Pro mobo. I have 8GB of G.Skill DDR3 1333 at the moment.

    Now that this is all up and running, instead of my old system being the bottleneck for the video card, the card is now the bottleneck.

    I began by trying to cheap out, stay in the $80 range, and OC the cards mentioned here, but I liked the lower power use of the 550 Ti (I'm a medium gamer, but I wanted a PC that could handle any game I want to get in the future), and although I'm at $120 for this card, my guess is that when I go for my next upgrade phase in about 6 months or so, I can wait for a good deal to come along and get another one to SLI, plus another 8GB of memory.

    So for me, this is a big upgrade that I can build on to SLI and get to where I should be happy for the next couple of years. I do more photography editing and whatnot, so this greatly helps me with that, and I can still game as I get some time. I can appreciate that the differences are minor for anyone that has bought a card in the last year, but coming from a 9800 GTX I'm very happy.
    Reply
