
  • papapapapapapapababy - Saturday, October 17, 2009 - link

    http://ht4u.net/reviews/2009/amd_ati_radeon_hd_570...

    use Babelfish to translate
  • Zool - Wednesday, October 14, 2009 - link

    Does the 5700 series have the same improved shader-based adaptive antialiasing as the 5800 series?
    It would be nice to see an antialiasing graph with different resolutions and antialiasing types for each card in reviews.
  • RDaneel - Wednesday, October 14, 2009 - link

    I may be in the minority, but I've already ordered a 5750. For a SOHO box used for only occasional gaming, it was the most future-proof option (DX11) that also has low enough idle draw that it will actually save me enough money over the life of the card to justify any price difference with a 48xx card. Would I have loved 10% more performance? Sure, but this isn't a bad blend of efficiency and longevity.
  • yacoub - Wednesday, October 14, 2009 - link

    imho, it's perfect for that situation.

    Those of us who have a gaming PC with a DX10 card in it are the ones who find this 5700 series less than stellar.
  • ET - Thursday, October 15, 2009 - link

    But those of us that have a mid-range PC with yesteryear's DX10 card (3870) find it appealing. :)
  • SantaAna12 - Wednesday, October 14, 2009 - link

    are you filtering out comments these days?
  • SantaAna12 - Wednesday, October 14, 2009 - link

    What up, AT?
    I've been looking at your recent AMD rants, and it's getting tiresome. Are they paying you the big bucks these days? When you only compare AMD cards against AMD cards, you are doing your site a disservice. When you show CF but no SLI, you are showing me a new AT.

    I expected more based on the past. You going the route of Tom's?

  • Ryan Smith - Wednesday, October 14, 2009 - link

    As we noted in the article, the CF configuration is mostly for academic reasons. I don't seriously expect anyone to pick it over a single card.

    Anyhow, what would you like to see? I have the SLI data for the 275 and the 285, but since we've already established that the 260C216 is faster than the 5770, it won't really tell you anything useful.
  • just4U - Wednesday, October 14, 2009 - link

    "NVIDIA would need to shave the price down to justify its purchase once more (something they have not done on the GTX series in response to the 5870 and 5850)."

    ------------------

    I'd like to comment on this for just a moment. Where I live we haven't seen much stock of the new DX11 cards yet. However, suddenly there's a slew of highly priced 295s and other top-end Nvidia products that these stores were not stocking before.

    My bet is that people walking in to make a purchase find out they can't get that coveted new DX11 card, so they opt for one of those instead. So in a sense, Nvidia would be riding on the coattails of ATI's popular new line that just doesn't have the availability. They haven't had to lower prices yet because they may be benefiting from the lack of stocked cards.

    Make sense?
  • Ananke - Wednesday, October 14, 2009 - link

    No, it doesn't make sense :)
    Why would you spend $300-400 on something you didn't want in the first place? Why not just keep your money until you can actually buy what you want? We're not talking about 10 bucks; it's a way bigger chunk...
  • 7Enigma - Wednesday, October 14, 2009 - link

    Because people are impatient and need instant gratification. This happens everywhere...
  • erple2 - Wednesday, October 14, 2009 - link

    That, and people walking into stores to buy computer equipment aren't generally looking for the best deal.
  • Ananke - Wednesday, October 14, 2009 - link

    I respect the MONEY :). Wasting money on something that doesn't quite fit my intention is not my way. But people are different; I guess not everybody would do the same... My point is, better to pay for exactly what you want and need, otherwise you'll regret it later. Now, if you have that much money to waste, buy anything :) but that's a different story.

    The 5770 with that anemic 128-bit bus is worth less than $100, in my opinion. Above $100 you're just wasting money and getting nothing :)
  • just4U - Wednesday, October 14, 2009 - link

    Thing is, people shopping for a new card probably do not have one of the newer cards, or what they do have is fairly subpar. While they may have a good idea what they'd like to get...

    ... a lot of times you see them settle. Hell, even those among us who are tech savvy have done that from time to time. Seems to me that's sort of what Nvidia's doing right now.
  • Hrel - Wednesday, October 14, 2009 - link

    I need a new card, and I want it to be DX11, but the performance isn't there. I want something about 10 percent faster than a 1GB 4870 for about 150 bucks, and something about 10 percent faster than a 4890 for less than 200 bucks, with DX11. Once I see a card like that, from AMD or Nvidia, I'll buy it. My stupid X1650 Pro is REALLY limping along in modern games, and I'm starting to get sick of low res and minimum settings.
  • Ananke - Tuesday, October 13, 2009 - link

    Really good cards..... not worth the money though :)
    Did I just summarize it well?
  • snarfbot - Tuesday, October 13, 2009 - link

    It would be nice to see some benches with Crysis without AA.

    If they really are bandwidth limited, that would make the difference apparent.

    Also overclocked performance: if the memory is the limiting factor, then the 5750 would probably be pretty good bang for the buck.
  • Leyawiin - Tuesday, October 13, 2009 - link

    The HD 5850 was "wow". The HD 5770 is a little "meh". It's great to have something that sits between the HD 4850 and HD 4870 in performance with such low power requirements and noise, but that price has to come down.
  • silverblue - Tuesday, October 13, 2009 - link

    I view the 5770 as a natural successor to the 4770/4830/4850 (so I wouldn't expect a 5830 to appear, for example) as opposed to a replacement for the 4870. By now I'd expect 40nm yields to be much better than a few months back, when TSMC had issues producing the RV740 variants, so hopefully any defective dies are only minimally so and ATI can put them in the 5750 cards. Makes me wonder about the lower-range cards due next year, though.

    The Eyefinity ports are an enigma; however, they could make for a very nice business-class card, assuming anyone can afford those dongles.
  • CarrellK - Wednesday, October 14, 2009 - link

    If you are building an Eyefinity (EF) setup, you prolly don't have three monitors. You prolly have one or two. This means you will be buying at least one monitor. My advice: buy a DP monitor that matches the physical size & resolution of your existing monitors. That way you don't need to get an adapter.

    CarrellK
  • silverblue - Wednesday, October 14, 2009 - link

    With any luck they'll become plentiful in a short space of time, offering early adopters the chance to set up a decent EF, umm, setup.

    If you think the typical EF setup will be two or three monitors, do you expect the full six monitor glory with an X2 part? I'm still wondering if even the 5870 can handle three monitors and still offer smooth gaming performance. That said, despite their power they're not going to strictly be gaming cards.
  • papapapapapapapababy - Tuesday, October 13, 2009 - link

    The fact that these cards consume little power is irrelevant when you have that great efficiency on the 5800... Also, including the Eyefinity gimmick here is a mistake; it only diminishes the value of that feature on the 5800. It should have been one card. HD 5770:
    no Eyefinity, 800 SP, 750MHz, 512MB = $99 USD

  • CarrellK - Wednesday, October 14, 2009 - link

    Eyefinity (EF) will be in all 5xxx products for a multiplicity of useful reasons, many of which aren't apparent yet. There will be frequent roll-outs of new EF goodness. There will be many, many customers who will find EF very useful. Hopefully you will realize what EF can do for you and buy one of our products. We'd like for you to be a happy customer of ours.

    CarrellK
  • yacoub - Tuesday, October 13, 2009 - link

    " It should have been 1 card. HD 5770:
    no Eyefinity, 800 SP, 750MHz, 512MB = $99 USD "

    Close, but no. 1GB of VRAM is mandatory these days, and it needs a 256-bit bus or more texture units and ROPs. And then it could be $125.
  • yacoub - Tuesday, October 13, 2009 - link

    Something's wrong when two of these in CrossFire can't match a single 5850. Blah.
  • qwertymac93 - Tuesday, October 13, 2009 - link

    Why no mention of the 4770? I know it's older and slower, but it's also 40nm like the 5750 and is the same price. It would be nice to see the difference between the two, as they are specced quite closely (640 SP @ 750MHz vs. 720 SP @ 700MHz, both 128-bit GDDR5).
  • snarfbot - Tuesday, October 13, 2009 - link

    Legion Hardware has a good review comparing them both.

    The 5750 is between 1-3 fps faster.

    The 5750 has better overclocking potential thanks to the RAM, I guess, but I'm not sure if it's worth the extra 25 bucks.

    Kind of a wait-and-see thing for this part.
  • Seramics - Tuesday, October 13, 2009 - link

    Yet another good review from AT; thanks, Ryan. However, it's becoming clear that cards like the HD 5870 and HD 5770 aren't very good performers for their price. The HD 5850 and HD 5750 512MB represent a more solid bang for the buck. Again, it's very impressive that AMD has been able to bring us so many next-gen DX11 cards when Windows 7 isn't even launched yet, while their competitor has been super slow, only recently releasing non-high-end parts derived from the G200. That said, purely in terms of performance, Cypress and Juniper are kind of disappointing for their price, as well as for their specifications.
  • pullmyfoot - Tuesday, October 13, 2009 - link

    Hmm. I was expecting the 5770 to perform at 4890 levels, or at the very least slightly slower, while running cooler and drawing less power. This is quite disappointing. I was all ready to get one to replace my 4850 if the price was right. I wonder how well they can tweak the drivers for this thing.
  • yacoub - Tuesday, October 13, 2009 - link

    Same here, although I'd replace my 8800GT with it. I expected about 25% more performance, and about $10 less MSRP ($150).
  • hsoverma - Tuesday, October 13, 2009 - link

    I bought a 3870 for about $210 a year and a half ago. This card has double the performance, a lot more features, and is starting at $160. I figure by Christmas it will be down around $120. I run a small frag-box, so the lower heat and lower power make sense for me over a 4870, and if I ever wanted more power, I could run two of these in Crossfire and have all that I need. I am putting this card on my wish list. Thanks again for a great review.
  • 7Enigma - Tuesday, October 13, 2009 - link

    Great article, except for one flaw: not a single memory OC data point. If, as the data shows, the 5770 is performing poorly due to inadequate memory bandwidth (seemingly the ONLY issue hampering performance when comparing its specs to the 4870's), it makes sense that a simple overclock could shed some light on the issue. Please update the article with some numbers, as this card is mainstream enough that I would imagine overclockers *could* see it as a gem in disguise.
  • Ryan Smith - Tuesday, October 13, 2009 - link

    Once things calm down, we're going to do a 5800 series overclocking article. It's something that takes a while to put together, and there are major product launches every week right up through the end of this month.
  • 7Enigma - Wednesday, October 14, 2009 - link

    Can you give us a hint then whether this memory has the potential to overclock well enough to where the bottleneck is overcome? I doubt it can completely remedy the situation but if the memory overclock was enough to make up for the significantly narrower bus in most cases, I think this card would have a better reception.
  • Etern205 - Tuesday, October 13, 2009 - link

    Just a stupid question...

    ATI's Eyefinity requires a DisplayPort monitor to work properly (saw it in a YouTube video), and what it does is spread the image across the screens.

    My question is: can you use the ATI card with all monitors running independently, without the Eyefinity feature?


  • squeezee - Tuesday, October 13, 2009 - link

    Yes, you can run the displays independently, just as with any other card. However, the third monitor always needs to be DisplayPort (or use an ACTIVE DisplayPort adapter).
  • EJ257 - Tuesday, October 13, 2009 - link

    I'm sure there is a perfectly good explanation for this that I'm missing. You say there is one SIMD disabled in the 5750 vs. the 5770. Looking at the chart on the first page, there is a difference of 80 stream processors and 4 texture units between the 5750 and 5770, which would indicate 4 SIMDs are disabled.
  • EJ257 - Tuesday, October 13, 2009 - link

    Wait, never mind. I see each SIMD consists of 4 texture units and 16 SPs (stream processors), and each SP contains 5 stream cores. I guess in the chart, when you say stream processors, you really mean stream cores, right?
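    A quick sanity check of that count, in Python. This is just a sketch, assuming the spec chart's "Stream Processors" figure counts stream cores (5 per VLIW unit, 16 VLIW units per SIMD), which this thread doesn't itself state:

        # Juniper SIMD arithmetic (assumed VLIW5 layout: 16 units x 5 cores per SIMD)
        CORES_PER_SIMD = 16 * 5  # 80 "stream processors" in marketing terms
        TMUS_PER_SIMD = 4

        for name, cores, tmus in [("HD 5770", 800, 40), ("HD 5750", 720, 36)]:
            print(name, cores // CORES_PER_SIMD, "SIMDs by core count,",
                  tmus // TMUS_PER_SIMD, "by TMU count")
        # HD 5770: 10 SIMDs; HD 5750: 9 SIMDs -> exactly one SIMD disabled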
  • yacoub - Tuesday, October 13, 2009 - link

    Ryan - THANK YOU for including the 8800GT in the graphs: That is the card I (and many other potential buyers) will be upgrading from, moving to a DX11 card.

    It's too bad this new card (5770) can't quite catch the GTX260 C216, as that is its main NVidia competitor in performance and price. It uses just as much Load wattage as a 260, but seems to offer around 80-90% of the performance. Perhaps if they hadn't cut the bandwidth to 128-bit, it would have squashed the GTX-260. But ATI has a habit of under-bussing their cards and it continues to negatively impact high resolution performance, no matter what certain reviewers might claim about the bus-width not hurting performance. Time and again, testing shows potential for improvement from a wider bus.

    Blah. ATI, you always come so close to getting me to purchase but there's always something to hold me back. Perhaps if/when this card drops to $139 (without rebates). But by then NVidia might have their answer out and the GTX-260 would also drop in price or be replaced with a DX11 part, and then the 5770 again loses appeal.
  • MadMan007 - Tuesday, October 13, 2009 - link

    I think you'll pass out if you try to hold your breath waiting for a lower midrange NV DX11 answer to these cards.
  • flipmode - Tuesday, October 13, 2009 - link

    What in the world is going on with this game? 8800 GT beats the 4850? No, sorry, I don't buy that. Something is wrong here. The 5770 beats everything? If that is the case, then this game should immediately be removed from the bench suite - games in the bench suite should help us understand the general performance characteristics of the hardware and a game that returns such erratic results actually distorts that understanding.
  • Griswold - Tuesday, October 13, 2009 - link

    On page 13 you say:

    "The 3870 beats it by 14W at the cost of a significant degree of performance, while the 8800GT is neck-and-neck with the 4770, again with a decent-sized performance gap."

    You certainly meant 5770 there. But this brings me to a question: why isn't the 4770 included here? As an owner of that card, I'm very much interested in the performance/power/noise difference. Just ditch one of the relatively irrelevant SLI or CF combos; I don't think many people care about comparing high-end multi-GPU setups with performance parts such as the 5770 and 5750, even if it's 57xx in CF.
  • flipmode - Tuesday, October 13, 2009 - link

    Ryan - thanks so much for the review. Nice job. It does seem like a 5750 Crossfire setup would be an interesting value, more so than the 5770, since the latter is overpriced.

    And, Anand, I love your site, and don't take this personally, but, PLEASE GET A COMMENT SYSTEM THAT DOES NOT TOTALLY SUCK!

    Check out TechReport for an example of the awesomest comment system in the universe.

    PLEASE!
  • Ryan Smith - Tuesday, October 13, 2009 - link

    AMD only sent out 1 5750, so I don't have a second one to Crossfire at this time.
  • Roland00 - Tuesday, October 13, 2009 - link

    It makes no sense (besides bad drivers) for the 5770 to lose to the 4850. The 5770 has more memory bandwidth (76.8 GB/s) than the 4850 (63.55 GB/s), due to the 4850 sticking with GDDR3, even with the 128-bit bus. The 5770 is also clocked 36% faster than the 4850 (850MHz vs. 625MHz).

    Yet the 5770 underperforms, being almost tied with the 4850?
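    For what it's worth, those bandwidth figures check out. A small sketch (the effective transfer rates below are the commonly quoted ones, not figures from this article):

        # Peak memory bandwidth = bus width in bytes x effective transfer rate
        def bandwidth_gbs(bus_bits, effective_mts):
            return bus_bits / 8 * effective_mts / 1000  # GB/s

        print(bandwidth_gbs(128, 4800))  # HD 5770: GDDR5 at 4.8 GT/s -> 76.8
        print(bandwidth_gbs(256, 1986))  # HD 4850: GDDR3 at ~2.0 GT/s -> ~63.55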
  • Zool - Tuesday, October 13, 2009 - link

    Maybe the 4x64-bit memory controllers on the perimeter of the chip keep the data flowing better than 2x64-bit controllers with higher bandwidth.
    I think they could have made it at least 192-bit (3x64-bit).
  • Zool - Tuesday, October 13, 2009 - link

    Actually, where the hell are the Cypress and Juniper die shots?
    I can't find a single one on the net.
  • Ryan Smith - Tuesday, October 13, 2009 - link

    AMD is not releasing die shots.
  • dgz - Tuesday, October 13, 2009 - link

    Looks to me like AMD is trying to lure people into buying the remaining 48** cards. Once the old chips are cleared, the price of 57** will no doubt drop.
  • GrizzlyAdams - Tuesday, October 13, 2009 - link

    What really has me concerned is how the 5870 is scaling in these tests.

    The 5870 core is essentially two 5770s strapped together, and you would hope scaling would be near linear. When two 5770s in crossfire match or even beat a 5870 I'm left scratching my head.

    Somewhere there is a significant bottleneck in the 5870's design, and I'm wondering where that is. Anyone have any clue?

    Hopefully a driver update will fix these issues, because if not there is a lot of wasted silicon on each of these chips...
  • squeezee - Tuesday, October 13, 2009 - link

    Remember that there is more to the card than just the ROPs/TUs/ALUs. If the other logic is intact, it could give the dual 5770s a net larger amount of cache, and more resources for scheduling, rasterization, etc.
  • Ryan Smith - Tuesday, October 13, 2009 - link

    Exactly. Geometry is also a big thing; the 5800 series and 5700 series have the same geometry abilities. Unfortunately this isn't something we can really test in a meaningful manner.
  • Torres9 - Tuesday, October 13, 2009 - link

    "The 5770 is 108W at load and 18W at idle, meanwhile the 5850 is 86W at load and 16W at idle."

    Do you mean the 5750, or is the 5850 really that good?
  • ET - Tuesday, October 13, 2009 - link

    I'm again seeing many comments of "DX11 gives me nothing". Well, you buying it gives developers one more reason to develop for it. If you stick to DX10, then it'd take more time to move to DX11. Really. Until the majority of the market moves to a new feature set (and hopefully Windows 7 will help move out of DX9), developers will only use higher end features as "special features".
  • MadMan007 - Tuesday, October 13, 2009 - link

    1 word for real DX11 rollout: consoles.
  • ET - Thursday, October 15, 2009 - link

    You're right, though not in the way you think. Xbox programming is more like DX11 than DX9 or DX10, and the Xbox also has a tessellation unit (though simpler than the one in the DX11 parts), so moving to DX11 would make developers' lives easier.

    What users don't get is the difference between the API and hardware capabilities. Even if developers limit themselves to DX9-level capabilities for console compatibility, developing against DX10 or DX11 alone will be much easier than using both DX9 and DX10, and will result in faster and less buggy code (optimizing for two very different APIs is hard).
  • xipo - Tuesday, October 13, 2009 - link

    As MadMan007 says, there won't be large adoption among developers of DX11 until the NEXT generation of consoles ships (around 2012) supporting DX11... Win7 won't matter because game developers are still going to make games spanning DX9-DX11... Probably the very few games that come out DX11-only are going to be some kind of tech demos, and they'll suck!
  • ET - Tuesday, October 13, 2009 - link

    I haven't seen it stated, but I'd like to know if the 4850 benchmarked is 512MB or 1GB. If it's 512MB then the comparison with the 5750 isn't valid.
  • poohbear - Tuesday, October 13, 2009 - link

    You never mentioned that the performance of the 5770 might be a driver issue. The hardware is certainly capable of outdoing the 4870, as we can see in Far Cry 2, so maybe it's just a driver issue?
  • Ryan Smith - Tuesday, October 13, 2009 - link

    I don't believe it's a driver issue. If anything it's a Far Cry 2-specific issue, but that's something I'm going to have to do some more digging for.
  • GrizzlyAdams - Tuesday, October 13, 2009 - link

    That may be due to some architectural improvements in the 5770's shaders. The drop in performance in other games may be due to the decreased memory bandwidth, which may not matter with regards to Far Cry 2.
  • papapapapapapapababy - Tuesday, October 13, 2009 - link

    These cards are super lame... the 5750, now with 80 more stream processors! XD That 5750 is basically a (lower clocked!) 4770... Guess what, ATI? That cost me $85 six months ago! But who cares, right? Nvidia is dead, so why bother? Just slap on a DX11 sticker and raise the price, ATI?
  • The0ne - Tuesday, October 13, 2009 - link

    Just wanted to say I like the conclusion; it's dead-on with its suggestions and advice.

    I'm very surprised almost no one is talking about or bringing up the subject of DirectX. DX11 has a better chance to succeed, yet gets less attention. It's amazing how bad DX10 was, enough to sway consumers into an about-face.
  • kmmatney - Tuesday, October 13, 2009 - link

    The problem with DX10 was that you had to buy Vista to get it...
  • MadMan007 - Tuesday, October 13, 2009 - link

    DX10 rendering paths of games that were also DX9 (meaning all of them, at the time and even now) were also *slower* and provided little to no image-quality improvement. So even if it hadn't been Vista-only (and only morans keep up the Vista FUD after SP1), there was no real benefit. DX11 looks to be different in all respects.
  • Lifted - Wednesday, October 14, 2009 - link

    Yeah, get a brain!

    http://24ahead.com/images/get-a-brain-morans.jpg
  • Zool - Tuesday, October 13, 2009 - link

    Quite strange that with a die size of 166mm2 against 260mm2 (RV770), and with 128-bit memory, it costs this much. And the 5750 has one SIMD disabled, which even increases the amount of usable chips (but maybe it's disabled just for differentiation, or else the two cards would be exactly the same except for clocks).
    Is the tessellation part, with its fixed-function units, exactly the same as in the 5800 series, or cut down?
  • philosofool - Wednesday, October 14, 2009 - link

    I chalk it up to lowish 40nm yields at TSMC.
  • Spoelie - Wednesday, October 14, 2009 - link

    + higher cost per wafer than a 55nm one
    + GDDR5 prices
  • Mint - Tuesday, October 13, 2009 - link

    Unless you absolutely need to take advantage of the lower power requirements of the 40nm process (e.g. you pay a ton for power)...

    According to your tests, the 5770 consumes a whopping 48W less idle power than the 4870, and other reviews have comparable results. If your computer is out of standby a modest 10 hours a day, that works out to 175 kWh per year. That's easily $15/year even for people with cheap electricity.

    The funny thing is that I usually see people overstating the savings from power efficiency...
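    That 175 kWh figure checks out; a sketch of the arithmetic (the electricity rate is an assumption for "cheap" power, not a figure from the review):

        # Annual cost of a 48W idle-power difference at 10 hours/day
        delta_w, hours_per_day, usd_per_kwh = 48, 10, 0.0857
        kwh_per_year = delta_w * hours_per_day * 365 / 1000  # ~175 kWh
        print(round(kwh_per_year * usd_per_kwh, 2))          # ~15.02 USD/year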
  • strikeback03 - Tuesday, October 13, 2009 - link

    10 hours a day is modest? That seems high to me; unless you're doing work that pays on this, I would think most people don't have 10 hours a day for recreational computing.
  • Mint - Tuesday, October 13, 2009 - link

    We're not talking about most people, we're talking about people who bother to get a 5770 instead of living with IGPs. Many people leave their computer on 24/7 to download torrents or fold or act as a file server (it's nice to access it from work) or whatever. I think 10 hours is a reasonable average for the target audience.

    Even if you reduce it to 5 hours a day, though, that's still $8/year. I like to keep video cards for a long time (usu. 2 years or more), and even when I upgrade, the old one is usually handed down.

    My point is that it's not something to ignore when comparing to the 4870. It was much less relevant for $300 cards with a 20-30W difference (4870 vs. GTX 260 at launch), but now it's a 50W difference on $150 cards.
  • UNHchabo - Wednesday, October 14, 2009 - link

    Personally, I wish that the 4770 had been included in the power charts. It may be a largely irrelevant card for price/performance, but it's still the cheapest 40nm card that AMD makes.
  • Zingam - Tuesday, October 13, 2009 - link

    Real competition does wonderful things! If NVIDIA hadn't done so well with the 8800, we would never have had these great prices from ATI today!

    Unfortunately there is nothing like that on the CPU side. :(
  • MadMan007 - Tuesday, October 13, 2009 - link

    Is the GTS 250 512MB or 1GB? It's not even stated in the test setup notes.
  • Ryan Smith - Tuesday, October 13, 2009 - link

    1GB.
  • Adul - Tuesday, October 13, 2009 - link

    http://www.monoprice.com/products/product.asp?c_id...

    As long as the video card supports outputting HDMI through the DisplayPort, this will do. So the question is: does it support HDMI signals through the DisplayPort?
  • Ryan Smith - Tuesday, October 13, 2009 - link

    Passive dongles are not supported on the 5000 series. It has to be an active dongle.
  • danielkza - Tuesday, October 13, 2009 - link

    There's a typo on page 5: I think you meant 'GTS 250' instead of 'GTX 250' (first paragraph after the charts).
  • Skiprudder - Tuesday, October 13, 2009 - link

    Thanks for the review!

    I guess I'm rather surprised at the 5770 results being consistently lower than the 4870's as well, and would be interested in a bit more hypothesizing as to why exactly this is the case, when the stats on the cards suggest they should be at minimum roughly equivalent. Is this the sort of situation that might see large changes with updated versions of Catalyst?
  • Spoelie - Tuesday, October 13, 2009 - link

    The difference is obviously the memory bandwidth. It seems to me that ATI should have gone with a 192-bit bus; this change alone would have made the HD 57x0 a worthy successor to the HD 48x0 range, without any performance caveats, while still being significantly cheaper to manufacture (40nm vs. 55nm, 192-bit vs. 256-bit).
  • Skiprudder - Tuesday, October 13, 2009 - link

    I'm guessing you're right, but I'd like to see Anand (or Ryan) do one of the track-down-the-engineers pieces this site is famous for, and hear the rationale on AMD's part.
  • CarrellK - Wednesday, October 14, 2009 - link

    Anand knows where I live...

    CarrellK
  • futrtrubl - Tuesday, October 13, 2009 - link

    Or even just overclock the memory and see how performance scales. That would provide some evidence for the memory bottleneck theory.
  • plague911 - Tuesday, October 13, 2009 - link

    At this price point it looks like the 5770 and 5750 are priced to pad AMD's pockets, not to provide increased performance (not that that's a bad thing when you're in a war with Intel). With the smaller process and die size and similar performance, each new part sold will net AMD a substantially higher profit. This is why AMD will likely kill off the older generation instead of dropping its price.
  • MadMan007 - Tuesday, October 13, 2009 - link

    Yes, I think that's where my mild disappointment comes from. Not that they aren't great cards at the launch MSRP; they just aren't great in light of street prices. Unlike the HD 4800, or even arguably the HD 5800, AMD doesn't seem interested in shifting the price/performance curve with these cards. At best matching the current price/performance curve leaves me a bit cold.
  • MonkeyPaw - Tuesday, October 13, 2009 - link

    That's been the trend from ATI lately with their mid-grade cards. The 5700 series is meant to offer roughly the same performance as the 4800 series for a cheaper price. The 4600 series last time was meant to match the 3800 series (the 4770 was quite an oddball, though). It's not a bad system, really, as it allows ATI to migrate their lineup with some consistency.
  • Lonyo - Tuesday, October 13, 2009 - link

    It might be that some of the cost does indeed come from the RAM though.
    Once GDDR5 chips drop some more, it will be easy for AMD to drop the prices on these cards, but that might (might) be what's limiting pricing options.

    Or AMD just want to try and get maximum profit from these cards.
    But even so, when GDDR5 prices drop it will be easier to extract profit at lower prices, so GDDR5 pricing will still be at least partly responsible.
  • geok1ng - Tuesday, October 13, 2009 - link

    Reading the charts, it's obvious that it is upgrade time: let's get 4850s, 4870s, and even 4850 X2s and 4870 X2s in the next few weeks before these cards phase out; they are faster and a LOT cheaper than the 57xx series. As for high-end consumers, just wait for the 5870 X2. Now that is a card to roll your eyes at, when and IF it launches.
  • codedivine - Tuesday, October 13, 2009 - link

    This is relevant only for compute folks like me, but does 57xx support double precision?
  • Anand Lal Shimpi - Tuesday, October 13, 2009 - link

    I don't like to make a habit of disagreeing with Ryan, but unfortunately only Cypress based cards support double precision. The 57xx series does *not* support double precision.

    Take care,
    Anand
  • MadMan007 - Tuesday, October 13, 2009 - link

    So where is the double precision implemented? I didn't bother to look it up, but I imagine it's buried deep in the shaders. If so, why take it out? Is it just disabled, or not present at all? If not present, I guess I could see removal for the sake of fewer transistors, but otherwise it seems like artificial market segmentation. On the other hand, hardcore compute people, where time = $$, won't have a problem getting a 5850 or better, or seeing what NV does.
  • CarrellK - Wednesday, October 14, 2009 - link

    DPFP (Double Precision Floating Point) is physically not in the Juniper GPU - it is not artificial segmentation. We had to choose between giving you a GPU that would be great for consumer HPC and games at a price you could afford, or something that cost notably more.

    There are virtually zero consumer applications that need DPFP.

    CarrellK
  • stmok - Tuesday, October 13, 2009 - link

    According to ATI's Stream SDK v1.4 page...

    Desktop cards that support double precision: Radeon HD 3690, 3830, 3850, 3870, 3870 X2, 4770, 4830, 4850, 4850 X2, 4870, 4870 X2, 4890.

    Mobile GPUs that support double precision: Mobility Radeon 3850, 3870, 4850, 4850 X2, 4870.

    None of their IGPs support it.

    Their newer Stream SDK 2.0 series (currently in Beta 4) mentions they now support OpenCL on the GPU, and that the Radeon HD 5870, 5850, 5770, and 5750 are supported. No mention of which can actually do double precision, though...

    Still, considering the 5770 looks similar in spec to the 4870/4850, it may support it. (The major difference seems to be the Memory Bus Width.)

    Come to think of it, what are the requirements to support double precision on a Radeon HD-series GPU?
  • codedivine - Tuesday, October 13, 2009 - link

    That's sad :( ... thanks for the info!
  • Ryan Smith - Tuesday, October 13, 2009 - link

    My understanding is that it's available in the entire Evergreen lineup. So I'm going to give you a tentative "yes".
  • codedivine - Tuesday, October 13, 2009 - link

    Thanks!
  • endlesszeal - Tuesday, October 13, 2009 - link

    As always from AnandTech, a great review. However, I almost crapped my pants when I saw the price of a "DisplayPort to DVI" dongle: $100?? Hope that's just the inflated Apple price and not the average. =)
  • Zingam - Tuesday, October 13, 2009 - link

    You don't really need that dongle anyway.
  • Ryan Smith - Tuesday, October 13, 2009 - link

    Actually, the Apple adapter is still the only active adapter I'm aware of that's widely available. So yes, that $100 is because of the Apple price.
  • endlesszeal - Tuesday, October 13, 2009 - link

    Sorry if this seems newbish, since I'm still using DVI. Anyway, I took a quick peek at Apple's site and only saw a Mini DisplayPort-to-DVI dongle. However, I jumped over to Monoprice and found this:

    http://www.monoprice.com/products/product.asp?c_id...

    would that work?
  • Xajel - Tuesday, October 13, 2009 - link

    Nope, this won't work. The card has only two TMDS transmitters, for one DVI plus one DVI or HDMI; you can't use two DVIs + HDMI...

    If you want to connect a third monitor, you have to use DisplayPort, and adapters won't work, since DP on this card doesn't support DVI pass-through (that would need a separate TMDS chip).

    There are some devices that support DVI/HDMI pass-through using DisplayPort. I'm talking about Apple's latest Macs, where they dropped DVI/HDMI and replaced it with DP; those support DVI/HDMI adapters because they have their own TMDS chip, which is required for DVI/HDMI signals...
  • elfick - Tuesday, October 13, 2009 - link

    Monitors with HDMI seem fairly common and DP-HDMI adapters appear to be cheap. Could you do DP-HDMI, HDMI, and DVI for a triple monitor setup?
  • Ryan Smith - Tuesday, October 13, 2009 - link

    DP-HDMI is still a passive converter, so it still won't work.
  • Ryan Smith - Tuesday, October 13, 2009 - link

    No, it has to be an active (powered) adapter. You can tell if one is active if it has a USB plug, since that's where they're drawing power from.
  • Minion4Hire - Tuesday, October 13, 2009 - link

    It has to be powered if you wish to run dual-link DVI. The single-link Monoprice adapter will work fine for resolutions up to 1920x1200. But most people looking to run Eyefinity will probably want to go whole-hog with 2560x1600, given the large price tag already associated with such a setup.
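    A rough check of that single-link cutoff. This is a sketch; the raster totals below assume CVT reduced-blanking timings, which aren't given anywhere in this thread:

        # Required TMDS pixel clock vs. the 165MHz single-link DVI cap
        SINGLE_LINK_MHZ = 165

        def pixel_clock_mhz(h_total, v_total, hz=60):
            return h_total * v_total * hz / 1e6

        print(pixel_clock_mhz(2080, 1235))  # 1920x1200 -> ~154 MHz, fits single link
        print(pixel_clock_mhz(2720, 1646))  # 2560x1600 -> ~269 MHz, needs dual link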
  • kzig - Thursday, October 29, 2009 - link

    If I want to run 3 1280 x 1024 monitors together as 3840 x 1024 in Eyefinity, will I need an active adapter, or can I use a cheaper passive one?
  • BladeVenom - Tuesday, October 13, 2009 - link

    The Apple one is poorly rated. Dell has one, but it too is $100. And just to re-repeat that: it has to be an active adapter. http://accessories.us.dell.com/sna/products/Cables...
  • Ryan Smith - Tuesday, October 13, 2009 - link

    I was going to respond to this, but Xajel took the words out of my mouth. Just read his post, it explains why an active adapter is required.
  • bijeshn - Tuesday, October 13, 2009 - link

    Phasing out the 4870 is a bad idea. With time I look forward to the 4870 dropping even lower in price...
  • Mint - Tuesday, October 13, 2009 - link

    The 4870 will only drop in price to clear inventory, because it's not worth it to produce them with the intent of selling them at $120 or less. I expect them to sell out before the price drops much further.

    Don't fret, though. The 5770 has a 128-bit bus and a fairly small die. It will drop in price soon enough, unless NVidia decides to stop bleeding $$ on its huge GT200 chips on $150 cards and Fermi-based mainstream cards can't get down in price.
  • samspqr - Tuesday, October 13, 2009 - link

    well, my favorite retailer (alternate.de) already has the 5750 and 5770 in stock, at 130eur and 160eur respectively

    they also have the 4870-1GB at 115eur, which is MUCH cheaper

    in any case, right now, with my usage pattern (24/7 on, but mostly GPU-idle, maybe just one hour a day of GPU stress), the difference in power consumption between the 4870 and the 5770 is at least 50w, which means ((50*24*365)/1000)*0.15eur/KWh = 65.7eur/year

    so it pays for the difference in just over 6 months, at the expense of slightly lower performance, with the advantage of less noise

    speaking of which, I like my GPUs silent, passive if possible, thankyouverymuch, so I'll wait for vendor-specific designs or after-market coolers; by the time these are out, maybe the 4870 will not even be available anymore
  • samspqr - Tuesday, October 13, 2009 - link

    (sorry, that was 8 months, don't know how I got that 6 the first time)
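    The corrected payback math, as a sketch using the alternate.de prices and the 0.15 EUR/kWh rate quoted above:

        # Months for the 5770's power savings to cover its premium over a 4870 1GB
        delta_w, eur_per_kwh = 50, 0.15
        price_gap_eur = 160 - 115
        eur_per_year = delta_w * 24 * 365 / 1000 * eur_per_kwh  # ~65.7 EUR/year
        print(round(price_gap_eur / eur_per_year * 12, 1))      # ~8.2 months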
  • 7Enigma - Tuesday, October 13, 2009 - link

    Search for and download GPUTool. It's still in beta and has some quirks, but for a massive idle power drop it cannot be beat (at least on my system, a 4870). I simply lowered the 2D core/memory clocks (there are low/medium/high settings, and ALL need to be the same or you get flickering) down to around 250MHz, and this dropped idle power consumption by a crazy amount (40-80W, I can't remember exactly). Once the creator of the program releases a newer version, I'm hoping some of the fan-speed and voltage-mod bugs get worked out. Even so, the two-second click to lower idle speeds is incredibly handy.

    HTH
  • chrnochime - Wednesday, October 14, 2009 - link

    I don't know if you've tried ATI Tray Tools already, but after scouring the web trying to figure out a way to keep my XFX 4870 1GB from drawing more power than needed (e.g. when just surfing or playing video), I was able to drop the GPU clock to 400MHz and the memory to 225MHz. The memory draws much more power than the GPU does, so leaving the GPU at 400 doesn't really make that much of a difference compared to 250.

    Keep in mind that running said program in Vista is somewhat of a headache, since the driver is not signed by MS, so you need a workaround to get it running as a startup program so the clock drops can be initiated by the program.
  • rdh - Tuesday, January 4, 2011 - link

    A year later, the 5770 is *cheaper* than the 4850/4870. I just purchased one for $99 from the egg. It consumes 30% less power at idle and at load than the 48xx cards.

    I suppose at the $160 price point, it was fine to slam this card. At the current price point, though, it is the BEST price-for-performance and performance-per-watt card out there.
