AMD’s Radeon HD 5770 & 5750: DirectX 11 for the Mainstream Crowd
by Ryan Smith on October 13, 2009 12:00 AM EST - Posted in
- GPUs
Conclusion
When we were looking at the Radeon HD 5800 series cards, writing a conclusion was rather easy for us. We didn’t need to talk about intangibles like Eyefinity or DirectX 11, because the 5800 series brought better performance at better prices for existing games. This made recommending the 5800 series a straightforward thing to do.
But we don’t have that luxury with the Radeon HD 5700 series. The value of the 5770 in particular is clearly not going to be in its performance. Compared to AMD’s 4870, it loses more often than it wins, and if we throw out Far Cry 2, it’s around 10% slower overall. It also spends most of its time losing to NVIDIA’s GTX 260, a card the 4870 didn’t have so much trouble with. AMD has clearly put themselves into a hole with memory bandwidth, and the 5770 doesn’t have enough of it to reach the performance level it needs.
If you value solely performance in today’s games, we can’t recommend the 5770. Either the 4870 1GB or the GTX 260 would be the better buy.
But don’t mistake that for a wholesale write-off of the 5770. As a 40nm product it runs cooler and quieter than a 4870 or a GTX 260. As a DirectX 11 product it has longer legs for future games and applications using DirectCompute 5.0. Eyefinity is there too, but given the card’s performance (not to mention the cost of additional monitors), it’s not something we’d seriously expect to see used on a 5770.
Our jobs would be made much easier if AMD had either made the 5770 perform at parity with the 4870, or made the 5770 cheaper. Right now on a good deal we can swing a 4870 for $140, while the 5770 will be sticking to $160. That’s 14% more for a card that performs 10% worse. If we take a linear extrapolation, the 5770 needs to be at around $130 to win on performance alone, or at the very least $140 so that we can talk solely about the 10% performance loss versus the extra functionality of the 5770.
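The price/performance math above can be checked with a quick back-of-the-envelope calculation. The prices and the roughly 10% performance gap are the article’s own figures; the “fair price” line is simply the linear extrapolation the text assumes, as a sketch:

```python
# Sanity-check of the article's price/performance arithmetic.
# $140 (4870 on sale) and $160 (5770) are the article's prices;
# perf_ratio reflects the ~10% performance deficit cited above.

price_4870 = 140.0   # street price of a 1GB 4870 on a good deal
price_5770 = 160.0   # expected 5770 price
perf_ratio = 0.90    # 5770 performs ~10% worse than the 4870

premium = price_5770 / price_4870 - 1
print(f"5770 price premium: {premium:.0%}")            # ~14% more money

# Linear extrapolation: to match the 4870's price/performance,
# the 5770 would need to cost about perf_ratio * price_4870.
fair_price = perf_ratio * price_4870
print(f"Performance-parity price: ${fair_price:.0f}")  # $126, i.e. around $130
```

This is where the article’s “14% more for 10% less” and the roughly $130 parity price come from.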
AMD believes that DirectX 11 is the key to the success of the Radeon 5700 series, and in a way they’re right. If DirectX 11 takes off quickly, then buying a 5770 over a 4870 or GTX 260 right now would be wise. But that’s a future that’s hard to predict, and one we got burned on somewhat with DX10 and DX10.1.
So here’s the bottom line for the 5770: unless you absolutely need the lower power requirements of the 40nm process (e.g. you pay a ton for power), or you strongly believe that DirectX 11 will see a developer adoption rate faster than anything we’ve seen before for DirectX, the 1GB 4870 or GTX 260 is still the way to go. Or to put it another way, outside of those two circumstances we’re still at the status quo. A 1GB 4870 will remain the better choice until the price difference between one on sale and a 5770 drops below $10-$15, at which point we could justify rolling the dice and paying a bit more for the 5770. AMD is their own enemy here, which means we won’t be very surprised if the 4870 goes away quickly once the 5770 is plentiful.
There is a brighter side today, and that’s the 5750. It wins as much as it loses, and overall it’s just as good as the 4850 when it comes to performance. The pricing is no different either, which means you’re paying the same amount of money for a card of similar performance, better features, and better power characteristics. It’s a no-brainer. Along the same lines the GTS 250 and the 5750 end up going back and forth enough that there’s no consistent performance difference. We’ll take DirectX 11 and 40nm over PhysX and CUDA any day of the week, so the GTS 250 becomes the next Evergreen victim. NVIDIA would need to shave the price down to justify its purchase once more (something they have not done on the GTX series in response to the 5870 and 5850).
The 5750 also whets our appetite for a great HTPC card, with its excellent power characteristics and bitstreaming audio support. However, it risks being overkill for that market: an HTPC doesn’t need this much performance, and could always benefit from lower thermals. A passively cooled 5750 in particular would make a good HTPC card, but we’d look to next year’s Radeon HD 5600 series for our perfect HTPC card, if you can wait that long.
With the launch of the 5700 series, AMD finally gets to take a breather. Two of their four Evergreen chips have launched, and nothing else is scheduled in the near future. Look for the release of the X2 series (Hemlock) late this year, and then the final two Evergreen chips will drop next year. But for now half of the job is done, with AMD having pushed out DX11 parts at $110 and above in the very short span of three weeks. It’s a pace that makes the proliferation of DX10 parts look absolutely glacial in comparison.
117 Comments
papapapapapapapababy - Saturday, October 17, 2009 - link
http://ht4u.net/reviews/2009/amd_ati_radeon_hd_570... (use Babel)
Zool - Wednesday, October 14, 2009 - link
Does the 5700 series have the same improved shader-based adaptive antialiasing as the 5800 series? Reviews could include an antialiasing graph with different resolutions and antialiasing types for each card.
RDaneel - Wednesday, October 14, 2009 - link
I may be in the minority, but I've already ordered a 5750. For a SOHO box used for only occasional gaming, it was the most future-proofed option (DX11) that also has low enough idle draw that it will actually save me enough money over the life of the card to justify any price difference with a 48xx card. Would I have loved 10% more performance? Sure, but this isn't a bad blend of efficiency and longevity.
yacoub - Wednesday, October 14, 2009 - link
imho, it's perfect for that situation. Those of us who have a gaming PC with a DX10 card in it are the ones who find this 5700 series less than stellar.
ET - Thursday, October 15, 2009 - link
But those of us who have a mid-range PC with yesteryear's DX10 card (3870) find it appealing. :)
SantaAna12 - Wednesday, October 14, 2009 - link
are you filtering out comments these days?
SantaAna12 - Wednesday, October 14, 2009 - link
what up AT? ive been lookin at your recent AMD rants, and its getting tiresome. They paying u the big bucks these days? When you only compare AMD cards against AMD cards you are doing your site a disservice. When you show CF but no SLI you are showing me a new AT.
I have expected more in the past. You goin the route of TOMS?
Ryan Smith - Wednesday, October 14, 2009 - link
As we noted in the article, the CF configuration is mostly for academic reasons. I don't seriously expect anyone to pick it over a single card.
Anyhow, what would you like to see? I have the SLI data for the 275 and the 285, but since we've already established that the 260 C216 is faster than the 5770, it won't really tell you anything useful.
just4U - Wednesday, October 14, 2009 - link
"NVIDIA would need to shave the price down to justify its purchase once more (something they have not done on the GTX series in response to the 5870 and 5850)."
------------------
I'd like to comment on this for just a moment. Where I live we haven't seen much stock of the new DX11 cards yet. However, suddenly there's a slew of highly priced 295s and other top-end NVIDIA products that these stores were not stocking before.
My bet is that people walking in to make a purchase find out that they can't get that coveted new DX11 card, so they opt for one of those instead. So in a sense NVIDIA would be riding on the coattails of ATI's popular new line that just doesn't have the availability. They haven't had to lower prices yet because they may be benefiting from the lack of stocked cards.
Make sense?
Ananke - Wednesday, October 14, 2009 - link
No, it doesn't make sense :) Why would you spend $300-400 on something you don't want in the first place? Why not just keep your money until you can actually buy what you want? We are not talking about 10 bucks, it's a much bigger chunk...