61 Comments
yankeeDDL - Monday, October 7, 2019 - link
I have an R9 270X. How do I know how it compares with the RX 5500 in terms of absolute performance? (I am sure that in perf/watt there's no story.)
Smell This - Monday, October 7, 2019 - link
The R9 270X (or, Radeon HD 7870) is a GCN 1.0 "Southern Islands" generation derivative. Through the miracles of extrapolation when compared to the Radeon RX 5700 series (and 6 years of *advancement*), I will guess 2X the performance, AND ...
It's amazing that the Radeon RX 5500 series effectively becomes your discrete, entry-level gaming card for the masses. Shazzzam.
As noted, this would be "Video Core Next" (VCN 1.0) of the "Unified Video Decoder" or, the original ATI "Avivo" (if I remember correctly ...)
schujj07 - Monday, October 7, 2019 - link
Considering the 290X "Uber" is 50-100% faster than the 270X, and the RX 580 is 0-15% faster than the 290X "Uber", the RX 5500X will easily be 50-125% faster than your 270X.
Beany2013 - Monday, October 7, 2019 - link
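A note on the arithmetic: chained "X% faster" figures compound multiplicatively rather than add, so the upper bound works out to roughly 130%. A minimal sketch (the helper name is mine, for illustration):

```python
# Compounding relative speedups: "X% faster" means a multiplier of (1 + X/100).
# The percentage ranges come from the comment above; the endpoints multiply.

def compound(*speedups_pct):
    """Combine chained 'X% faster' figures into one overall multiplier."""
    total = 1.0
    for pct in speedups_pct:
        total *= 1.0 + pct / 100.0
    return total

low = compound(50, 0)     # 290X is 50% faster, RX 580 adds 0%  -> 1.5x
high = compound(100, 15)  # 290X is 100% faster, RX 580 adds 15% -> 2.3x
print(f"{(low - 1) * 100:.0f}% to {(high - 1) * 100:.0f}% faster")  # 50% to 130% faster
```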
I've got an R280 (effectively a rebranded...7970? I think) and my demands aren't high - 1080p, mostly older games. Running Linux (and thus not having Vulkan support on the 280...) one of these may do very nicely to keep me vaguely in the loop.
mikato - Monday, October 21, 2019 - link
You could maybe also find a nice RX 580 8GB that was previously used for cryptocoin mining for $100. That might be better if this RX 5500 is going to be over $150 but with similar performance to the RX 580/590.
quadibloc - Monday, October 7, 2019 - link
There's also no story in FP64 performance: there, your older card is likely to be 'way ahead. If, of course, that is important to you. No, I'm mistaken: the R9 290X had impressive FP64 performance that current cards don't match, but that didn't extend down to the 270 and 270X.
ballsystemlord - Tuesday, October 8, 2019 - link
The new cards have buggy and poor OpenCL drivers. Their HW is also said to be inferior with respect to the FP64 metrics. Buyer beware.
Demiurge - Wednesday, October 9, 2019 - link
Just buy a Radeon VII and develop the OpenCL driver implementation with the GitHub folks. You get what you pay for...
Also, just look up what a Tesla V100 costs for giggles. Look at the power and FP64 of the Tesla K80 as a reference. You may appreciate even having modern-GPU FP64 rates at an order of magnitude lower cost.
juergbi - Monday, October 7, 2019 - link
juergbi - Monday, October 7, 2019 - link
https://www.amd.com/en/products/graphics/amd-radeo... lists TBP as 110W, not 150W.
chilidogs - Monday, October 7, 2019 - link
Where are you seeing that? It's currently showing "Requirements: Typical Board Power (Desktop) 150W; PSU Recommendation 550W".
juergbi - Monday, October 7, 2019 - link
It was 110W but they changed it to 150W.
benedict - Monday, October 7, 2019 - link
So 2 and a half years after the release of RX 580 they are giving us a card with the performance of RX 580 at the price of RX 580 with almost the same energy consumption as RX 580. That's a hell of a lot of progress.
WinterCharm - Monday, October 7, 2019 - link
Performance of a 580, price of a 580... and 110W TDP (instead of 240W).
WinterCharm - Monday, October 7, 2019 - link
Note: AMD's own website contradicts Anandtech's reporting here: https://www.amd.com/en/products/graphics/amd-radeo...
Andrei Frumusanu - Monday, October 7, 2019 - link
It looks like all the media got pre-briefed on 150W; we're trying to confirm which one is correct.
chilidogs - Monday, October 7, 2019 - link
I'm guessing this is like the RX 480 situation, where it was widely rumored that the TDP was 110W, then the card ended up being 150W. But the chip itself was 110W, so the rumor was accurate; people just didn't understand what the rumor was referring to.
MikhailT - Monday, October 7, 2019 - link
Can you refresh the page and confirm you're still seeing 110W? It says 150W for me.
Andrei Frumusanu - Monday, October 7, 2019 - link
AMD confirmed it's 150W, the product page has been corrected.
ywyak - Monday, October 7, 2019 - link
If rumors are accurate, Navi 10 is a midrange product and this announcement is a low-end SKU. GPU pricing has shifted, sadly.
Smell This - Monday, October 7, 2019 - link
I beg to differ: performance per dollar has increased considerably.
The AMD Ryzen 5 3400G with a Vega 11 GPU/APU is $144. The discrete Radeon HD 7770 originally retailed at $159, for essentially the same graphics performance.
Death666Angel - Tuesday, October 8, 2019 - link
Death666Angel - Tuesday, October 8, 2019 - link
First of all, you can't compare dGPUs and iGPUs when the OP is clearly talking about dGPUs. Secondly, if your measure of "considerably increased performance per dollar" is paying 15 bucks less for the same performance some 7 years later, I have a bridge to sell you.
GreenReaper - Tuesday, October 8, 2019 - link
15 bucks less *and* you get the equivalent of a capable CPU bundled in as well!
Obviously if you already have a CPU or want a more capable one, it isn't for you. But it does say something about progress. Even if it's "AMD can't get the same money for treading water on GPU". (And really, I don't think this is the case; there have been improvements, in power especially.)
ET - Monday, October 7, 2019 - link
ET - Monday, October 7, 2019 - link
I haven't seen any mention of price, and from figures on other sites, it looked like this is closer to RX 590 performance. Plus, the RX 580 has a TDP of 185W. So this entire sentence seems to be off.
brucethemoose - Monday, October 7, 2019 - link
"If these TGPs are accurate, then this may be another case of AMD favoring absolute performance over performance efficiency for their mainstream parts"
It makes sense from a business perspective. I'd guess that most "mainstream GPU" customers aren't particularly interested in power consumption or OC headroom, and AMD gets plenty of room to use the same chip in lower-end SKUs. The real cost for AMD is a more expensive PCB and maaaybe long-term electromigration concerns, which are easy problems to swallow.
Death666Angel - Tuesday, October 8, 2019 - link
Death666Angel - Tuesday, October 8, 2019 - link
But the stereotype is already "AMD cards are too freakin' loud", and considering AMD board partners often skimp on cooling more than on Nvidia cards, high board power consumption can hurt sales.
GreenReaper - Tuesday, October 8, 2019 - link
Right, but that's been the case for a while now. If you want a quiet card, you'll be able to buy it, but it'll be even less powerful. They're aiming for the crowd which wants some performance at a relatively low price, and is willing to deal with sound and (in this case) power usage as part of the cost. They and their board partners are kinda stuck with the lower end of the market because they simply can't provide an equivalent product at an equivalent price - yet.
mikato - Monday, October 21, 2019 - link
Yes, as long as they keep the PSU requirement somewhere around 500 watts and it doesn't get out of hand with heat and noise. If it worked before, and is doable with the new gen, then why not? My Sapphire Nitro RX 580 is nice and quiet. The 580 was a very successful GPU, so use that as a guide.
Davenreturns - Monday, October 7, 2019 - link
I'm seeing incorrect specs being reported more and more by my favorite hardware review sites. It's very depressing.
https://www.amd.com/en/products/graphics/amd-radeo...
The typical board power is 110W. The ROPs are 32. This is direct from AMD and not some rumor.
Yojimbo - Monday, October 7, 2019 - link
Your favorite hardware review sites got it directly from AMD, too, not from some rumor.
MikhailT - Monday, October 7, 2019 - link
You may want to refresh, it says
Typical Board Power (Desktop) 150 W
for me.
Davenreturns - Monday, October 7, 2019 - link
Yep, they changed it and I look like an idiot in the process. Sorry Ryan. But this still doesn't make sense. Why is the TDP so high?
Ryan Smith - Monday, October 7, 2019 - link
At the moment, I don't know anything more than what's in the article. But my best guess is that AMD is juicing it for maximum performance. (Assuming there isn't a process leakage-related factor as well.)
haukionkannel - Tuesday, October 8, 2019 - link
Most likely that is when board partners allow the 5500 to boost as high as it can. The 5500 has the same boost system that goes as high as it can, considering heat and power delivery. So if the aftermarket cooler is very good and the board maker allows the 5500 to eat as much power as it can... 150W is what you get.
But if you use a smaller cooler and reduce the power, you will get near 85W, or even less, maybe 60W in mobile devices. So there will be very high variation among OEM builds in how high it will go!
Demiurge - Wednesday, October 9, 2019 - link
Demiurge - Wednesday, October 9, 2019 - link
Do you know if that includes the power for the fans, or just the core plus memory? I am shocked how much power is used by the fans on some of these designs.
WinterCharm - Monday, October 7, 2019 - link
I'm happy they're going with an axial cooler on this card.
Howling Mad Merdoc - Monday, October 7, 2019 - link
Please bring back the tweets to the main page. I find there is not enough content for people who visit your site daily.
HardwareDufus - Monday, October 7, 2019 - link
No can do... need that precious space for multiple Best Buy ads showing the same product.
GreenReaper - Tuesday, October 8, 2019 - link
Maybe they're hoping you'll buy two? :-)
mikato - Monday, October 21, 2019 - link
I'm not 100% sure yet, but I think the site has reached my ad threshold with this bottom bar thing. Remove the video animation and I'll be good with the bottom bar.
GreenReaper - Tuesday, October 8, 2019 - link
I think their whole Twitter handling needs a revamp - they have broken Twitter card markup, too; "fail" is not a valid card type. Maybe they are still trying to access it over TLS 1.0, not 1.2/3?
SaiMorphX - Monday, October 7, 2019 - link
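For context on the "fail" complaint above: Twitter's card markup only accepts a handful of `twitter:card` values. A minimal sketch of a validity check (the set reflects Twitter's documented card types; the helper name is mine):

```python
# The card types Twitter's documentation defines for the twitter:card
# <meta> tag; anything else (such as "fail") is rejected by the validator.
VALID_TWITTER_CARD_TYPES = {"summary", "summary_large_image", "app", "player"}

def card_type_is_valid(content: str) -> bool:
    """Return True if a twitter:card meta value is one Twitter accepts."""
    return content in VALID_TWITTER_CARD_TYPES

print(card_type_is_valid("summary_large_image"))  # True
print(card_type_is_valid("fail"))                 # False
```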
The one comparison with the RX 560 should be corrected with both specs; there are 2 versions with different specs! If there's ever a class action about that, it would be the first I join! I went off the tech community's reviews to buy my RX 560, and got one with the CUs cut down to 14! $210 for a crippled RX 560!
Death666Angel - Tuesday, October 8, 2019 - link
Why did you keep it and not return it for a refund? Why did you buy it when you saw it having only 896 shaders? I agree, AMD made a crappy move, but it seems that you were pretty blind in that purchasing decision as well and didn't try to rectify the situation.
artifex - Monday, October 7, 2019 - link
For me, the point of the 560 was that I could get a board without a power plug, so I could stick it in my deliberately low-wattage mini-ITX box and still reasonably game in 1080p. (There are even low-profile versions.) This doesn't feel like the successor to that.
mikato - Monday, October 21, 2019 - link
Yeah, we do need new tech in a card that doesn't require a PSU connection, please.
LarsBars - Monday, October 7, 2019 - link
I really like the look of that 5500 reference card. I wish AMD would use it! Also one just like that but full size with two fans...
ModEl4 - Monday, October 7, 2019 - link
Performance, transistor count, and die size were very predictable; power consumption was not. Even AMD's own numbers contradict each other: 150W TDP with a need for an 8-pin adapter, vs. -30% from the RX 480 (lol, which RX 480, the one with the 150W TDP (compatibility drivers) and the 6-pin adapter?). Anyway, the retail version will only need a 6-pin adapter (standard non-OC parts) for sure.
tygrus - Tuesday, October 8, 2019 - link
Could 110W be for base clock, before turbo?
Samus - Tuesday, October 8, 2019 - link
Does anybody else find it ridiculous paying $350 for a card that simply guarantees 60FPS @ 1080p?
The Xbox One X and PS4 Pro do that for less, and they're ENTIRE GAME CONSOLES. Seriously, the X has close to the same TFLOPS as this card and it's $300.
So why the hell are nVidia and AMD milking us dry?
GreenReaper - Tuesday, October 8, 2019 - link
But will they do so at the same image quality? Is it *actually* 60FPS or are there drops? Is the hardware subsidized? Do you have to pay more for games than you do on PC?
It's a fair argument but not an entirely fair comparison.
Fritzkier - Tuesday, October 8, 2019 - link
Fritzkier - Tuesday, October 8, 2019 - link
Isn't what they guarantee on $350 cards 1440p 60fps gaming?
Demiurge - Wednesday, October 9, 2019 - link
Where are you getting $350 from relating to this 5500 GPU?
No one is forcing you to buy...
phoenix_rizzen - Tuesday, October 8, 2019 - link
phoenix_rizzen - Tuesday, October 8, 2019 - link
Now they need to release a mobile CPU, without integrated graphics, that can be paired with this discrete GPU. Hopefully, that would allow them to release 6- and 8-core mobile CPUs as well. :)
Demiurge - Wednesday, October 9, 2019 - link
Seconded!
neblogai - Sunday, October 13, 2019 - link
Without integrated graphics, it would not be mobile, as it would suck too much power at idle.
Demiurge - Wednesday, October 16, 2019 - link
There are users that care more about performance than battery life. The GPU being integrated with the CPU has little to do with the power savings of that GPU. The biggest savings is the VRAM on the dGPU not being there, but... you'll pay heavily in performance using the CPU's DDR/LPDDR memory at a significantly lower clock. That's where the most significant performance loss on an iGPU is rooted today.
ballsystemlord - Tuesday, October 8, 2019 - link
"...as NVIDIA is still using TSMC 12nm - a 16nm-derived process - for their GPUs."
There's a difference. Name it correctly:
"...as NVIDIA is still using TSMC 12nm FFN - a 16nm-derived process - for their GPUs."
Demiurge - Wednesday, October 9, 2019 - link
Ugh... switchable graphics... I wish they would move away from switchable graphics and just use the IGP or a discrete GPU, and never both. Both AMD and Nvidia GPU-based laptops have burned me on this.
I am so desperate for a real desktop replacement laptop, I am tempted to wait for Apple's design. At least my Mac Mini from 2014 actually supported OpenCL out of the box.
iamberryjohnson - Thursday, October 10, 2019 - link
iamberryjohnson - Thursday, October 10, 2019 - link
I love AMD Radeon and am extremely excited to buy it. Also, I hope this will boost my experience.
dr.denton - Tuesday, October 15, 2019 - link
Bought an RX 570 a couple of weeks ago; didn't want to wait any longer for small Navi. Glad I don't have to regret my decision; these numbers look pretty *meh*, I think.
peevee - Monday, October 28, 2019 - link
"a 150W TGP is only around 17% lower than the next card up in AMD’s stack, for a performance difference I expect to be greater than that."
Why do you expect that? Energy consumption scales far faster than linearly with frequency.
The right approach for "embarrassingly parallel" tasks is going wider and slower: more ALUs at below 1GHz, enabled by choosing a dense, slow cell library over a fast one that isn't dense enough (look, at 7nm they barely match the density of NVIDIA at 14nm, without doubt due to heat dissipation issues).
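The superlinear scaling claimed above follows from the standard CMOS dynamic-power relation P ≈ C·V²·f: since supply voltage must rise roughly with target frequency in the DVFS range, power grows roughly with the cube of frequency while throughput grows only linearly. A toy sketch (the voltage curve and constants are illustrative, not real silicon numbers):

```python
# Dynamic power of CMOS logic scales as C * V^2 * f. Pushing clocks up
# requires more voltage, so perf/W falls; going wider and slower avoids this.

def dynamic_power(freq_ghz, volts, c_eff=1.0):
    """Relative dynamic power for a given frequency and voltage."""
    return c_eff * volts**2 * freq_ghz

def voltage_for(freq_ghz, v0=0.7, slope=0.3):
    """Toy DVFS curve: voltage climbs with frequency (hypothetical constants)."""
    return v0 + slope * freq_ghz

for f in (1.0, 1.5, 2.0):
    p = dynamic_power(f, voltage_for(f))
    print(f"{f:.1f} GHz -> relative power {p:.2f}, perf/W {f / p:.2f}")
```

With these assumed constants, doubling the clock from 1.0 to 2.0 GHz more than triples power, which is why two slow ALUs beat one fast one on perf/W.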