Wowsas! AMD hit a home run with the Fiji architecture.
Fury X will deliver unbelievable performance, very likely the fastest GPU in the world, and that water cooling system is incredible: quiet and cool running. A little birdie had to have been whispering in Nvidia's ear about how amazing this card was going to be when they decided to launch their 980 Ti so quickly. Fury X for 650 dollars may well leave the 980 Ti in a questionable position.
I'm surprised how well AMD executed this part after coming off the Hawaii launch. Fury X is the card to get.
So far, all the benchmarks (yes, they're leaked, but logical) have shown Fury X to be as fast or faster than the 980Ti while being quieter and cooler thanks to its stock CLC. Also, you get the best version for overclocking for $649 out of the box, whereas you pay over $700 for the higher-end OC 980Ti models. Hopefully any reviews we get from AT will include stock/stock and OC/OC comparisons and then we'll know for sure.
Imagine if the 980 Ti had HBM. That's the only reason the AMD card is faster; it's definitely not the shader arch. Kudos to them either way, but the difference in most games is <5 FPS.
Another thing to consider is that you give up day 1 drivers going to AMD. Nvidia always has day 1 drivers for every popular game. And let's acknowledge that a lot of games run like shit on AMD regardless of drivers.
AMD didn't have day 1 drivers for Witcher 3 (and didn't have drivers for the next 3 weeks either) and all their GPUs had no problems with the game. W3 even performed spectacularly after the tessellation override improved performance at zero image quality loss.
As for Nvidia's Day 1 drivers, you are talking about the drivers that basically destroyed 600/700 series performance, right? Lmao.
Day 1 purchases are for suckers. Best to wait until the beta testers find the problems and the devs fix them. That said, until there are reviews, I have no way of knowing if this card is as good as AMD says it is or not.
Are you on funny pills? The stock Nvidia Titan-style blower coolers have been around for years now for a reason: they're extremely good. They are quiet and exhaust most of the hot air out of the case.
It seems YOU have failed to notice that AMD are giving all their rebadged chips to OEMs to fit their own coolers because AMD's stock coolers were so terrible, being both very noisy and completely inadequate to cool hot Hawaii chips.
Yes, it's great that their new top end is using a CLC but you could argue the only reason it needs this is because the new Fury chips are going to be very hot and very power hungry. An engineering marvel would be engineering a high performance chip with lower power use and heat output, instead of making something hot and power hungry and then trying to make up for it afterwards using liquid cooling.
It's pretty much always 5-15 FPS between same-segment cards from AMD/Nvidia, yet the fanboys always take sides and one side always "stomps" the other. I usually buy the cheaper card and laugh at those 10 frames.
Lulwhat. Most reviewers are comparing the 290X with the 980 when the 290X is the 970 segment.
It's always like this. Nvidia has a much more expensive GPU that is slightly faster than AMD's flagship (occasionally slower, but faster most of the time), even though AMD's flagship is usually both cheaper and faster than Nvidia's second-best GPU.
I love how the settings in all your charts are cherry-picked to show the AMD card in the best light. No consistency at all in any of those, and in most situations it's less than a 10% difference. Considering the Fury X is already water-cooled, I'm really curious to see what kind of OC headroom it has compared to GM200 (which has very high OC headroom).
So it looks like AMD is punishing sites that cast Rebrandeon in a negative light by withholding Fury samples. Any truth to this?
@Ryan, care to comment on the validity of this? Have anything to do with why you don't have 300 series to review? I know you've repeatedly stated AT's independent status, but it looks like AMD is making tech sites stick to the "script" or face punishment.
It's nothing Nvidia hasn't done; they've blacklisted several review sites in the past. KitGuru made their own bed by throwing jabs in their articles before they had any idea what the hardware was capable of. In eTeknix's case, I'm not familiar with them, so I don't know. Ryan probably doesn't reply to blatant fanboys, by the way; I'm sure he's seen your fanboyish comments littering the forums.
Sounds like typical apologist BS from a known AMD fanboy. I do recall Nvidia having a similar incident with Hardware Secrets, but it turned out to be just a miscommunication.
But that didn't stop AMD fanboys, like you, from getting all that fake angst and fanboy anger out with torches and pitchforks in tow, did it? And now AMD does the same thing and it's all "well, Nvidia did it first." Kinda ironic, since AMD fanboys love to claim this is the type of behavior that turned them off Nvidia for good? :D
I saw a review on the 390x with power usage and it used 468 watts max draw and 368 watts average.
I think AMD has done an amazing job slamming voltage and power into that old core to crank it up, again, but the 290x can just be overclocked to match it, since no one has found any real differences at all yet, and that is pretty sad.
Well.. no. It's going to be pretty good according to leaks, but I do get what you mean by the earlier comments. Until we get some full-on testing in, fanboys from "both" camps can bite me. The majority of us like both companies, I think, and constantly switch between the two; there is no favoritism.
Actually, for me....AMD is......near DEATH. Their CPU business died years ago, and today their GPUs all seem to lack....competence. The very presence of that radiator on the FURY X is a sign of their desperation. They are going to clock the bejesus out of that card to put up numbers that resemble a 980 Ti. But look at the pricing...if it were any faster...any faster at all...they would demand a higher price to pay for the more expensive HBM implementation. I know that benchmarks are coming, but the bigger story will be the aesthetics. Having to find a place for that radiator is a deal breaker, and the air-cooled Fury Vanilla (or whatever the hell it's called) is not going to put up some great numbers....obviously, just look at the pricing...it never lies.
Yes, clearly offering a product with vastly better cooling is a sign of desperation, as shown by the EVGA hybrid kits to make up for the shortcomings of the 980, 980 Ti, and Titan X.
But then again, we actually know what the clock speed on the Fury X is going to be, and it's not going to be ridiculously clocked, which any idiot could figure out if they extrapolated from the numbers for a properly cooled 290X.
It takes a really special sort of customer to manage to have a case with no 120mm fan slots and to confuse pricing for quality. You probably think the 980 and Titan X's performance is proportional to their price. You also probably think a 780 Ti was an acceptable buy compared to a 290X in retrospect.
It's only offering a better cooling solution if the clocks are the same as they would have been air-cooled. Slapping a water cooler on it and then overclocking it 20% to get better framerates isn't "better." You could easily do the same thing with a GM200 and get huge gains. Even a moderate overclock on the stock air cooler on a 980 Ti already puts any of the leaked numbers in the ground. So I'm still not sure what everyone is so jazzed over.
It's a great card, and I'm glad to see they're finally on a level playing field (or slightly better). This is ultimately good for the consumer, as it's going to force Nvidia to up their game. That being said, the card is a solid competitor, nothing else. It's not "beating" the 980 Ti in any meaningful way.
GM200 doesn't benefit from water cooling because Nvidia artificially limits board partners. GPU Boost throttles like mad in demanding games because it's always power limited.
AMD has been well known for keeping prices in line right back to their Radeon 8500 (400 CAD as opposed to 700 for the GF3). Once in a while they flirt with absurdly high prices, but it's never been an easy sell for them... no matter how good their product is.
They've been near death for decades but thankfully they are still out there or prices would be a lot higher from Nvidia and Intel.
It's rated at 275W, and it's a halo card. It has power input for up to 375W, and its cooling solution is rated to 500W, so they are not pushing things to the limits. The card itself is several inches shorter than a traditional high-end GPU would be; I'd wager it's just not plausible to dissipate 275W with a heatsink that's 3 inches shorter than it normally would be. And there is the side benefit of being significantly quieter than an air-cooled design; it is expected to be at around 30 dB under a normal gaming load.
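For what it's worth, that 375W input ceiling is just the PCIe power budget added up. A quick sketch, assuming the rumoured two 8-pin connectors (the per-connector limits come from the PCIe spec; the connector count on this card is from leaks, not confirmed here):

```python
# PCIe power budget for a card's input.
# Connector count (two 8-pin) is an assumption based on leaks.
SLOT_W = 75          # a PCIe x16 slot can supply up to 75 W
EIGHT_PIN_W = 150    # each 8-pin PEG connector is rated for 150 W

max_input = SLOT_W + 2 * EIGHT_PIN_W
headroom = max_input - 275  # vs. the 275 W rated board power

print(max_input)  # 375
print(headroom)   # 100
```

So even at its rated 275W the card would have roughly 100W of input headroom before hitting spec limits, which fits the "not pushing things to the limits" point above.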
I have looked at installing radiators on my R9 290s. Including one as stock is AMD's answer to Nvidia's great blower. In terms of value, there is more with a factory-included radiator. This is just fanboy hate. I look forward to seeing the benches. Another plus is that this card will fit in smaller cases because it is shorter; I had case issues with my 290.
Based on AMD's pricing, I would expect parity or better performance versus the 980 Ti, as AMD does well in sales when they price directly against the competition while offering better value and performance.
Ha, coming from an Nvidia fanboy, when Nvidia rigs games using anti-competitive FraudWorks, which artificially makes Nvidia cards seem faster than AMD cards. Pretty sad when you have to pay game developers to use proprietary, closed-off software to make your graphics cards seem better than AMD's, when in actual raw performance AMD wins.
Hi buddy, I live in the land of facts, not FictionWorks (GameWorks) like you do. You Nvidia fanboys would still think Nvidia is better even if AMD cards were 100x better. That's how crazy and out of touch you guys really are. There's no reasoning with your type of people.
Most people won't have a problem finding a place for that radiator. What about 10-11 inch air cooled cards? Some people may not have a place for those and they seem to do fine.
"Having to find a place for that Radiator is a deal breaker" LOL, you put it in the same place your exhaust fan in the back of the case is NOW. It's not hard to find.
Ain't that the truth. I've been more Nvidia-biased the last 4 or 5 years, mainly because their stuff has just worked; prior to that I had myriad annoyances with AMD cards (primarily driver-related stuff; the hardware was always solid, but there were just annoying things here and there). That being said, I'm glad to see AMD has upped their game.
Disagree. NVIDIA is still competing against the NVIDIA card you already have. Nobody with a 980 is going to buy a 1080 or whatever they end up calling the next range if it's merely 10-20% faster. NVIDIA in their own charts frequently compare their latest high/midrange GPUs with their own of a few years ago that they know people are hanging on to, such as the venerable 8800 GT and more recently the GTX 680. I prefer NVIDIA's focus on interesting software features and power efficiency.
@darkfalz, exactly, people who keep repeating this AMD needs to survive for competition meme endlessly, while they are holding on to their 5850 and 6970 or whatever waiting for a worthwhile successor lol.
Ah, the "competition is bad" capitalist. Tell me again how Comcast needs to earn your repeat business through customer service and infrastructure improvements for growing demand. Thank god we're seeing major revolutions in desktop CPUs, and my 2500K is practically worthless from all the innovation.
Finally a reasonable comment. Real competition is tons better for the consumer, and that's why most companies want to dominate their fields. At that point they can set the prices to whatever they want, and this happens in tech all the time.
Riiiight..... Any official benchmarks yet? Nvidia will simply drop prices on 980 Ti, bump clocks on Titan X and drop its price and eat AMD's lunch again :) Not to mention crap AMD drivers that will take a year to mature to any level of usability.
Well, I'll start. How about the ability to adjust AA and CF bits without replacing the entire game profile? The enthusiast community has been asking for this forever. AMD took a step in this direction with a loadable binary, but it's just that, a binary that the user can't edit. Nvidia, on the other hand, has exposed these bits forever, going back to Coolbits, allowing the user almost complete control over SLI, AA, and 3D Vision bits. This is important for Day 1 compatibility and general compatibility in cases where there is no Day 1 driver, which is of course especially horrible for AMD, but that of course is an entirely different issue. ;)
I get BSODs with the current "stable" driver, usually when the monitor goes into a low power savings mode. The beta driver appears to fix the BSOD (not sure as I didn't test long enough), but introduces massive graphics corruption at the desktop when resuming from sleep a fair chunk of the time. Sometimes when the computer wakes on any driver, the monitor gets no signal until I unplug then replug it.
Indeed they did, and there will be many more happy customers when Fury releases.
Time to ask those who bought a Titan X if it was worth spending the extra $350 for the two and a half months before the 980 Ti released. I doubt many are still that happy. If AMD kicks the 980 Ti price down below $600, I think they might be happy again.
The people who bought it don't care. The only people who spend $1000 on a graphics card are people who make way more money than they know what to do with, or who are frivolous spenders who just blow money without thinking.
Well, I certainly had to give pause, there's always choices to be made with disposable/discretionary income, but in the end, seeing some of these games at 1440p with full maxed details and 4xMSAA or in 3D Vision (GTA5 at 70FPS per eye in 3D) are just jawdropping, so yeah. money well spent.
You're missing a huge demographic here, and that is professional users. Titans are basically used by scientists, engineers, previz architects etc. A gamer doesn't really need a Titan card, unless they want the latest and greatest.
A) Who cares about double precision, it's used only by actual professionals and mostly for scientific simulations.
B) NVIDIA developed the Maxwell architecture years ago and it was designed to remove DP for the purpose of saving power. Fiji was not even a blip on the radar when Maxwell was in development
Nah, Nvidia won't drop prices at all, even if Fury X is +/-5% as expected. AMD knew exactly what their target was and they hit it, barely, needing a WCE and 275W to do it. Any 980Ti will be able to meet or beat that performance with just a slight OC.
As for Titan X, certainly worth it and happy, it still looks to retain the top end performance and it certainly won't be subject to concerns about VRAM, like the Fury X will at the resolutions and settings it is expected to run at, especially in multi-GPU configs. I wasn't sure so I went through the paces myself, even 6GB with 2x980Ti wasn't enough, so I traded them for a 2nd Titan X and cash. Now I'm happy and will be until Pascal drops next year or 2017. :)
I'm extremely happy with my Titan X that I have under water and running at 1520/8200 clocks. My card will eat any Fury X for dinner (5278 FS Ultra score) as Fury's will be lucky to hit even 1200MHz based on that extremely dense die + what we've seen from GCN in the past.
Also, let's not forget about the 4GB of VRAM. I don't care if it's HBM or not; it's still 4GB. The minute it was clear that Fury would not have 8GB (as originally rumoured) and instead 4, I bought my Titan X.
Yes, I see the average OC on water for the Titan X core is an astounding 1497... hahahah, oh man, no wonder it's all silent on everything about it except AMD fans whining about the price.
They are so sad. The average OC on water for their failed precious 290X is 1187.
Let's face it, the Titan X is an OVERCLOCKING MONSTER.
Are you happy with AMD drivers, Rangoo? If yes, that probably explains why they are so bad. There are no fewer than a half dozen major bugs/feature limitations AMD would have to address before I would consider using them again, and that starts with Day 1 drivers. What do you think the chances are AMD comes out with a Day 1 driver for Batman: Arkham Knight on Tuesday? Or is it going to be more CryWorks for AMD and their users?
I'm an AMD fanboy, but giving Nvidia's Titan X and GTX 980 Ti their last rites before AMD's Fury X launches is premature. As I recall, the now-named Fury X edged out Nvidia's Titan X and GTX 980 despite having twice the memory bandwidth. I would've thought such an advantage would have Nvidia's flagship cards eating the Fury X's dust, but instead AMD just manages to edge past Nvidia by its front fender. Any AMD fanboy should be concerned.
And all NVIDIA has to do is drop the price on their cards appropriately. They've had a 9 month lead and for all we know Pascal is going to be ready in 6 months. Then we'll be chasing down this rabbit hole again.
I'm not sure why you are sure about that. From what I've seen, 16nm FF+ is scheduled to begin volume production in Q3 2015, which allows for the real possibility of Pascal arriving in 1H 2016. It's just a matter of when Pascal is ready and what the yields of the process are like.
The Fury X has 4GB of RAM, is water cooled, and comes without NVIDIA's software environment. HBM may have a certain marketability, but I think the Fury X is going to need a significant price/performance advantage to outsell the 980 Ti.
That's sort of trying to turn the argument on its head. He said it for sure WOULDN'T be out in 1H 2016. A vague "TSMC has historically had issues" doesn't support that claim. You are replying to me as if I made the claim that Pascal for sure WOULD be out in 1H 2016, which I didn't (I said "which allows for the real possibility of Pascal arriving in 1H 2016").
I don't get how your conclusion follows from your premise. If HBM2 will be ready in Q2 2016, why can't products using HBM2 ship in Q2 2016? Also of great importance is the release date of that Hynix slide, because it doesn't entirely agree with the information from http://cdn.overclock.net/7/7e/500x1000px-LL-7ef5be... for instance. In addition, did Hynix ever introduce the DDR4 3DS that chart seems to promise for Q1 2015? I can't find anything about that. The chart may be old and inaccurate.
It's never really the case in this industry that the day you get production going is the day you'll see it in shipping products. These things take a lot of time to get into shipping products. Fabs tell all kinds of stories how x nm products are ready by this or that time, when in reality the products coming with that tech can take another year to actually be available.
That said, yeah, the chart can of course be old or inaccurate, but that's what we've got. At least it's not pure conjecture.
It's already been shown to be inaccurate, so it's not very useful. Besides, we don't even know what the dates on that chart were supposed to mean at the time it was made. Is that when we see it in products, when volume production starts, or when they start selling the chips? Also, it just shows a circle centered on the border of Q1 and Q2. Who's to say that products wouldn't be out by the end of Q2, even if there is some time between whatever event they have in mind for the chart and actual products showing up? My point is that everything in that chart is already pure conjecture, so I don't see much reason to be sure that an HBM2 product can't possibly be ready before H2 2016.
It seems to me a foolish hope to think they'd be ahead of schedule, given HBM1 actually started volume production in Q1 2015 instead of what was marked on this product map. If anything, HBM2 is further delayed.
Why do people always talk about NVIDIA's software environment as if it is some major advantage they have over AMD? It seems to me that they are both just as good, and in my experience with NVIDIA and AMD, I've had fewer driver issues with AMD, believe it or not.
But yeah, AMD has released Fury X benchmarks using Far Cry 4 at 4K Ultra settings, and it outperforms the Titan X by more than 10 average FPS. I know the benchmark isn't as reliable since it was released by AMD, obviously, but still, it really makes you wonder. I definitely think it will outperform the 980 Ti, especially since AMD claims it can outperform the Titan X, but of course we shall see :)
Nvidia certainly spends a lot more money and effort on their software currently:
- They have more timely driver updates aligned with big game releases
- SLI support is much better than AMD's sparse and late updates to CF
- GeForce Experience works much better than AMD's third-party equivalent
- Better control panel features like DSR and adaptive V-sync
AMD's efforts tend to be like half-baked copies of these. AMD hasn't come up with anything truly new in a long while, and all I can do is smh at 'new features' like FRTC that's so simple it should've been in the control panel a decade ago.
I do think for single GPU driver performance and stability there isn't much of a difference between the two, regardless of how many driver updates Nvidia does. Actually the latest Nvidia drivers have been terrible with constant TDR crashes for a lot of people. But that's anecdotal, both sides have issues at times, and on average both have ok drivers for single GPU. It's the above mentioned things that push Nvidia to the top imo.
I like people talking about how AMD didn't get drivers out for Witcher 3 immediately and ignore that NV's drivers were incredibly flaky and they needed to be reminded Kepler cards exist.
"I like people talking about how AMD didn't get drivers out for Witcher 3 immediately and ignore that NV's drivers were incredibly flaky and they needed to be reminded Kepler cards exist."
Pfft....to this DAY the Crossfire support in W3 is terrible, and ditto for GTA5. The former is a TWIMTBP title; the latter is not, and it even has AMD CHS tech in it. I run Kepler Titan Blacks and 290X(s), and there is no question NVIDIA's drivers are far, far better in both games. Even the launch-day Witcher 3 drivers are superior to AMD's half-assed May 27th 15.5 betas, which haven't been updated since.
For single cards, I'd agree, AMD drivers are almost as good as Nvidia's, except those Gameworks titles that need to be reoptimized by AMD's driver team.
But there isn't even a question that Nvidia gets betas out much, much quicker and more effectively than AMD.
And if you aren't into betas, heck, AMD hasn't released an OFFICIAL R9 2xx driver since December, LOL. Which is what annoys me about this Fury launch: once again, AMD puts out this awesome piece of hardware, and they've been neglecting their current parts' drivers for months. What good is the greatest video card on the planet (Fury X) if the drivers are rarely and poorly updated/optimized?
@xthetenth - complete rubbish, while AMD fanboys were boycotting the game over PC only features, Nvidia fans were enjoying the game on day 1, courtesy of Nvidia who gave the game away to new GeForce owners.
How is CF+AA doing in Witcher 3 btw? Oh right, still flaky and broken.
Have you heard of CUDA, GameWorks, or Day 1 drivers for game releases? You seem to be oblivious to the fact that CUDA runs in 200+ pro apps and is taught in 500+ universities. Never mind the fact that NV releases drivers constantly for games when they ship, not 4-6 months (sometimes longer) later. You are aware the last AMD WHQL driver was in December, correct?
http://support.amd.com/en-us/download/desktop?os=W... Dec 8 is NOT GOOD. They can't even manage to put out a WHQL driver every 6 months now. Get real. Nvidia releases one or two EACH MONTH. And no, I don't believe you have more problems with NV drivers ;) I say that as a current Radeon 5850 owner :)
AMD's R&D spending has been dropping for 4 years, while NV's has grown and is now greater than AMD's despite a smaller product line, meaning NV's R&D is GREATER and more FOCUSED on GPUs/drivers. Passing on consoles was the best thing NV did in the last few years, as we see what the console deals have done to AMD's R&D and lack of profits.
AMD needs new management. Hopefully Lisa Su is that person, and Zen is the right direction. Focus on your CORE products! APUs don't make squat - neither do consoles at the margins they accepted to get the deals. There was a VERY good reason NV said exactly that: they passed because it would rob from CORE PRODUCTS, and we see it has for AMD. It hasn't just robbed from hardware either. Instead of approaching companies like CD Projekt about adding TressFX to Witcher 3 2+ years ago, they waited until the last 2 months and then asked...ROFL. That is lack of funding, followed by excuses for why performance sucks and complaints about HairWorks killing them. An easy fix in a config/profile for the driver solves the tessellation issue for both sides (only Maxwell can handle the load), so it's a non-issue anyway, but still, AMD should have approached these guys the second they saw wolves on screen 2+ years ago showing HairWorks.
http://www.tomshardware.com/reviews/amd-radeon-r9-... Check the pro results...AMD's new cards get a total smackdown; 3 of the 5 are by HUGE margins. Showcase, Maya, and Catia are all massive losses. Note you'd likely see the same in Adobe apps (Premiere, AE; not sure about the rest) since they use CUDA. There is a good reason nobody tests Adobe and checks the CUDA box for NV vs. OpenCL for AMD ;) There is a reason Anandtech chooses Sony, which sucks on Nvidia (google it). They could just as easily test Adobe with CUDA vs. AMD with Sony Vegas. But NOPE. Don't expect an AMD portal site to run either of these tests...LOL. Even Tom's won't touch it, or even respond to questions about why they don't do it in the forums :(
@colhoop it is largely going to depend on your use cases. For example, GeForce Experience is something many Nvidia users laud because it just works, makes it easy to maximize game settings, get new drivers, record compressed video. Then you have drivers, driver-level features (FXAA, HBAO+, Vsync that works), day 1 optimizations that all just work. I've detailed above some of the special reqs I've come to expect from Nvidia drivers to control 3D, SLI, AA. And the last part is just advertised driver features that just work. G-Sync/SLI, low power mode while driving multiple monitors, VSR, Optimus all of these just work. And finally you have the Nvidia proprietary stuff, 3D Vision, GameWorks, PhysX etc. Amazing if you use them, if you don't, you're not going to see as much benefit or difference.
I disagree with your opinion regarding game optimizations. The default optimization settings are often a bit conservative for me (it doesn't factor in my CPU OC, I suppose). But you can conveniently adjust it with a single slider, instead of having to fiddle with several different in-game settings.
Where are you guys pulling this crap from? Even the most favorable leaked benchmarks show the Fury X with at BEST a 10% advantage over a 980 Ti. In most cases it's 3-5%. Explain to me (assuming the leaked info is accurate) how that is "tearing the 980 Ti a new one."
I'm the first to admit I'm a fan of the best, which is why I can't understand why people would choose AMD when they can get better for such a small premium. You've come clean in this thread for once; have you never once wondered why you bother with all that when you could've gotten a suitable alternative from Nvidia for the price of a couple of burritos at Chipotle more?
True, but the engineer said that it's the most overclockable card from AMD yet. If we can believe that guy, then with a little fiddling it can probably go toward 25%, and then the Titan X is behind it. But we'll see; I'm not claiming anything. Would be nice tho.
They already bastardized their own Titan X with the 980 Ti; how many more cuts are they willing to take, really? I guess it will be a necessity, but I don't think Nvidia will like that one bit.
If that was true why didn't they just put a Fury and 980Ti card side-by-side and benchmark them in front of the world?
I actually hope you are right, as it's unhealthy for Nvidia to keep kicking AMD's ass year after year. The last thing we need is for AMD to fold and Nvidia to jack prices even higher.
That's what I'm saying. If AMD goes out of business, who's going to compete with Nvidia? Intel APUs? With the US government being the way it is, Nvidia may be able to claim that Intel's on-chip "HD 5000" or whatever graphics is "competition" and use that as an excuse to operate as a monopoly.
The best we could hope for is that Samsung or Intel or somebody else would buy the Radeon IP and launch some new GPU's to compete with NVidia.
If AMD folds, then if Intel is smart they will buy the ATI division for cheap, and with their R&D they will dominate both CPU and GPU technology.
Unfortunately, CPU performance has also enjoyed an unprecedented level of stability since then.
I'm still running an i5 2500K bought back in 2011, overclocked to 4.6 GHz. Frankly, once you take the OC into account, it's basically up there with the newly released i5 CPUs.
Intel is innovating and improving performance where they are most deficient; we have a number of E5-2699 v3 18-core CPUs in our data center at work that would certainly disagree with your assessment ;) But just look at what they are doing on the GPU side of things (Broadwell-C just eclipsed and destroyed AMD's best APU) or on the low end with mobile and Core M, and you'll see Intel is still innovating.
On the CPU side of things, I've had worthwhile upgrades all along the way and I've paid less for each of them than the $450 AMD wanted for their cheapest Athlon 64, which is coincidentally what made it easy to choose Intel again for the first time in years.
You don't "destroy" something by performing 25% better at double the price. In any case, we know that AMD's most powerful APU - the one powering the PS4 - is significantly faster at graphics; it's just that AMD hasn't seen a point in bringing such an APU to the computing market... yet.
Intel has a far bigger wodge of cash to throw at these things, any way you look at it.
Please don't release that for the PC market. The GPU might be OK (no better than OK; it's a 7870, not made for high-end gaming), but the CPU in it sucks balls. It's Jaguar (Bobcat-derived).
I saw more than 25%; in some cases a 50% uplift. That is destroying, especially given you also get a MUCH faster CPU, as you would expect from a full-blown 4-core Intel x86 Core chip. 2x the price today, but that's just standard IGP tomorrow for Intel, all at lower power than AMD's APUs. Fusion is officially dead.
You can't compare the PS4 either: higher TDP and a much slower CPU than even AMD's desktop APUs, so what they gain in graphics is lost in CPU, and even then Broadwell isn't too far off. Sure, Intel has more cash to throw at the problem, but they've basically proven AMD's acquisition to gain a competitive advantage in integrated graphics was a complete and total failure that has effectively made two competitive companies uncompetitive in their core competencies.
The reason you haven't seen the PS4 APU in the PC space is remarkably simple: you'd get nowhere near its theoretical performance owing to the DDR3 memory bottleneck. That's why Sony paired it with GDDR5 - something you can't do on a PC.
If you have seen benchmarks, then please, link. Otherwise you are just spewing pointless pontifications with no backing.
Fiji certainly SOUNDS impressive on paper (though I'd like more than 4GB of VRAM for GTA V and Shadow of Mordor), but I have yet to see what the real-world result is.
I just want to see the actual benchmarks. I hope it all does play out well for AMD because competition is better for the market no matter which company you want to buy from. That said...
On paper it certainly looks like it will have "unbelievable" performance in all games at 1080p or less and in the majority at 1440p. However, my concern is that the 4GB of memory will become a bottleneck at higher resolutions and settings. No matter how powerful the processor or how fast the memory on the card is, if all the desired data cannot be loaded into video memory at the same time, then the card is going to be at the mercy of the rest of the system's ability to feed it the required data. It may just end up being like driving a Ferrari in rush-hour traffic. =/
Again, that's not as huge of an issue right now but who is buying a video card for the games they wanted to play yesterday? Tomorrow is where it's at.
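To put the 4GB concern in perspective, here's a rough sketch of how much of that capacity the render targets themselves take at each resolution. All numbers are assumptions for illustration (32-bit color, a triple-buffered swap chain, no AA buffers); the point is that framebuffers are tiny, so it's textures and other assets that actually fill the 4GB.

```python
# Back-of-the-envelope VRAM math (illustrative only; real usage depends
# on the engine, texture pool, AA mode, G-buffer layout, etc.).

def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Size of an assumed triple-buffered 32bpp swap chain in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K":    (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MiB of framebuffers")
```

Even at 4K this comes out under 100 MiB, a small slice of 4096 MiB, which is why the capacity pressure at high resolutions comes from higher-resolution assets rather than the display buffers themselves.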
Talk about premature, dude. We have not seen outside sites do comprehensive reviews of these cards, and you already declare them the best. It would be foolish to spend money now without seeing how the high-end cards do at 4K resolutions. AMD PR isn't worth much without legit numbers backing it up.
How's that amazing performance looking now from other sites' reviews? Anandtech was being honest with its readers about these ridiculous re-brands, and now they are paying the price for that professionalism by reporting it straight instead of repeating AMD's spin. Fury X is a catch-up to the 980 Ti with near-zero overclocking room. Notice the sites that have Fury X reviews didn't grade overclocking? When Anandtech's review is out, they most certainly will. So hardly a win. One model. All the rest are the same dated, overheating re-brands.
This new AMD release should be judged mainly by its efficiency, or performance/watt. Of course AMD can double or triple the transistor count to achieve higher performance than the competitor; the real achievement is whether AMD's engineers can attain that within a lower power budget, which I highly doubt is the case this generation given the push for water cooling. I would be very impressed if AMD can beat Maxwell's efficiency in this release.
At least on paper, they could. The Titan X performs around 40-50% better than the 290X in general (it depends heavily on the tests, as usual). Compared to the 290X, the Fury X has around 40-50% more of everything (SPs, texturing units, memory bandwidth), except ROPs. The power consumption of the Titan X is not that far below the 290X's (around 10% less). So on paper, barring potential architectural performance changes per SP, barring the turbo behaviour that should be better with this cooling solution, barring all other factors... (again, it's a paper comparison), you end up with a card that has comparable performance at roughly 10% more power, which is quite decent.
The problem being that this performance/W level would not scale down very well, because you know, HBM. But a GTX980-esque R9 Fury "vanilla" could be possible.
Again, it's a paper comparison, we will see how it is on the 24th. But at least, there is some potential.
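The napkin math above can be written out explicitly. Every number here is an assumed round figure from the comment (baseline-relative performance, approximate board powers), not measured data, and the "perfect scaling with unit count" step is the big unproven assumption:

```python
# Paper-napkin scaling: Fury X vs 290X vs Titan X (assumed figures).

r9_290x_perf = 1.00   # baseline
titan_x_perf = 1.45   # ~45% faster than the 290X in mixed workloads
fury_x_units = 1.45   # ~45% more SPs/TMUs/bandwidth than the 290X

# Assuming performance scales perfectly with unit count (a big "if"):
fury_x_paper_perf = r9_290x_perf * fury_x_units

r9_290x_power = 290   # W, approximate typical board power
titan_x_power = 250   # W, approximate

print(f"Paper Fury X perf vs Titan X: {fury_x_paper_perf / titan_x_perf:.2f}x")
print(f"290X power vs Titan X power:  {r9_290x_power / titan_x_power:.2f}x")
```

Under those assumptions the paper Fury X lands at parity with the Titan X while the Hawaii-class power budget sits roughly 15% higher, which is the "comparable performance at roughly 10% more power" conclusion above, give or take the rounding.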
And as a side note I forgot: I think the watercooling push on the R9 Fury models is simply due to the shorter PCB design (because, you know, HBM...), so if you don't want to lose this "advantage", you need a shorter radiator, which may not be enough to cool down a big GPU. So you have to take some of the radiator off of the card, so you have to use watercooling unless you go for a 4-slot solution...
Have you actually seen the card? The radiator goes on a chassis exhaust port, NOT attached to the PCB...
Unless, of course, you are calling a heatsink a radiator... and are actually referring to what would need to be done for air cooling (in which case you'd just go longer, not wider).
I don't know why, in my mind it was a hybrid air+water solution.
In any case, it still follows the reasoning: when I say "taking some of the radiator off the card", it means that some (some being 100%) of the heatsink (radiateur in my baguette mother tongue, hence the confusion with "radiator", sorry) is not on the PCB anymore ;)
"And as a side note I forgot, I think the watercooling push on R9 fury models is simply due to the shorter PCB design "
I mused about that yesterday as well. I suspect the issue is that the shorter PCB limits the size of fans that would look sensible, and as we all know, usually from bitter experience, small fans and GPUs usually result in something that sounds like a jet fighter taking off.
I want to see Fury with a custom waterblock for us custom water-loopers, but having the power connections on the back of the card makes for very clean builds.
Could go either way, but only because of the performance boost of HBM. Since it's GCN 1.2, Fury X's stream processors have essentially the same efficiency as the R9 285's. Bottom line: a GTX 980 Ti with HBM would be far more efficient. However, since Nvidia will only use HBM with Pascal, AMD has another year to match Nvidia's efficiency.
I don't think they will beat Maxwell's efficiency but I think between the architecture changes they've made and the efficiency benefits of HBM they will approach it. It looks like with Fiji, AMD may have reduced double precision performance in exchange for die space/efficiency.
Or we could judge things that you buy to make frames quickly on the basis of how fast they make frames in comparison to the money you spend on them, and if we want to save some watts go to the places where saving huge numbers of watts is trivial like lightbulbs.
Or are we just going to be entirely about performance/watt, pack the entire desktop segment up and play games on our phones?
All of these refresh cards should have been released months ago to compete against Nvidia's GTX 900 series, instead of leaving a giant vacuum for Nvidia to go uncontested and soak up market share. Also, since none of them uses the new HBM, there's no excuse for holding them back for so long. AMD's marketing strategy is the worst. This is how you go bankrupt and run your ATI division into the ground.
Fairly well forced to run down their existing 2xx stock first I'd think? They did that at very low prices of course, so probably about as good an effort to compete as this rebadge/refresh.
Really need 14nm to come relatively soon, and a good range there.
I think they just didn't have all the technology ready. They've really taken a beating the past 9 months. I am guessing an inventory write-down would have been preferable to the loss of market share they've recently incurred.
Really want an AnandTech review of the 390 and 390X. There seem to be some great numbers coming from other sites that already have their reviews up. Very surprising, given they are rebrands with more memory and adjusted clocks. I don't know if it was just driver work since I last saw 290X numbers, or what?
Anyway, what I really want to know is if the 290/290X actually performs the same as the 390 series (adjusted for clock speed). So please retest a 290X if you have one handy, not a crappy reference one obviously. If the 290(X) matches the current numbers I am seeing, I will be picking one up second-hand/fire-sale for sure. I really thought there was more of a gap between the GTX 980 and the Hawaii cards, but perhaps I was wrong.
NM, this was already done at Hardocp.com with their MSI 390X review. At the same clock speed, the rebranded 390X is the same speed as the 290X. Think I'll be picking up a cheap 290 or 2.
Check out a few different reviews. Guru3D show the 390X beating the 290X by up to 15% at higher resolutions, and generally matching or slightly faster than the GTX980.
The 390X is certainly no revelation in gaming, but at $50 less than the 980 and equal to better performance, maybe it and Fury will at least help edge nVidia's pricing down out of "we've got a monopoly and we know it" territory.
Did you bother to read what he wrote? Clock for clock, the chips are identical = rebrand. The 290X scores are either reference, custom-cooled, or different stock clocks, which is no surprise given the 290X has gone through so many permutations over the years. Reality is, the last 8GB 290X to hit the market is really not much different than the 390X except for a different BIOS, cooler shroud and box. Board shots of the XFX have already confirmed this reality.
390x clearly has faster memory, so you can't just overclock the 290x to equal it. 290x might still be a better buy, but they're not entirely identical cards.
They're the same ICs as the 8GB 290X, so yes, you can just over- or underclock them. The 290X has been on the market for nearly 2 years, so of course it has seen gradual improvements in yield for both the GPU and the memory. If you compare the launch reference throttling 290X to the 390X, you're going to see a big difference; but if you compare an 8GB 290X that launched last year, you're going to see minimal difference.
Rebrands. Hopefully AT does a similar clock for clock apples to apples comparison, but from the tone of Ryan's intro and his reluctance to call them rebrands, I highly doubt he wants to go in this direction. :)
Aren't there one or two other changes within Hawaii Refresh? I admit that rebrands leave a nasty taste, but if there are tweaks to reduce power and add one or two functions, it's not as bad as it could've been.
Well, Ryan is claiming other tweaks; I've read improved tessellation and hints of VSR, but other sites have shown these are just AMD working on their drivers as they should've been all along. We will see if there is validity to the actual silicon tweaks, or if Ryan is just going by the script to make sure he gets a Fury X to review.
On the other hand, the benchmarks are pretty interchangeable, and the insight and analysis here is excellent as usual. This is the only coverage to really add value beyond "well it performs like an aftermarket 290 or a bit better".
There probably won't be a whole lot to review. These cards basically already exist, their performance is known, and they're available at lower price points than their respective rebrands in the 300 series.
While some of the changes have been minor, there's enough going on with the 390s to warrant an extensive review of how they stack up against the 970/980. Plus everyone likes "updated" benchmarks that reflect what's currently on the market.. or we'd never get the call to throw in older cards that have been reviewed to death.
I didn't say a review wasn't necessary, I said there probably won't be much to review. As I said in my previous comment, the performance of these cards are known, literally. There's no guess work involved, and there are no surprises coming. The only question in my mind was whether the 300 series uses new revisions of these existing GPUs, but it doesn't look like that's the case.
I disagree.. looking at benchmarks from early reviews, there are certainly some unknowns, as the 390X (performance-wise) is a much-needed improvement over the 290X and can compete with the 980. Its vanilla variant (the 390, which competes with the 970) might even be a better solution in CrossFire than two 980s in SLI for 4K gaming, which has my interest piqued... what about yours?
No, the performance of the 290X OC'd to ~1.05-1.1 GHz, which has been done many times in the annals of benchmarking history, is very known. And thus you essentially know the performance of the 390X. I guess I could just repeat what I said in my previous 2 comments again, I basically reiterated the same thing 3 times, but they seem easy enough to go back to.
And I wouldn't quite call it performance competitive with the 980. To me performance competitive implies trading blows, maybe ~3% behind on avg. Most of the time the 390X seems to fall evenly between the 290X and the 980.
"what about yours?" No. If your interest wasn't already piqued by the 290/290X, which had a better price/perf ratio for the past ~6 months, why does the 390/390X with an inferior price/perf ratio pique your interest now?
Whether you see the 390X as competitive with the 980 seems to depend on which reviews you read. Guru3D and Hardware Canucks both have the 390X ahead of the 980 in some instances, with the former showing it overall being ~3% slower, and the latter showing it at ~3% faster.
At an MSRP $50 lower (which includes custom coolers that are quieter than the nVidia reference blower you're getting at the 980's MSRP), I'd say either of those sites' results qualify as competitive.
@Black Obsidian I think the 390X is totally competitive at its price point, but based on the reviews I've seen I wouldn't call it performance-competitive with the 980. The Guru3D results are interesting, however.
And I just realized that the MSI card being tested in all of these reviews uses a triple-slot cooler. Is it even relevant to draw comparisons to this anymore? It's becoming borderline absurd what AMD is having to do just to strive for performance competitiveness at certain price points.
The HardOCP review suggests competitive as well, right in line with what anyone would expect from AMD vs Nvidia: some titles favor one while others favor the other, which puts the cards on equal footing.
Until we hit 4K... Standalone, the 980 and 390 are adequate (at best, depending on your comfort levels), but once you go to 2-card setups the 390 has an edge where its memory can finally be utilized (in theory), so even the vanilla 390s in CrossFire might be a better buy over 2 980s.
@just4U Well, I realized why some of these reviews are showing the 390X getting so close to the 980. They're using the highest-clocked version of the 390X currently available. In fact, it's the only version available that doesn't hover around the 1050 MHz stock clock, and it's being measured against stock 980s. The review I referenced on Tom's used an OC'd 980.
I do love how you keep bringing the heat argument up regardless of the fact that most 290/X cards have aftermarket cooling solutions.
There are aspects to DX12 that AMD stands to do better at than NVIDIA. In fact, let's examine this further. In the time that AMD have released one new version of GCN, NVIDIA have launched two versions of Maxwell with differing levels of support - Maxwell 1 is 12_0 compliant, Maxwell 2 is 12_1 compliant. GCN 1.1 (you know, Hawaii) is 12_0 compliant, and came out before Maxwell 1 (about four months, actually).
Resource Binding: Hawaii supports Tier 3, Maxwell 1 supports Tier 2
Tiled Resources: both support Tier 2
Typed UAV Formats: Hawaii supports Tier 2, Maxwell 1 supports Tier 1
Now, let's compare GCN1.0 and Kepler, both of which are 11_1 compliant (again, GCN1.0 came out before Kepler):
Now, you might want to clarify "dated feature sets", especially considering that, Maxwell 2 excepted, AMD has beaten NVIDIA to the punch on DirectX compliance for the past two generations. Moreover, we don't actually know what Fiji's compliance level is yet. Additionally, how many people will have bought the 970/980 for its compliance level?
"I do love how you keep bringing the heat argument up regardless of the fact that most 290/X cards have aftermarket cooling solutions."
It doesn't matter to me if they have aftermarket cooling solutions or not. Heat is heat and it has to go somewhere, usually out into the room I'm sitting in. A difference of 50-60W over the course of several hours is significant. Personally, if I'm going to buy a high-power card, how much performance I'm getting for that many watts is important to me. I debated getting the 980 Ti over the 980 because I game at 1080p on my 55" TV and it's 85W more at full chat. However, I do plan on having a VR headset (or various headsets, probably) and the 980 Ti is a great card for that.
Why didn't I wait until later? My last PC was built in early 2008 and its last video card upgrade was a GTX 275. It was time and I was tired of waiting. No regrets on my purchase yet. I make custom water cooling loops and the AMD CLC looks like a child's toy to me.
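For anyone wondering whether a 50-60W delta actually matters, here's a rough cost sketch. The hours and electricity price are assumptions plugged in for illustration; substitute your own:

```python
# What an extra 60 W at load costs over a year of gaming (assumed inputs).

delta_w       = 60    # extra watts drawn at load by the hungrier card
hours_per_day = 3     # assumed daily gaming time
price_per_kwh = 0.12  # assumed electricity price, USD/kWh

kwh_per_year = delta_w / 1000 * hours_per_day * 365
print(f"~{kwh_per_year:.0f} kWh/year, ~${kwh_per_year * price_per_kwh:.2f}/year")
```

On these assumptions it's on the order of 65 kWh and under $10 a year, so for most people the bigger practical issue is the heat dumped into the room, not the electricity bill.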
I think quite a few would buy Maxwell over AMD when they've said since September they would have full DX12 support with demos of games with Microsoft to back their claims. AMD's message about DX12 support however, has been much less clear.
From my perspective, the good news comes in the form of the vanilla 390. It's on par with the 290X according to early benchmarks (1% faster.. lol), comes with 8GB of RAM and is $100 less in Canada, with some actual stock. Since the Bitcoin craze is over on the Radeons, it will also hold its resale value.. not bad..
Prices look to be slightly lower than the 970's, which it is comparable to. Not bad overall... but I'd have preferred the 390X to have come in at that price point. Oh well.
Hope both Nvidia and AMD release mid-to-high-range power-efficient GPUs... less heat and more compact for those of us on ITX form factors... still stuck on a 750 Ti... no alternatives.
Hi Strike. Can't really speak for either side there, as what you see is what you get for the next year or two, I'm guessing.. but mid-to-high range? Nvidia has the 960 and 970 in mini-ITX-type solutions..
I've been eyeing the MSI and Asus variants for some time, especially on the 970.
I wish; a few years ago you had a lot more choice in that department, right now not so much. I'm still looking for a single-slot solution in the mid-range segment.
Interesting stuff. I hope that the 300 series has more tweaks than just better binning and higher clocks at lower voltages. Based on what they've said about the 390's, I'm guessing they've made at least some slight architectural tweaks.
That said, I feel that the Fiji / Fury line is the real stuff to watch with this launch, and to a lesser extent, the 4GB version of Tonga in the 380. If they release a 380X with a 384-bit memory bus, that might be quite interesting too.
The fact that the 380 / Tonga has full support for Freesync and good DX12 support makes it the one I'm most interested in. I hope AMD can turn it around with this launch and get Nvidia down from its 75% marketshare dominance, which is not great for competition at all.
I like NVidia (especially EVGA cards), but I'm going to try my best to buy and promote AMD until the marketshare's more even.
I was looking at the 4GB variant of the 380 released by Asus/MSI today. I'd be impressed if the price weren't $80 CAD higher than the 2GB variants, which pushes me away from it and toward the 390 instead.
HardOCP's review of the R9 390 and R9 390X shows them to be a straight-up rebrand of the 290 and 290X, with 8GB of VRAM a single card can't use. No one should be surprised though; it's been known that the R9 and R7 series would be nothing but two-to-three-year-old cards with slightly different clocks and more memory. But yeah, the niche high-end Fiji cards, woo hoo and all. Two new cards with new architecture after two to three years of circling the airport. Bravo, AMD.
What review were you reading? I read the HardOCP review as well, and all they did was call AMD out on its 4K push, as they stated that it will take two cards to fully utilize the additional memory (no surprise there, as neither the 980 nor the 290X are great cards for that, but the 390X may be once you factor in CrossFire).
"As it is, we will leave it there for you all to discuss the merits of Radeon R9 390X. At $429 the MSI R9 390X GAMING 8G is priced better than GeForce GTX 980, and finally gives the GTX 980 competition from AMD, which was lacking until now."
So, apparently AMD has allowed the press to publish AMD's internal benchmarks. Sorry, the article is in German, but scroll down for the slides (those are in English). One is benchmarks, one is benchmark settings. They are directly from AMD: http://www.pcgameshardware.de/AMD-Radeon-Grafikkar...
Looking good so far, I'd say. :) While they are likely cherry-picked and all, the provided settings should help put things more into perspective.
What's more interesting is why AMD turns off AF in more than half of those games. Is something wrong with Fiji's TMUs? It doesn't make any sense; 16xAF has been trivial for years, ever since the 9700 Pro (typically a 10% perf hit or less), and a staple of any benchmark since at least a few generations before then.
Probably because a 10% deficit from AF would put it at or below the performance of a 980 Ti. Nvidia's AF performance has been pretty stellar for the past many generations. We'll know for sure soon, I guess.
Yep, that's what I'm guessing as well, glad others are thinking the same thing with such a glaring omission of what really should be a standard checkbox setting at the driver level.
Some of these seem pretty compelling, but others are really lame. A full Tonga card with 4GB would be pretty cool; it would fill the gap between Nvidia's 960 and 970.
The 390 and 390X are both really solid, except for that damn TDP. The fact that they beat the 980 Ti in memory capacity and bandwidth is great, and they work well as either 1440p or entry-level 4K cards, but the GPU performance compared to the much cooler and lower-power 970 and 980 is only slightly better.
Frankly, none of these cards give anyone a reason to upgrade like the 200 series did. The 290 was all the rage for quite some time. On the other hand, prices are very reasonable, and will only go down.
It doesn't beat up on it. It's comparable, and dependent upon the games you play. It's trading higher power consumption for a lower price. It should shine over the 980 for 4K gaming, but I'd be more inclined to go for the vanilla 390s instead: lower price, lower watts, and a pair may actually beat 980s in SLI as well.
Same thing happened when Hawaii debuted... a new GPU line (Hawaii) and older Tahiti with a new name and a little better performance (better fabs), with a launch price of ~$549. Now we have a new GCN and hi-tech GPU with HBM, named Fiji, at $549 for the air-cooled version (Fiji Pro, 3 fans), plus the water-cooled Fiji XT for OC at $649... Great job, AMD.
Well, for me, getting the 390 or 390X over a GTX 970 / 980 would require a new PSU, adding at least $100 - $120 to the cost. The GTX 970 and 980 on the other hand actually use less power than my old GTX 670. So those poor power consumption figures really do matter.
The GTX 670 was rated at around 170W at load, if I remember right. So chances are you're using a 500-600W PSU already just to be on the safe side, which could easily handle a 390/X.
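A quick headroom estimate backs this up. All of the component figures below are rough assumptions (board powers vary by vendor and workload); the point is just that a typical mid-size PSU has room for a Hawaii-class card:

```python
# Rough PSU headroom check with assumed, illustrative component figures.

psu_w  = 600
gpu_w  = 275   # R9 390/390X typical board power (approximate)
cpu_w  = 100   # mainstream quad-core under gaming load (assumed)
rest_w = 75    # motherboard, drives, fans, RAM (assumed)

load_w = gpu_w + cpu_w + rest_w
print(f"Estimated load: {load_w} W -> {load_w / psu_w:.0%} of a {psu_w} W PSU")
```

Around 75% utilization at full gaming load leaves a sane margin, which is roughly where PSU efficiency curves tend to be happiest anyway.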
The Fury line is amazing, no doubt about it. Fury X is the graphics card to get for high-end gaming: small form factor, and it's water-cooled, which means it's really quiet, runs really cool, and keeps the inside-case temperatures low as well.
That said, the rest of their lineup is garbage, actual garbage! All of the 300 series are rebadged 200 series cards with absolutely no optimizations either. I thought that they would at least update the feature set and introduce new stuff, but no.
The expected price cuts are nowhere to be seen either. From the low end to mid-range at $150, to the high end at $330 and $430, these are all high prices. I can find a 290 for $240 these days; I can find a 290X for $350 these days. Why are the rebadged turds more expensive than the 200 series turds?
AMD are done. I expected a new line, a new architecture, or at least significant changes to the point that it's almost a new architecture, but no, we got the same old shit cards from 4 years ago, and the 200 series were rebadged turds from the 7000 series.
Same fucking price, same performance, same power consumption, same crap features as 4 years ago, and higher prices. Bye bye, AMD, I'm going to Nvidia, you morons! I was waiting for AMD to release their "new" line to upgrade, but no, they're morons releasing 4-year-old turds.
If the rebranded line stays competitive, the general public does not care, and neither do I. If we go by the benches, the R9 390X is on par with the GTX 980 or slightly above it at certain resolutions, for $70 cheaper. This is a deal, rebranding or not.
Welp. The Rebrandeon 300 series happened, contrary to what many of you said months ago after the AMD Financial Analysts Day, so I guess I told you so. :)
Fury looks to be a solid part though. Good thing AMD priced it accordingly; those early pricing rumors wouldn't have held up well in the marketplace, I don't think.
Still some unknowns going forward, however: how badly 4GB will impact Fury, how much HBM will benefit it, and exactly what features AMD's GCN can and cannot do in DX12. We'll see soon enough, I am sure; hopefully AMD doesn't forget to send some Furys to AT in the next few weeks! :)
I just don't see these rebrands as being at all competitive. The Hawaii rebadges, in terms of pure performance, are roughly on par with the GTX 970 and GTX 980 respectively, but they use about twice as much power and have a far more outdated feature set (to name a few examples: GM204 has HDMI 2.0, hybrid HEVC decoding, better support for DirectX 12 features, and DSR is superior to AMD's VSR except perhaps on GCN 1.2 cards). Given that, pricing the Hawaii rebadges so close to the GM204 offerings just isn't realistic. Worse for AMD, Nvidia has a lot more room to drop prices (the GTX 980 should really be quite a bit lower; the big price gap between it and the GTX 970 only made sense when it was a flagship card). Because GM204 has a smaller die, a memory bus half as wide, and much lower power requirements, it's much cheaper to produce GM204 cards than Hawaii cards. So AMD can't gain profits if they try to compete on price.
What AMD really should have done was release Tonga as the R9 380 (instead of the R9 285) in the first place. They could then have rebranded Hawaii to R9 390/R9 390X at the same time (last September). If done as a "virtual release" (no reference cards), this would serve the purpose of getting the terrible reference Hawaii benchmarks off the charts and replaced with more representative figures from AIB cards. AMD could have stuck with the old 200-series branding for everything below Tonga, and just discontinued the Tahiti cards. This would have saved AMD the humiliation of having to rebrand the over three-year-old Pitcairn chip yet again. The impact of rebadging would have been reduced, since there would have been one truly new chip (Tonga) and only one rebranded chip (Hawaii). And when the Fury release came around, it wouldn't be marred by having to share the stage with a bunch of shoddy rebadges.
One thing is for sure, AMD really needs to have a whole new lineup for 2016 when the FinFET process finally rolls around. The fact that they were only able to afford two new designs for all of 2015 (Fiji and Carrizo) is worrisome. They're going to be bringing out the server/HEDT version of Zen, plus a 28nm desktop Excavator APU, in 2016. Can they afford to spin three or more FinFET GPUs on top of that? Southern Islands (7000 series) had 3 new chips released in the first wave, so I'd consider that a minimum requirement for a viable launch of a new generation. If AMD releases only one FinFET chip and rebadges everything else yet again, I think even their remaining die-hard fans are going to desert them.
Fully agree with the first paragraph, but it is obvious Nvidia will have to adjust the 980 price again, not so much against the 290X, but more against pressure from the Fury Pro and Nano. They have some time before this happens. The 970 still has no peer though; it's pretty amazing AMD didn't try to be more competitive here.
2nd paragraph, I'd disagree slightly. AMD was clearly waiting for Fiji to be ready to combat Nvidia's Maxwell series, but I guess HBM growing pains and their biggest die ever delayed that process. I still think AMD was caught unprepared on 28nm pt. 2 and they just didn't think Nvidia would launch a whole new generation on 28nm. Once Nvidia came out with the 970/980 they had to scramble and go forward with Fiji and just HBM1.
Personally, I think they should've just gone with their old series designations. Fury X/Pro/Nano just aren't fast enough or priced high enough to justify a different nomenclature. 390X WCE, 390X, 390 would've been just fine, which would have allowed them to sell Hawaii rebrands as 380/X, Tonga rebrands as 370, Bonaire as 360. No Rebrandeon chuckles. :)
They'll certainly have a whole new lineup for 14/16nm FinFET, but how they release it will be a telling sign of how far behind their R&D has fallen. They can get a pass for expecting 20nm to be ready and getting caught off guard with 28nm redux, but they won't get a pass for 14/16nm.
Biggest pressure on the 980 is probably from the ti ;)
If the rumours about (really very limited to start with) availability for Fury are true, they couldn't really have put it in the stack as a 390 on those grounds alone. Feels like they had to launch it a little earlier than really ideal (the 4GB too, of course), but I suppose it's more about getting some mindshare at this point anyway.
You can, I think, see a good chunk of their future FinFET lineup. Just die-shrink Fury, halve its TDP, and there you go for the mid-range line :) Might be quite effective if doing it that way lets them get there a bit ahead of NV.
Top end less clear, but that'll probably need HBM2 which seems like it might be a hold up.
I'd agree there is pressure from the 980 Ti, but it won't be as immediate as a $500 Fury Nano or $550 Fury Pro right on top of it, depending on performance of course.
But yeah, good points about availability and mindshare. I guess it does make some sense if they think quantities will be limited, but at the same time, I think sell-outs of Fiji even as a 390X would go a lot further than the readily available Add to Cart we see now at Newegg.
AMD Fury is a new dawn in graphics cards, setting a new standard in PC graphics. The first to use HBM, and the most to benefit from DX12. The Fury massively obliterates Nvidia, and Nvidia fanboys are going to say "well, Nvidia is coming out with Pascal soon". Sorry, but by the time Pascal comes out, AMD's Arctic Islands will be out to smash Pascal.
Zzzzzzz.... 28nm and no architectural optimization for lower power consumption. This is not worth the press. Come 16/14nm FinFET+ next year, we'll have something really worth talking about.
I won't apologise for them, as 100W is a lot; however, you need to ask whether the money you save buying the card will translate into more cost in the long run.
We're still waiting on why Pitcairn's TDP appears to have dropped a significant amount - perhaps not every part is suffering from higher power consumption.
"Along these lines, because AMD is not releasing new GPUs in this range, the company is also forgoing releasing reference cards. Reference cards were built for testing/promotional purposes" So there will be no reference cards??? I'd really like to have one, because I can't afford the R9 Fury/Nano :(
KitGuru getting their sample pulled is probably due to their YouTube guy constantly dumping on AMD with rumors and wrong data. There will be enough samples out there that we'll get a proper picture of Fiji.
Proper picture? Sounds like AMD is trying to punish anyone who doesn't go according to script, an awful lot like those canned FreeSync "reviews". What is even more amazing is that major tech sites are going along with it all, lol.
And you can see it's not just KitGuru, it's also Eteknix. Shame too; lots of readers are going to be misled by AMD's attempt to cover up these rebrands, because major sites like AT are too worried about losing out on press samples and making those Day 1 ad-revenue-generating reviews.
From what I've read, Eteknix is not being pulled; it's just that there are 10 samples for Europe and they are not on the priority list, but they will eventually get a sample to test. Frankly, I can understand AMD pulling samples from people who trash-talk them and run the rumor mill with false data. It's simple: you wouldn't shake hands with someone who trash-talks you constantly.
As for the low sample number, AFAIK Nvidia had only one sample of one Titan card, I think it was the Z, and there was no such outcry.
Which goes to show that Nvidia has better products than the ones they sell to their fanboy club. When they see a competitor releasing a faster GPU, they just pull one off the shelf and say "hey, look here". A faster GPU magically appears and people continue to praise Nvidia. Without AMD, they would probably still be rebranding the GTX 700 series. Also, the Fury X2 will bury anything from Nvidia.
How does AMD justify the R9 390X at $429 with about half the stream processors and a memory bus one-eighth the width of the Fury X's? I understand it has 8GB of RAM, expensive as that is. Maybe applying HBM to that card would have been better; in fact, if I'm guessing correctly, the 390X may only need 2GB of HBM and would still have higher bandwidth than the 512-bit bus it's working with. My concern is whether it will be able to use the 8GB of RAM before it hits a bottleneck on the bus.
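Worth noting that an 8x narrower bus doesn't mean 8x less bandwidth, because bus width and per-pin data rate trade off. The figures below are the commonly cited launch specs for each card (treat them as approximate):

```python
# Bus width vs per-pin data rate: bandwidth sketch with published specs.

def bandwidth_gbs(bus_bits, gbps_per_pin):
    """Peak memory bandwidth in GB/s from bus width and per-pin rate."""
    return bus_bits * gbps_per_pin / 8  # divide by 8: bits -> bytes

gddr5_390x = bandwidth_gbs(512, 6.0)    # 512-bit GDDR5 @ 6 Gbps/pin
hbm_fury_x = bandwidth_gbs(4096, 1.0)   # 4096-bit HBM1 @ 1 Gbps/pin

print(f"390X GDDR5: {gddr5_390x:.0f} GB/s")
print(f"Fury X HBM: {hbm_fury_x:.0f} GB/s")
```

So despite the 8x bus-width gap, the Fury X's bandwidth advantage is about 1.33x (512 vs 384 GB/s), because GDDR5 runs each pin 6x faster than first-generation HBM.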
Double-stacked memory cards very rarely perform better than their 4GB counterparts; you might see an fps or two increase, if any at all. At 1080p, no game at full ultra will utilize more than 3GB of VRAM, so it just makes AMD look silly.
Why don't you look at this from another perspective: AMD is bringing four Fiji cards out by the end of the year, Fury X2, Fury X, Fury, Nano. That's four cards on the new architecture, and the rest of the lineup is refreshes.
Now what makes this so different from Nvidia? They have Titan X, 980 Ti, 980, 970, and 960. They have one more card in the mix than AMD, and if you want to go lower with Nvidia, you have to go a generation back.
The difference being that the R9 390X is performing around the GTX 980 for a lot cheaper. The GTX 970 and GTX 960, being Maxwell, look rather silly in this big picture. So by the end of the year AMD will have 5 cards (4 new + 1 rebrand) that are performing on the high end.
The difference is Nvidia didn't slice and permutate GM200 4x and then rebrand Kepler 3x as if they were new parts. You do realize the difference right?
The whole point of new generations and new chips is to support new features and improvements across the entire range, vs. the highly stratified and in many ways deficient limitations of even Fiji (no HDMI 2.0, no HEVC, no DX12_1, etc.).
According to TechReport, Fiji does indeed support HEVC.
Agreed, though, the lack of HDMI 2.0 is a big deal, especially since Project Quantum tries to position Fiji as a set-top gaming option. And there were quite a few people looking forward to Fury Nano for set-top HTPC/gaming systems... until they learned that they wouldn't be able to hook it up to their 4K TVs at anything above 30 Hz.
As for DX12_1, it won't really matter until long after the card is obsolete anyway.
I would like to see the 390X with the HBM architecture. Soon 4K will be the standard and if bandwidth holds back the performance of the card I think we should move to the new tech. Also I'm curious to see how a system with half the processors of the high end card with high bandwidth capabilities will work. Will it be able to use the bandwidth, or will the processors just max out and not fill/use it up?
Soon 4K will be standard... Well, 1080p has been around for more than a decade and is still not standard :( I don't think 4K will make it anytime soon, even with prices falling to more affordable levels.
We are still far, far away from, let's say, an affordable $350 decent-quality 4K gaming display and a $350 GPU that would drive 4K games with ease. Even if the tech were there today, I presume vendors prefer to give us all this in very small steps... so they make more money.
To the article: I'd prefer AMD to come out with new tech first, even if it were delayed a few extra weeks, and have everything ready (drivers, benchmarks, GPU available from day 1), rather than these pathetic, more expensive rebrands :(
Disappointed in AMD and in this article, which reads like an advertisement. Why haven't they switched to 20nm at the very least by now? Has their usual fab really been unable to get its 16nm line up and running? All this PR and rebranding money should have been used to buy access to 20nm.
To clarify things a bit, AMD won't be going with TSMC for FinFET. Instead, they will be using Global Foundries, which is licensing the 14nm FinFET process from Samsung.
Sorry, not a completely valid excuse: Nvidia surprisingly did release four entirely new ASICs even though we stalled on 28nm, with impressive gains for each over its predecessor (GM107, GM204, GM206, GM200).
Nvidia did plenty, they realized they wouldn't get 20nm and went forward with impressive results on 28nm. AMD probably wasn't as prepared, and ended up with 1 seemingly impressive GPU, but was forced to rebrand the rest.
The rumor has been for a while that the 20nm node at TSMC isn't suitable for high-power chips like desktop GPUs (http://www.extremetech.com/computing/199101-amd-nv... ). That's why Apple can make a 20nm mobile SoC, but AMD and Nvidia can't build a powerful GPU with it. I don't know this for a fact, but if true, building a desktop GPU on that node could prove disastrous. So it's not about "buying in". And if I might point you to an already famous 20nm product: the Snapdragon 810, which is prone to overheating. Both AMD and Nvidia will be skipping 20nm and going to 16nm FinFET immediately (the fact that TSMC's 20nm isn't FinFET probably has something to do with the overheating).
The rebranding was known/expected, but what happened to the 3xx series being OEM-only? Now they've gone full retail with the rebranding nonsense and only added the Fury as something new (which isn't even out yet) and more VRAM to Hawaii (which probably won't improve much except the price)? WTF?
Sad part is there's no 380X. 'Nice' that they removed the venerable Tahiti (given it is "old" and missing features like TrueAudio, HEVC, etc.) but why doesn't Tonga get an XT variant? R9 285 is Tonga Pro, R9 380 is Tonga Pro. No XT or "full" Tonga? Why NOT?! I mean if you're going to take rebranding to this degree (and you did nearly the same in HD 7xxx/8xxx to R9 cards) you'd expect they put a *little* effort into doing a few things different...at the very least.
But nope we get Fury (Fiji) and some more VRAM on Hawaii and that's all she wrote. Fiji, impressive as it may end up being, is going to be beyond what most people can either afford or justify spending. 8GB of ram on Hawaii does nothing except for benefit maybe a few games at 4K+ resolutions, while "justifying" keeping the prices high on Hawaii based cards, no doubt.
Then take out Tahiti, a GPU that was still more than competent despite its age, and replace it with something in between its Pro and XT variants--Tonga Pro. Don't offer something to fill the gap left by removing Tahiti XT, just make people jump to Hawaii Pro or buy into what I'm sure will still be an overpriced Tonga card. Yay, they've raised the clock a bit--so what? It's not like that couldn't be achieved with a mild OC anyway. If Tonga is anything like Tahiti, 1GHz is an easy/mild/no-sweat OC, so going from 918 to 970 on it is trivial. What is needed is a Tonga GPU with over 2000 shaders and not a penny more than $199.
This is a mess, that's all I can say. I'm glad for Fury to be coming out sometime soon but they could have at least done the rebranding part a little more elegantly and offered a few more things than new names and similar pricing. I mean there's rebranding and then there's just repackaging and this seems more like the latter.
At the very least I thought there would be a Tonga Pro card at $150-180 and a Tonga XT at $200 ish. But nope, forget that, just new names and Fury hype. And that's all this is anyway... hype leading to Fury. Same cards, very close to zero tweaks/improvements, new names, and a pathway for Fury to launch and blow people's socks off... at prices of $600+. Good grief.
I understand that AMD is maybe stuck in making progress in performance under 28nm. What I cannot understand and I think is a complete insult to us, the customers, is all this bullshit about "the new era of PC gaming" when your 300 series is a complete rebrand of the 200 series.
So I see now that the 'new era in PC gaming' was just same or less speed than a 980 Ti, while having to deal with the issues of fitting a wc into a case which more than likely already has one, sold at a price which isn't usefully less than a 980 Ti. *sigh*
ES_Revenge, I should add, sadly my socks are very much still in place, and that was from reading one of the more +ve leaning reviews on Fury X (comments suggest other site reviews are less favourable).
Well, it's not actually less than a 980 Ti, but it is "effectively" less in a way. One, you're getting a CLC water cooler, so that's about $60-80 of value; two, you're getting HBM, which is some pricey tech as well. Without those things it would (or rather should) have cost like $200 less. But then again, maybe it would have then just had Hawaii-level performance without the HBM.
Oh and another note Fury X doesn't even have HDMI 2.0, HDCP 2.2, or [full] HEVC decode, from the looks of it.
I agree, it's pretty underwhelming. Really bad handling of the rebrands and then Fury is really nothing spectacular either. At (and often below) a 980Ti is certainly not what the AMD fanboys were expecting; and nothing, as you put it, to blow anyone's socks off.
Looks like Nvidia has squarely won this round and I think if they *really* wanted to put the squeeze on AMD, they'd only need to drop prices of the 970, 980, and 980 Ti a little bit. I mean Fury X still has the above "advantages" where one might still consider purchasing one over a 980 Ti. But what if the 980 Ti sold for $50-100 *less*? AMD would be in some hot water then!
Despite all this though, my disappointment still comes from the fact that we're not going to see any advances in the mid-range with either price or performance, until at least 2016 (by the looks of it). R9 280X, R9 380, and GTX 960 will remain the only choices...at high prices. Only one of those can do HDMI 2.0 and HEVC though, and again it ain't AMD.
Due to the midrange-blahs, I don't see myself getting rid of my R9 280 any time soon but if I were it certainly would *not* be an AMD card, that's for sure. And this is probably the first time I've had to say this since...well, actually this is the first time I've had to say that. Pretty bad picture for AMD right now, IMO.
We are going into day two now and still no Fury X review, when every single one of the other major tech sites has had their reviews up, including AT's sister site, Tom's Hardware. As a member here since circa 2000, it's a real shame that my time spent here continues to diminish and the gaps get filled by competitors. I knew it wouldn't be the same after Anand sold and split.
Where is the Fury X review? Are you still completing it (in which case the delay is more than acceptable, given your usual review quality), or are you not going to publish it at all? Did you have some problems? Is the card too fast and Nvidia asked you to change tests and settings? Or is it too slow and AMD is threatening not to give you any future samples to review? I'm just curious :D
I think a simple apology in a Pipeline mini-article, or a podcast with a brief description of what's causing the delay, would have made you look so much better. There are those among us who are not following closely and don't know that the guy responsible for GPU reviews, Ryan Smith, is/was sick; for those people, you're so much behind! The problem with this particular launch is that it's sooooo hyped. If it were a mid-range card launch, an OEM-card launch, or one of those silent refresh launches, then no one would have really cared, but with a card carrying so much anticipation and so many what-if questions, you really can't be this silent about the cause of the delay!
I, and I think the majority who hang out here, hold too many good memories of this place. I've gained almost all of my technical knowledge from it and I hold it dear to me, but there have been some nasty mistakes in the past couple of weeks that made me really worried: first, the mistitling of Fury's webcast as a "paper launch"; second, falling behind and not reviewing any of the already-available-at-launch 3xx Radeons; third, this stretching delay of the Fury X review without any apology.
Mistakes have to be made so that we learn from them. I still have faith in this place, and I hope it won't turn into another once-prime thing that mismanagement has turned into a dud.
PS. You already have most of the tasks lifted off your shoulders; making so many mistakes in a short time, while your primary task is the editorial stuff, makes me inclined to whisper in your ear to take some time off and re-examine how Anand was managing all things at once. I still believe in you!
So can I return my graphics card on grounds of false advertising? I purchased an R7 370 Nitro (Sapphire) and it states "latest version of GCN" when, in fact, it's actually the 2012 version, GCN 1.0? Screw that. I'm going to return it for a 380. They can't be doing this to people. I just bought a 7850, is all. What was also misleading was that I assumed it was the direct upgrade of the 270X. How the heck did the 370 become a downgrade? Actually, I'm just going to take it back for Nvidia and get a 970. They don't do this crap to people.
happym777 - Thursday, June 18, 2015 - link
Wowsas! AMD hit a home run with the Fiji architecture. Fury X will deliver unbelievable performance, very likely the fastest GPU in the world, and that water cooling system is incredible: quiet and cool running. A little birdie had to have been whispering in Nvidia's ear about how amazing this card was going to be when they decided to launch their 980 Ti so quickly. Fury X for 650 dollars may well leave the 980 Ti in a questionable position.
I'm surprised how well AMD did here with the execution of this part after coming off the Hawaii launch parts. Fury X is the card to get.
firemediumtard - Thursday, June 18, 2015 - link
RIP Titan X. RIP 980 Ti.
#RIPMAXWELL2015
sums it up
happym777 - Thursday, June 18, 2015 - link
Fury reviews on the 24th? Find out then.
Nfarce - Thursday, June 25, 2015 - link
So what do you AMD fans have to say about the Fury X now? Yeah, that's what I thought... deafeningly silent.
royalcrown - Thursday, June 18, 2015 - link
What a bunch of shit talking, nobody here has ANY idea what Fiji will do. It might be awesome, it might also be "meh"!
nathanddrews - Thursday, June 18, 2015 - link
So far, all the benchmarks (yes, they're leaked, but logical) have shown Fury X to be as fast or faster than the 980 Ti while being quieter and cooler thanks to its stock CLC. Also, you get the best version for overclocking for $649 out of the box, whereas you pay over $700 for the higher-end OC 980 Ti models. Hopefully any reviews we get from AT will include stock/stock and OC/OC comparisons, and then we'll know for sure. Whatever happened to FCAT?
firemediumtard - Thursday, June 18, 2015 - link
http://i.imgur.com/jiuredx.jpg
Fury X eating the 980 Ti for lunch.
Morawka - Thursday, June 18, 2015 - link
Imagine if the 980 Ti had HBM. That's the only reason the AMD card is faster; it's definitely not due to the shader arch. Kudos to them either way, but the difference in most games is <5 FPS.
Morawka - Thursday, June 18, 2015 - link
Another thing to consider is that you give up day 1 drivers going to AMD. Nvidia always has day 1 drivers for every popular game. And let's acknowledge that a lot of games run like shit on AMD regardless of drivers.
Cryio - Thursday, June 18, 2015 - link
That's a myth. AMD didn't have day 1 drivers for Witcher 3 (and didn't have drivers for the next 3 weeks either) and all their GPUs had no problems with the game. W3 even performed spectacularly after the tessellation override improved performance at zero image quality loss.
As for Nvidia's Day1 drivers, you are talking about the drivers that basically destroyed 600/700 performance, right? Lmao.
n13L5 - Wednesday, July 15, 2015 - link
The only thing I'll miss from Nvidia (for desktop cards anyway) will be CUDA, which cuts my Blu-ray conversion down to under 10 minutes. Otherwise I'm no fan of Nvidia, with their fraudulent re-badging of old stuff and asinine lawsuits.
nilepez - Thursday, June 18, 2015 - link
Day 1 purchases are for suckers. Best to wait until the beta testers find the problems and the devs fix them. That said, until there are reviews, I have no way of knowing if this card is as good as AMD says it is or not.
HunterKlynn - Friday, June 19, 2015 - link
Go back to the NVidia forums, troll.firemediumtard - Thursday, June 18, 2015 - link
Unfortunately you have not absorbed the new information on Fiji. An engineering marvel.
Not only that but an incredible cooling system. The 980 ti has a
firemediumtard - Thursday, June 18, 2015 - link
...a hot, loud, noisy blower. The AMD driver is great.
Syphadeus - Friday, June 19, 2015 - link
Are you on funny pills? The stock Nvidia Titan-style blower coolers have been around for years now for a reason; they're extremely good. They are quiet and exhaust most of the hot air out of the case.
It seems YOU have failed to notice that AMD are giving all their rebadged chips to OEMs to fit their own coolers, because AMD's stock coolers were so terrible, being both very noisy and completely inadequate to cool hot Hawaii chips.
Yes, it's great that their new top end is using a CLC but you could argue the only reason it needs this is because the new Fury chips are going to be very hot and very power hungry. An engineering marvel would be engineering a high performance chip with lower power use and heat output, instead of making something hot and power hungry and then trying to make up for it afterwards using liquid cooling.
FMinus - Thursday, June 18, 2015 - link
It's pretty much always 5-15 FPS between same-segment cards from AMD/Nvidia, yet the fanboys always take sides and one side always "stomps" the other. I usually buy the cheaper card and laugh at those 10 frames.
Cryio - Thursday, June 18, 2015 - link
Lulwhat. Most reviewers are comparing the 290X with the 980 when the 290X is in the 970's segment. It's always like this: Nvidia has a faster (and much more expensive) GPU that is only somewhat faster than AMD's flagship, even though AMD's GPU is usually cheaper than Nvidia's 2nd-best GPU and faster than it too.
Ddog45 - Thursday, June 18, 2015 - link
LOL, your sentence started with "imagine if"; after that, nothing matters. Keep imagining, fella.
3DVagabond - Sunday, June 21, 2015 - link
nVidia doesn't have the tech know how to pull off HBM yet. AMD is simply a gen ahead with Fiji.Kutark - Thursday, June 18, 2015 - link
I love how the settings in all your charts are cherry-picked to show the AMD card in the best light. No consistency at all in any of those, and in most situations it's less than a 10% difference. Considering the Fury X is already water-cooled, I'm really curious to see what kind of OC headroom it has compared to GM200 (which has very high OC headroom).
Cryio - Thursday, June 18, 2015 - link
Not only headroom for OC, but probably more silent and cooler than the 980 Ti. And we've yet to see GCN 1.2 on a high-end GPU, so I'm curious.
soldier45 - Thursday, June 18, 2015 - link
Gtfo.
Da W - Friday, June 19, 2015 - link
Both AMD and Nvidia leapfrog each other every generation. It was AMD's turn.
SkyBill40 - Friday, June 19, 2015 - link
Nah, this was playing catch-up, seeing that AMD was already behind.
Ryan Smith - Thursday, June 18, 2015 - link
The plan is to include FCAT with the Fury X review. As for OCing, you'll see the usual OC benchmarks.
nathanddrews - Thursday, June 18, 2015 - link
Cool beans!
chizow - Friday, June 19, 2015 - link
So it looks like AMD is punishing sites that cast Rebrandeon in a negative light by withholding Fury samples. Any truth to this?
@Ryan, care to comment on the validity of this? Does it have anything to do with why you don't have the 300 series to review? I know you've repeatedly stated AT's independent status, but it looks like AMD is making tech sites stick to the "script" or face punishment.
http://www.eteknix.com/things-go-from-bad-to-worse...
http://www.kitguru.net/site-news/announcements/zar...
Trenter - Friday, June 19, 2015 - link
It's nothing Nvidia has never done; they've blacklisted several review sites in the past. KitGuru made their own bed by throwing jabs in their articles before they had any idea what the hardware was capable of. In Eteknix's case, I'm not familiar with them, so I don't know. Ryan probably doesn't reply to blatant fanboys, by the way; I'm sure he's seen your fanboyish comments littering the forums.
chizow - Friday, June 19, 2015 - link
Sounds like typical apologist BS from a known AMD fanboy. I do recall Nvidia having a similar incident with Hardware Secrets, but it turned out to be just a miscommunication.
http://webcache.googleusercontent.com/search?q=cac...
But that didn't stop AMD fanboys, like you, from getting all that fake angst and fanboy anger out with torches and pitchforks in tow, did it? And now AMD does the same thing and its all "well Nvidia did it first" kinda ironic since AMD fanboys love to claim this is the type of behavior that turned them off Nvidia for good? :D
Hypocrite much? Of course, AMD fanboys.
FlushedBubblyJock - Wednesday, June 24, 2015 - link
I totally agree, chizow, and I'm glad someone calls them out on it, because it's ridiculous.
bigboxes - Saturday, June 20, 2015 - link
C'mon chizow. Stop with the nVidia fanboy antics. Just stop. It's way past old.
FlushedBubblyJock - Wednesday, June 24, 2015 - link
It's not old at all; someone has to be able to counter the lies and the spin. Are you against that?
D. Lister - Saturday, June 20, 2015 - link
AMD has a history of sending cherry-picked samples to reviewers:
http://www.tomshardware.com/reviews/radeon-r9-290-...
http://techreport.com/review/25712/are-retail-rade...
http://www.legitreviews.com/amd-radeon-r9-290x-pre...
http://www.maximumpc.com/amd-radeon-r9-290x-press-...
Maybe this time around, there just aren't enough cherries to pick.
FlushedBubblyJock - Wednesday, June 24, 2015 - link
November 2013, and now they released it again... I saw a review of the 390X with power usage, and it used 468 watts max draw and 368 watts average.
I think AMD has done an amazing job slamming voltage and power into that old core to crank it up, again, but the 290x can just be overclocked to match it, since no one has found any real differences at all yet, and that is pretty sad.
just4U - Thursday, June 18, 2015 - link
Well.. no. It's going to be pretty good according to leaks, but I do get what you mean by the earlier comments. Until we get some full-on testing in, fanboys from "both" camps can bite me. The majority of us like both companies, I think, and constantly switch between the two; there is no favoritism.
anandreader106 - Thursday, June 18, 2015 - link
Amen.
Refuge - Thursday, June 18, 2015 - link
+1
TEAMSWITCHER - Thursday, June 18, 2015 - link
Actually, for me... AMD is... near DEATH. Their CPU business died years ago, and today their GPUs all seem to lack... competence. The very presence of that radiator on the FURY X is a sign of their desperation. They are going to clock the bejesus out of that card to put up numbers that resemble a 980 Ti. But look at the pricing: if it were any faster... any faster at all... they would demand a higher price to pay for the more expensive HBM implementation. I know that benchmarks are coming, but the bigger story will be the aesthetics. Having to find a place for that radiator is a deal breaker, and the air-cooled Fury Vanilla (or whatever the hell it's called) is not going to put up great numbers... obviously, just look at the pricing... it never lies.
xthetenth - Thursday, June 18, 2015 - link
Yes, clearly offering a product with vastly better cooling is a sign of desperation, as shown by the EVGA hybrid kits to make up for the shortcomings of the 980, 980 Ti, and Titan X. But then again, we actually know what the clock speed on the Fury X is going to be, and it's not ridiculously clocked, which any idiot could figure out if they extrapolated from the numbers for a properly cooled 290X.
It takes a really special sort of customer to manage to have a case with no 120mm fan slots and to confuse pricing for quality. You probably think the 980 and Titan X's performance is proportional to their price. You also probably think a 780 Ti was an acceptable buy compared to a 290X in retrospect.
Kutark - Thursday, June 18, 2015 - link
It's only offering a better cooling solution if the clocks are the same as they would have been air-cooled. Slapping a water cooler on it and then overclocking it 20% to get better framerates isn't "better". You could easily do the same thing with a GM200 and get huge gains. Even a moderate overclock on the stock air cooler of a 980 Ti already puts any of the leaked numbers in the ground. So I'm still not sure what everyone is so jazzed over.
It's a great card, and I'm glad to see they're finally on level playing ground (or slightly better). This is ultimately good for the consumer, as it's going to force Nvidia to up their game. That being said, the card is a solid competitor, nothing else. It's not "beating" the 980 Ti in any meaningful way.
Trenter - Friday, June 19, 2015 - link
GM200 doesn't benefit from watercooling because Nvidia artificially limits board partners. GPU Boost throttles like mad in demanding games because it's always power limited.
chizow - Friday, June 19, 2015 - link
@Trenter really? How much FUD left in the tank?
http://www.techpowerup.com/gpudb/b3260/inno3d-ichi...
http://www.evga.com/articles/00935/EVGA-GeForce-GT...
just4U - Thursday, June 18, 2015 - link
AMD has been well known for keeping prices in line, right back to their Radeon 8800 (400 CAD as opposed to 700 for the GF3). Once in a while they flirt with absurdly high prices, but it's never been an easy sell for them... no matter how good their product is. They've been near death for decades, but thankfully they are still out there, or prices would be a lot higher from Nvidia and Intel.
just4U - Thursday, June 18, 2015 - link
err.. 8500 lol.e36Jeff - Thursday, June 18, 2015 - link
It's rated at 275W and it's a halo card. It has the power input for up to 375W, and its cooling solution is rated to 500W, so they are not pushing things to the limits. The card itself is several inches shorter than a traditional high-end GPU would be. I'd wager it's just not plausible to dissipate 275W with a heatsink that's 3 inches shorter than it normally would be. And there is the side benefit of being significantly quieter than an air-cooled design; it is expected to be at around 30 dB at a normal gaming load.
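For readers wondering where the 375W figure comes from: it follows from the standard PCI Express power budget. A quick sketch, assuming the spec-typical connector limits (75W from the slot, 75W per 6-pin, 150W per 8-pin) and the dual 8-pin layout reported for the Fury X:

```python
# Spec-defined ceilings for each PCIe power source, in watts.
SLOT_W = 75         # PCIe x16 slot
SIX_PIN_W = 75      # one 6-pin PEG connector
EIGHT_PIN_W = 150   # one 8-pin PEG connector

def board_power_limit(six_pins=0, eight_pins=0):
    """Maximum in-spec board power for a given connector layout, in watts."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

fury_x_limit = board_power_limit(eight_pins=2)
print(fury_x_limit)  # 375 -- comfortably above the 275W rated board power
```

So the card has roughly 100W of headroom between its rated power and what the connectors can deliver within spec.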
I have looked at installing radiators on my R9 290s. Including one is AMD's answer to Nvidia's great blower. In terms of value, there is more with a factory-included radiator. This is just fanboy hate. I look forward to seeing the benches. Another plus is that this card will fit in smaller cases because it is shorter; I had case issues with my 290.
Based on AMD's pricing, I would expect parity or better performance with the 980 Ti, as AMD does well in sales when they price directly while offering better value and performance.
amd_furion - Thursday, June 18, 2015 - link
Ha, coming from an Nvidia fanboy, when Nvidia rigs games using anti-competitive "fraudworks", which artificially makes Nvidia cards seem faster than AMD cards. Pretty sad when you have to pay game developers to use proprietary, closed-off software to make your graphics cards seem better than AMD's, when in actual raw performance AMD wins.
amd_furion - Thursday, June 18, 2015 - link
Hi buddy, I live in the land of facts, not FictionWorks (GameWorks) like you do. You Nvidia fanboys would still think Nvidia is better even if AMD cards were 100x better. That's how crazy and out of touch you guys really are. There's no reasoning with your type of people.
Trenter - Friday, June 19, 2015 - link
Most people won't have a problem finding a place for that radiator. What about 10-11 inch air cooled cards? Some people may not have a place for those and they seem to do fine.piiman - Saturday, June 20, 2015 - link
"Having to find a place for that Radiator is deal breaker"
LOL, you put it in the same place your exhaust fan in the back of the case is NOW. It's not hard to find.
DCide - Thursday, June 18, 2015 - link
How can anyone be a fan of NVIDIA, but not AMD?If AMD fails in the GPU market, then NVIDIA will become just as complacent as Intel has in the CPU market. The NVIDIA you love will be no more.
Jumangi - Thursday, June 18, 2015 - link
Fanboys ignore things like logic and common sense.
Kutark - Thursday, June 18, 2015 - link
Ain't that the truth. I've been more Nvidia-biased the last 4 or 5 years, mainly because they've just worked, and prior to that I had myriad annoyances with AMD cards (primarily driver-related stuff; the hardware was always solid, but just annoying things here and there). That being said, I'm glad to see AMD has upped their game.
darkfalz - Friday, June 19, 2015 - link
Disagree. NVIDIA is still competing against the NVIDIA card you already have. Nobody with a 980 is going to buy a 1080, or whatever they end up calling the next range, if it's merely 10-20% faster. NVIDIA in their own charts frequently compare their latest high/midrange GPUs with their own of a few years ago that they know people are hanging on to, such as the venerable 8800 GT and, more recently, the GTX 680. I prefer NVIDIA's focus on interesting software features and power efficiency.
chizow - Friday, June 19, 2015 - link
@darkfalz, exactly. People keep repeating this "AMD needs to survive for competition" meme endlessly while they are holding on to their 5850 or 6970 or whatever, waiting for a worthwhile successor, lol.
hero4hire - Saturday, June 20, 2015 - link
Ah, the "competition is bad" capitalist. Tell me again how Comcast needs to earn your repeat business through customer service and infrastructure improvements for growing demand. Thank god we're seeing major revolutions in desktop CPUs and my 2500K is practically worthless from all the innovation.
RSIlluminator - Friday, June 19, 2015 - link
Finally a reasonable comment. Real competition is tons better for the consumer, and that's why most companies want to dominate their fields. At that point they can set prices to whatever they want, and this happens in tech all the time.
Ubercake - Friday, June 19, 2015 - link
True. Too many times AMD has promised us greatness and not delivered... For the sake of competition, I really hope Fiji is the game-changer.
The0ne - Monday, June 22, 2015 - link
Look on the bright side: when you want to sell something, you know whom to go to. It's like religion; they'll believe anything.
The0ne - Wednesday, June 24, 2015 - link
Reviews are out. Eat it, you people, but keep believing in your religion!
SkyBill40 - Thursday, June 18, 2015 - link
Pfft. Talk and hype are cheap. We'll see once some actual head-to-head benchmarks are made. Until then, keep your rah-rah to yourself.
Zak - Thursday, June 18, 2015 - link
Riiiight..... Any official benchmarks yet? Nvidia will simply drop prices on the 980 Ti, bump clocks on the Titan X, drop its price, and eat AMD's lunch again :) Not to mention the crap AMD drivers that will take a year to mature to any level of usability.
extide - Thursday, June 18, 2015 - link
WHY don't YOU like AMD's drivers? Be specific. What exactly is "so shitty" about them...
chizow - Thursday, June 18, 2015 - link
Well, I'll start. How about the ability to adjust AA and CF bits without replacing the entire game profile? The enthusiast community has been asking for this forever. AMD took a step in this direction with a loadable binary, but it's just that, a binary that the user can't edit. Nvidia, on the other hand, has exposed these bits forever, going back to Coolbits, allowing the user almost complete control over SLI, AA, and 3D Vision bits. This is important for Day 1 compatibility and general compatibility in cases where there is no Day 1 driver, which is of course especially horrible for AMD, but that of course is an entirely different issue. ;)
Will Robinson - Friday, June 19, 2015 - link
Back from WCCF with more lies and FUD, eh? I laughed as you got pwned there by the regulars.
The sting of defeat bites deep; try to toughen up, princess.
chizow - Friday, June 19, 2015 - link
@will, are you a regular? If you're done getting pwned here, feel free to join in over there. :) Put on your big boy pants and join the conversation!
Gigaplex - Thursday, June 18, 2015 - link
I get BSODs with the current "stable" driver, usually when the monitor goes into a low-power mode. The beta driver appears to fix the BSOD (not sure, as I didn't test long enough), but introduces massive graphics corruption at the desktop when resuming from sleep a fair chunk of the time. Sometimes when the computer wakes, on any driver, the monitor gets no signal until I unplug and replug it.
fingerbob69 - Friday, June 19, 2015 - link
Have you tried updating your >>monitor's<< driver?
FMinus - Thursday, June 18, 2015 - link
If Nvidia drops the price, which I doubt they will, the customer won thanks to AMD.
chizow - Thursday, June 18, 2015 - link
AMD already dropped their prices thanks to Nvidia, so the customer already won, you're welcome.
FMinus - Thursday, June 18, 2015 - link
Indeed they did, and there will be many more happy customers when Fury releases.
Time to ask those who bought a Titan X if it was worth spending the extra $350 for the 2 and a half months before the 980Ti released. I doubt many are still that happy. If AMD kicks the 980Ti price down below $600, I think they might be happy again.
Nagorak - Thursday, June 18, 2015 - link
The people who bought it don't care. The only people who spend $1000 on a graphics card are people who make way more money than they know what to do with, or who are frivolous spenders who just blow money without thinking.
chizow - Friday, June 19, 2015 - link
Well, I certainly had to give pause; there are always choices to be made with disposable/discretionary income. But in the end, seeing some of these games at 1440p with fully maxed details and 4xMSAA, or in 3D Vision (GTA5 at 70FPS per eye in 3D), is just jaw-dropping, so yeah, money well spent.
RSIlluminator - Friday, June 19, 2015 - link
You're missing a huge demographic here, and that is professional users. Titans are basically used by scientists, engineers, previz architects etc. A gamer doesn't really need a Titan card, unless they want the latest and greatest.
Trenter - Friday, June 19, 2015 - link
Fury X will dominate Titan X in double precision; Nvidia had to strip all their DP hardware out to compete with Fiji.
MapRef41N93W - Saturday, June 20, 2015 - link
A) Who cares about double precision? It's used only by actual professionals, and mostly for scientific simulations.
B) NVIDIA developed the Maxwell architecture years ago, and it was designed to remove DP for the purpose of saving power. Fiji was not even a blip on the radar when Maxwell was in development.
C) How much is AMD paying you for these comments?
chizow - Friday, June 19, 2015 - link
Nah, Nvidia won't drop prices at all, even if Fury X is +/-5% as expected. AMD knew exactly what their target was and they hit it, barely, needing a WCE and 275W to do it. Any 980Ti will be able to meet or beat that performance with just a slight OC.
As for Titan X, certainly worth it and happy. It still looks to retain top-end performance, and it certainly won't be subject to concerns about VRAM, like the Fury X will at the resolutions and settings it is expected to run at, especially in multi-GPU configs. I wasn't sure, so I went through the paces myself; even 6GB with 2x980Ti wasn't enough, so I traded them for a 2nd Titan X and cash. Now I'm happy and will be until Pascal drops next year or 2017. :)
MapRef41N93W - Saturday, June 20, 2015 - link
I'm extremely happy with my Titan X that I have under water, running at 1520/8200 clocks. My card will eat any Fury X for dinner (5278 FS Ultra score), as Fury will be lucky to hit even 1200MHz based on that extremely dense die plus what we've seen from GCN in the past.
Also, let's not forget about 4GB VRAM. Don't care if it's HBM or not, it's still 4GB. The minute it was clear that Fury would not have 8GB (as originally rumoured) and instead 4, I bought my Titan X.
FlushedBubblyJock - Sunday, June 21, 2015 - link
Yes, I see the average OC on water for the Titan X core is an astounding 1497... hahahah, oh man, no wonder it's all silent on everything about it except AMD fans whining about the price. They are so sad; the average OC on water for their failed precious 290X is 1187.
Let's face it, the Titan X is an OVERCLOCKING MONSTER.
Ranger101 - Friday, June 19, 2015 - link
Resorting to criticising AMD drivers now, eh Chizoo? Scraping the bottom of the green barrel like this is worth its weight in GOLD dude.
Nvidia's finest is definitely getting flaky. Not too long to go now...
chizow - Friday, June 19, 2015 - link
Are you happy with AMD drivers, Rangoo? If yes, that probably explains why they are so bad. There are no less than a half dozen major bugs/feature limitations AMD would have to address before I would consider using them again, and that starts with Day 1 drivers. What do you think the chances are AMD comes out with a Day 1 driver for Batman: Arkham Knight on Tuesday? Or is it going to be more CryWorks for AMD and their users?
soldier45 - Thursday, June 18, 2015 - link
Pfft really, 2 980TIs > any and all AMD.
FMinus - Thursday, June 18, 2015 - link
It's like saying the 295X2 still eats all Nvidia cards, including the Titan.
7beauties - Thursday, June 18, 2015 - link
I'm an AMD fanboy, but giving Nvidia's Titan X and GTX 980 Ti their last rites before AMD's Fury X launches is premature. As I recall, the now-named Fury X edged out Nvidia's Titan X and GTX 980 despite having twice the memory bandwidth. I would've thought such an advantage would have Nvidia's flagship cards eating the Fury X's dust, but instead AMD just manages to edge Nvidia by its front fender. Any AMD fanboy should be concerned.
llMattll - Friday, June 19, 2015 - link
Titan X > Fury X > 980 Ti. Titan X still wins.
SkyBill40 - Friday, June 19, 2015 - link
Bwahahahaha... NO.
MapRef41N93W - Saturday, June 20, 2015 - link
Where's the ignore button for idiots such as SkyBill40 on this forum?
fivefeet8 - Wednesday, June 24, 2015 - link
It's so funny reading these types of comments now.
chizow - Monday, June 29, 2015 - link
Haha yeap, as usual, I feel vindicated with each of my comments and AMD fanboys are running for the hills or crawling back under rocks. :D
Thatguy97 - Sunday, June 28, 2015 - link
rest in pasta fermi
xenol - Thursday, June 18, 2015 - link
And all NVIDIA has to do is drop the price on their cards appropriately. They've had a 9 month lead, and for all we know Pascal is going to be ready in 6 months. Then we'll be chasing down this rabbit hole again.
Nothing new here.
firemediumtard - Thursday, June 18, 2015 - link
Pascal is TSMC 16nm FinFET, not ready until 2H 2016. A year away and irrelevant currently.
Nvidia price adjustments? Possible, with Fury X tearing the 980 Ti a new one.
Yojimbo - Thursday, June 18, 2015 - link
I'm not sure why you are sure about that. From what I've seen, 16nm FF+ is scheduled to begin volume production in Q3 2015, which allows for the real possibility of Pascal arriving in 1H 2016. It's just a matter of when Pascal is ready and what the yields of the process are like.
The Fury X has 4GB of RAM, is water cooled, and comes without NVIDIA's software environment. HBM may have a certain marketability, but I think the Fury X is going to need a significant price/performance advantage to outsell the 980 Ti.
Shadow7037932 - Thursday, June 18, 2015 - link
TSMC has historically had issues with their new nodes/manufacturing process. I would not be so confident in getting Pascal as soon as Q2 2016.
Yojimbo - Thursday, June 18, 2015 - link
That's sort of trying to turn the argument on its head. He said it for sure WOULDN'T be out in 1H 2016. A vague "TSMC has historically had issues" doesn't support that claim. You are replying to me as if I made the claim that Pascal for sure WOULD be out in 1H 2016, which I didn't (I said "which allows for the real possibility of Pascal arriving in 1H 2016").
Pantsu - Thursday, June 18, 2015 - link
According to SK Hynix roadmaps, HBM2 has been slated to start in Q2 2016, so I wouldn't expect Pascal with HBM2 until H2 at the earliest.
http://hexus.net/media/uploaded/2015/3/18f593cf-ca...
Yojimbo - Thursday, June 18, 2015 - link
I don't get how your conclusion follows from your premise. If HBM2 will be ready in Q2 2016, why can't products using HBM2 ship in Q2 2016? Also of great importance is the date of that Hynix slide, because it doesn't entirely agree with the information from http://cdn.overclock.net/7/7e/500x1000px-LL-7ef5be... for instance. In addition, did Hynix ever introduce the DDR4 3DS that chart seems to promise for Q1 2015? I can't find anything about that. The chart may be old and inaccurate.
Pantsu - Thursday, June 18, 2015 - link
It's never really the case in this industry that the day you get production going is the day you'll see it in shipping products. These things take a lot of time to get into shipping products. Fabs tell all kinds of stories about how X nm products will be ready by this or that time, when in reality the products using that tech can take another year to actually be available.
That said, yeah, the chart can of course be old or inaccurate, but it's what we've got. At least it's not pure conjecture.
Yojimbo - Friday, June 19, 2015 - link
It's already been shown to be inaccurate, so it's not very useful. Besides, we don't even know what the dates on that chart were supposed to mean at the time it was made. Is that when we see it in products, when volume production starts, or when they start selling the chips? Also, it just shows a circle centered on the border of Q1 and Q2. Who's to say products wouldn't be out by the end of Q2, even if there is some time between whatever event they have in mind for the chart and actual products showing up? My point is that everything in that chart is already pure conjecture, so I don't see much reason to be sure that an HBM2 product can't possibly be ready before H2 2016.
Pantsu - Sunday, June 21, 2015 - link
It seems to me a foolish hope to think they'd be ahead of schedule, given that HBM1 actually started volume production in Q1 2015 instead of what was marked on this product map. If anything, HBM2 is further delayed.
colhoop - Thursday, June 18, 2015 - link
Why do people always talk about NVIDIA's software environment as if it is some major advantage they have over AMD? It seems to me that they are both just as good, and from my experience with NVIDIA and AMD, I've had fewer driver issues with AMD, believe it or not.
But yeah, the Fury X has benchmarks released by AMD using Far Cry 4 at 4K Ultra settings, and it outperforms the Titan X by more than 10 average fps. I know the benchmark isn't as reliable since it was released by AMD, obviously, but still, it really makes you wonder. I definitely think it will outperform the 980 Ti, especially if AMD claims it can outperform the Titan X, but of course we shall see :)
Pantsu - Thursday, June 18, 2015 - link
Nvidia certainly spends a lot more money and effort on their software currently.
- They have more timely driver updates aligned with big game releases
- SLI support is much better than AMD's sparse and late updates to CF
- GeForce Experience works much better than AMD's third party equivalent
- Better control panel features like DSR, adaptive V-sync. AMD's efforts tend to be like half-baked copies of these. AMD hasn't come up with anything truly new in a long while and all I can do is smh at 'new features' like FRTC that's so simple it should've been in the control panel a decade ago.
I do think for single GPU driver performance and stability there isn't much of a difference between the two, regardless of how many driver updates Nvidia does. Actually the latest Nvidia drivers have been terrible with constant TDR crashes for a lot of people. But that's anecdotal, both sides have issues at times, and on average both have ok drivers for single GPU. It's the above mentioned things that push Nvidia to the top imo.
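(For what it's worth, the FRTC jab above is fair in one sense: a naive frame rate target really is just a sleep-until-deadline loop. A minimal sketch follows; the `render` callback is a hypothetical stand-in for the frame work, and this is of course not how any real driver implements its cap.)

```python
import time

def run_capped(render, target_fps=60, frames=10):
    """Naive frame-rate cap: after each frame, sleep away whatever
    time remains in that frame's slot before starting the next one."""
    frame_budget = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render()  # stand-in for the real per-frame work
        next_deadline += frame_budget
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # don't start the next frame early
```

A real implementation has to worry about timer resolution and frame pacing jitter, which is presumably where the actual engineering effort goes.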
xthetenth - Thursday, June 18, 2015 - link
I like people talking about how AMD didn't get drivers out for Witcher 3 immediately and ignoring that NV's drivers were incredibly flaky and they needed to be reminded Kepler cards exist.
Zak - Thursday, June 18, 2015 - link
What? Zero issues playing Witcher 3 since day one.
blppt - Thursday, June 18, 2015 - link
"I like people talking about how AMD didn't get drivers out for Witcher 3 immediately and ignore that NV's drivers were incredibly flaky and they needed to be reminded Kepler cards exist."Pfft....to this DAY the Crossfire support in W3 is terrible, and ditto for GTA5. The former is a TWIMTBP title, the latter is not---even has AMD CHS tech in it. I run Kepler Titan Blacks and 290x(s) and there is no question NVIDIA's drivers are far, far better in both games. Even the launch day Witcher 3 drivers are superior to AMD's half-assed May 27th 15.5 betas, which havent been updated since.
For single cards, I'd agree, AMD drivers are almost as good as Nvidia's, except those Gameworks titles that need to be reoptimized by AMD's driver team.
But there isn't even a question that Nvidia gets betas out much, much quicker and more effectively than AMD.
And if you aren't into betas, heck, AMD hasn't released an OFFICIAL R9 2xx driver since December, LOL. Which is what annoys me about this Fury launch: once again, AMD puts out this awesome piece of hardware, and they've been neglecting their current parts' drivers for months. What good is the greatest video card on the planet (Fury X) if the drivers are rarely and poorly updated/optimized?
chizow - Thursday, June 18, 2015 - link
@xthetenth - complete rubbish. While AMD fanboys were boycotting the game over PC-only features, Nvidia fans were enjoying the game on day 1, courtesy of Nvidia, who gave the game away to new GeForce owners.
How is CF+AA doing in Witcher 3, btw? Oh right, still flaky and broken.
Yojimbo - Thursday, June 18, 2015 - link
I am mostly referring to their growing investment in gaming library middleware, i.e., GameWorks.
TheJian - Thursday, June 18, 2015 - link
Have you heard of CUDA, GameWorks, or DAY 1 drivers for game releases? You seem to be oblivious to the fact that CUDA runs in 200+ pro apps and is taught in 500+ universities. Never mind the fact that NV releases drivers constantly when games ship, not 4-6 months (sometimes longer) later. You are aware the last AMD WHQL driver was December, correct?
http://support.amd.com/en-us/download/desktop?os=W...
Dec 8 is NOT GOOD. They can't even afford to put out a WHQL driver every 6 months now. Get real. Nvidia releases one or two EACH MONTH. And no, I don't believe you have more problems with NV drivers ;) I say that as a current Radeon 5850 owner :)
AMD's R&D has been dropping for 4 years, while NV's has grown and is now greater than AMD's with fewer products. Meaning NV's R&D is GREATER and more FOCUSED on GPUs/drivers. Passing on consoles was the best thing NV did in the last few years, as we see what it has done to AMD's R&D and lack of profits.
AMD needs new management. Hopefully Lisa Su is that person, and Zen is the right direction. Focus on your CORE products! APUs don't make squat, and neither do consoles at the margins they accepted to get the deals. There was a VERY good reason NV said exactly that; they passed because it would rob from CORE PRODUCTS. We see it has for AMD, and it hasn't just robbed from hardware either. Instead of approaching companies like CD Projekt for Witcher 3 to add TressFX 2+ years ago, they wait until the last 2 months then ask... ROFL. That is lack of funding, then excuses about why perf sucks and complaints about HairWorks killing them. An easy fix in a config/profile for the driver solves tessellation for both sides (only Maxwell can handle the load), so it's a non-issue anyway, but still, AMD should have approached these guys the second they saw wolves on screen 2+ years ago showing HairWorks.
http://www.tomshardware.com/reviews/amd-radeon-r9-...
Check the pro results... AMD's new cards get a total smackdown; 3 of the 5 are by HUGE margins. SHOWCASE, Maya, and Catia are all massive losses. Note you'd likely see the same in Adobe apps (Premiere, AE, not sure about the rest) since they use CUDA. There is a good reason nobody tests Adobe and checks the CUDA box for NV vs. OpenCL for AMD. ;) There is a reason Anandtech chooses Sony, which sucks on Nvidia (google it). They could just as easily test Adobe with CUDA vs. AMD with Sony Vegas. But NOPE. Don't expect an AMD portal site to run either of those tests... LOL. Even Tom's won't touch it, or even respond to questions about why they don't do it in the forums :(
chizow - Thursday, June 18, 2015 - link
@colhoop it is largely going to depend on your use cases. For example, GeForce Experience is something many Nvidia users laud because it just works: makes it easy to maximize game settings, get new drivers, record compressed video. Then you have drivers, driver-level features (FXAA, HBAO+, Vsync that works), and day 1 optimizations that all just work. I've detailed above some of the special reqs I've come to expect from Nvidia drivers to control 3D, SLI, AA. And the last part is just advertised driver features that just work: G-Sync/SLI, low power mode while driving multiple monitors, VSR, Optimus, all of these just work. And finally you have the Nvidia proprietary stuff: 3D Vision, GameWorks, PhysX etc. Amazing if you use them; if you don't, you're not going to see as much benefit or difference.
Trenter - Friday, June 19, 2015 - link
Shadowplay is nice but game optimization is crap.D. Lister - Wednesday, June 24, 2015 - link
I disagree with your opinion regarding game optimizations. The default optimization settings are often a bit conservative for me (it doesn't factor in my CPU OC, I suppose). But you can conveniently adjust it with a single slider, instead of having to fiddle with several different in-game settings.
Kutark - Thursday, June 18, 2015 - link
Where are you guys pulling this crap from? Even the most favorable leaked benchmarks show Fury X with at BEST a 10% advantage over a 980 Ti. In most cases it's 3-5%. Explain to me (with the assumption the leaked info is accurate) how that is "tearing the 980ti a new one".
chizow - Thursday, June 18, 2015 - link
Well, fanboys. I don't see the AT Fanboy Defense Force (kyuuuu where are uuuuu) tearing them a new one, been a few hours too. ;)
Gigaplex - Friday, June 19, 2015 - link
Chizow calling someone a fanboy? Pot, meet kettle.
chizow - Friday, June 19, 2015 - link
I'm the first to admit I'm a fan of the best, which is why I can't understand why people would choose AMD when they can get better for such a small premium. You've come clean in this thread for once: have you never wondered why you bother with all that when you could've gotten a suitable alternative from Nvidia for a couple burritos at Chipotle?
FMinus - Thursday, June 18, 2015 - link
True, but the engineer said that it's the most overclockable card from AMD yet. If we can believe that guy, then with a little fiddling it can probably go toward 25%, and then the Titan X is behind it. But we'll see; I'm not claiming anything. Would be nice tho.
MapRef41N93W - Saturday, June 20, 2015 - link
Right, because Titan X's can't just clock up to 1500MHz or anything...
FMinus - Thursday, June 18, 2015 - link
They already bastardized their own Titan X with the 980Ti; how many more cuts are they willing to take, really? I guess it will be a necessity, but I don't think Nvidia will like that one bit.
Murloc - Thursday, June 18, 2015 - link
I still haven't seen the reviews.
zlandar - Thursday, June 18, 2015 - link
If that was true, why didn't they just put a Fury and a 980Ti side-by-side and benchmark them in front of the world?
I actually hope you are right, as it's unhealthy for Nvidia to keep kicking AMD's ass year after year. Last thing we need is for AMD to fold and Nvidia to jack prices even higher.
AS118 - Thursday, June 18, 2015 - link
That's what I'm saying. If AMD goes out of business, who's going to compete with Nvidia? Intel APUs? With the US Government being the way it is, Nvidia may be able to claim that Intel's on-chip "HD5000" or whatever graphics is "competition", and use that as an excuse to operate as a monopoly.
The best we could hope for is that Samsung or Intel or somebody else would buy the Radeon IP and launch some new GPUs to compete with Nvidia.
royalcrown - Thursday, June 18, 2015 - link
Not Samsung, they'd keep promising it's finished soon, 3 years later... Samsung 270X 2GB 2019 edition!
FMinus - Thursday, June 18, 2015 - link
If AMD goes down, someone will buy them up and it will be business as usual.
tomc100 - Thursday, June 18, 2015 - link
If AMD folds and Intel is smart, they will buy the ATI division cheap, and with their R&D they will dominate both CPU and GPU technology.
Senti - Thursday, June 18, 2015 - link
But we will cry over their prices then...
chizow - Thursday, June 18, 2015 - link
How so? My Intel CPU prices have enjoyed an unprecedented level of stability since Intel Conroe'd AMD in 2006.
Nagorak - Thursday, June 18, 2015 - link
Unfortunately, CPU performance has also enjoyed an unprecedented level of stability since then as well.
I'm still running with an i5 2500K bought back in 2011, overclocked to 4.6 GHz. Frankly, once you take the OC into account, it's basically up there with the newly released i5 CPUs.
chizow - Friday, June 19, 2015 - link
Intel is innovating and improving performance where they are most deficient; we have a number of E5-2699v3 18-core CPUs in our data center at work that would certainly disagree with your assessment. ;) But just look at what they are doing on the GPU side of things (Broadwell-C just eclipsed and destroyed AMD's best APU) or on the low end with mobile and Core M, and you'll see Intel is still innovating.
On the CPU side of things, I've had worthwhile upgrades all along the way, and I've paid less for each of them than the $450 AMD wanted for their cheapest Athlon 64, which is coincidentally what made it easy to choose Intel again for the first time in years.
E6600 $284
Q6600 $299
i7 920 $199! (middle of recession pricing)
i7 4770K $229
i7 5820K $299
silverblue - Friday, June 19, 2015 - link
You don't "destroy" something by performing 25% better at double the price. In any case, we know that AMD's most powerful APU - the one powering the PS4 - is significantly faster at graphics, it's just that AMD haven't seen a point in bringing out such an APU for the computing market... yet.Intel has a far bigger wodge of cash to throw at these things, any way you look at it.
rtho782 - Friday, June 19, 2015 - link
Please don't release that for the PC market. The GPU might be OK (no better than OK; it's a 7870, not made for high-end gaming), but the CPU in it sucks balls. It's Bobcat.
silverblue - Friday, June 19, 2015 - link
No no, I don't mean the Jaguar cores, I just mean an APU with a large GPU in the same sort of vein. Throw a quad core Carrizo in there, or something.Even if it's "only" a 7870, that's still a lot faster than even Iris 6200.
chizow - Friday, June 19, 2015 - link
I saw more than 25%, in some cases 50% uplift; that is destroying, especially given you also get a MUCH faster CPU, as you would expect from a full-blown 4-core Intel x86 Core chip. 2x the price today, but that's just standard IGP tomorrow for Intel, all at lower power than their APUs. Fusion is officially dead.
You can't compare the PS4 either: higher TDP and a much slower CPU than even AMD's APUs, so what they gain in graphics is lost in CPU, and even then Broadwell isn't too far off. Sure, Intel has more cash to throw at the problem, but they've basically proven AMD's acquisition to gain competitive advantage in integrated graphics was a complete and total failure that has effectively made two competitive companies uncompetitive in their core competencies.
OrphanageExplosion - Saturday, June 20, 2015 - link
The reason you haven't seen the PS4 APU out in the PC space is remarkably simple: you'd get nowhere near its theoretical performance level owing to the DDR3 memory bottleneck. That's why Sony paired it with GDDR5, something you can't do on PC.
Yojimbo - Thursday, June 18, 2015 - link
Yes, did you see its game benchmarks? Wow! (Maybe your message should have been scheduled for release on the 24th instead?)
bill.rookard - Thursday, June 18, 2015 - link
Cautiously optimistic here. I'd love to see some testing of some actual hardware before breaking out the champagne.
Akrovah - Thursday, June 18, 2015 - link
If you have seen benchmarks, then please, link. Otherwise you are just spewing pointless pontifications with no backing.
Fiji certainly SOUNDS impressive on paper (though I'd like more than 4GB VRAM for GTA V and Shadow of Mordor), but I have yet to see the real-world results.
fingerbob69 - Friday, June 19, 2015 - link
Here you go...http://www.forbes.com/sites/jasonevangelho/2015/06...
fivefeet8 - Friday, June 19, 2015 - link
Those are AMD-provided internal benchmarks. Not that I think they are fake or wrong, mind you. Probably best-case scenarios.
masouth - Thursday, June 18, 2015 - link
I just want to see the actual benchmarks. I hope it all plays out well for AMD, because competition is better for the market no matter which company you want to buy from. That said...
On paper it certainly looks like it will have "unbelievable" performance in all games at 1080p or less, and in the majority at 1440p. My concern, however, is that the 4GB of memory will become a bottleneck at higher resolutions and settings. No matter how powerful the processor or how fast the memory on the card is, if all the desired data cannot be loaded into video memory at the same time, then the card is at the mercy of the rest of the system's ability to feed it the required data. It may just end up being like driving a Ferrari in rush hour traffic. =/
Again, that's not as huge an issue right now, but who is buying a video card for the games they wanted to play yesterday? Tomorrow is where it's at.
Jumangi - Thursday, June 18, 2015 - link
Talk about premature, dude. We have not seen outside sites do comprehensive reviews on these cards and you already declare them the best. It would be foolish to spend money now without seeing how the high-end cards do at 4K resolutions. AMD PR isn't worth much without legit numbers backing it up.
FlushedBubblyJock - Sunday, June 21, 2015 - link
That's amazing. I'm at the 390/390X review, and instead of seeing AMD lovers crooning over the new cards... they are squealing about Fiji and Fury.
And there's 7+ pages of them doing it, not saying a word about the 390 and 390X.
That tells me how bad AMD and its fans are.
I am astounded
fivefeet8 - Wednesday, June 24, 2015 - link
Wow. Someone's got egg somewhere.
Chaser - Saturday, June 27, 2015 - link
How's that amazing performance looking now from other sites' reviews? Anandtech was being honest with its readers about these ridiculous re-brands. Now they are paying the price for their professionalism, for telling it to their readers straight instead of repeating AMD's spin.
Fury X is a catch-up to the 980Ti with near zero overclocking room. Notice the sites that have Fury X reviews didn't grade overclocking? When Anandtech's review is out, they most certainly will. So hardly a win. One model. All the rest are the same dated, overheating re-brands.
texasti89 - Thursday, June 18, 2015 - link
This new AMD release should be mainly judged by its efficiency, or performance/watt metric. Of course AMD can double and triple transistor count to achieve higher performance than the competitor; the real achievement is whether AMD engineers can attain that at a lower power budget, which I highly doubt is the case in this new generation given the push for water cooling. I would be very impressed if AMD can beat Maxwell efficiency in this release.
nightbringer57 - Thursday, June 18, 2015 - link
At least on paper, they could.
The Titan X performs around 40-50% better than the 290X in general (it depends heavily on the tests, as usual). Compared to the 290X, the Fury X has around 40-50% more of everything (SPs, texturing units, memory bandwidth), except ROPs. The power consumption of the Titan X is not that far below the 290X's (around 10% less). So on paper, barring potential architectural performance changes per SP, barring the turbo behaviour that should be better with this cooling solution, barring all factors... (again, it's a paper comparison), you end up with a card that has comparable performance at roughly 10% more power, which is quite decent.
The problem being that this performance/W level would not scale down very well, because you know, HBM. But a GTX980-esque R9 Fury "vanilla" could be possible.
Again, it's a paper comparison, we will see how it is on the 24th. But at least, there is some potential.
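(That back-of-envelope comparison can be written out explicitly. All figures below are the rough forum estimates quoted above, not measurements, and the linear-scaling assumption is exactly the "paper comparison" caveat.)

```python
# Paper perf/W comparison using the ballpark figures quoted above.
titan_x_perf_vs_290x = 1.45   # Titan X ~40-50% faster than a 290X (midpoint)
fury_x_units_vs_290x = 1.45   # Fury X: ~40-50% more SPs/TMUs/bandwidth than 290X
titan_x_power_vs_290x = 0.90  # Titan X draws ~10% less power than a 290X

# If performance scaled linearly with unit count (the big assumption),
# Fury X would roughly match Titan X performance...
fury_x_perf_vs_titan_x = fury_x_units_vs_290x / titan_x_perf_vs_290x

# ...while drawing 290X-class power, i.e. ~10% more than a Titan X.
fury_x_power_vs_titan_x = 1.0 / titan_x_power_vs_290x

print(f"Fury X vs Titan X, perf:  {fury_x_perf_vs_titan_x:.2f}x")
print(f"Fury X vs Titan X, power: {fury_x_power_vs_titan_x:.2f}x")
```

With the midpoint estimates, that works out to parity on performance at about 1.11x the power, matching the "roughly 10% more power" conclusion.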
Gigaplex - Friday, June 19, 2015 - link
HBM would help it scale, as HBM uses less power than GDDR5.
nightbringer57 - Thursday, June 18, 2015 - link
And as a side note I forgot: I think the watercooling push on R9 Fury models is simply due to the shorter PCB design (because, you know, HBM..), so if you don't want to lose this "advantage", you need a shorter radiator. Which may not be enough to cool down a good GPU. So you have to take some of the radiator off of the card, so you have to use watercooling unless you go for a 4-slot solution...
looncraz - Thursday, June 18, 2015 - link
Have you actually seen the card? The radiator goes on a chassis exhaust port, NOT attached to the PCB...
Unless, of course, you are calling a heatsink a radiator... and are actually referring to what would need to be done for air cooling (in which case you'd just go longer, not wider).
nightbringer57 - Thursday, June 18, 2015 - link
I don't know why, in my mind it was a hybrid air+water solution.
In any case, it still follows the reasoning: when I say "taking some of the radiator off the card", it means that some (some being 100%) of the heatsink (radiateur in my baguette mother tongue, hence the confusion with "radiator", sorry) is not on the PCB anymore ;)
cjs150 - Thursday, June 18, 2015 - link
"And as a side note I forgot, I think the watercooling push on R9 fury models is simply due to the shorter PCB design "I mused about that yesterday as well. I suspect the issue is that the shorter PCB limits the size of fans that would look sensible and as we all know, usually from bitter experience, small fans and GPUs usually result in something that sound like a jet fighter taking off.
I want to see Fury with a custom waterblock for us custom water loopers, but having the power connections on the back of the card makes for very clean builds.
xthetenth - Thursday, June 18, 2015 - link
On the other hand, Nano. Of course that looks like they went for one big and good fan.bernstein - Thursday, June 18, 2015 - link
Could go either way, but only because of the performance boost of HBM. Since it's GCN 1.2, Fury X's stream processors have essentially the same efficiency as the R9 285's.
Bottom line: a GTX 980 Ti with HBM would be far more efficient.
However, since Nvidia will only use HBM with Pascal, AMD has another year to match Nvidia's efficiency.
SantaAna12 - Thursday, June 18, 2015 - link
I will not be judging it this way.
I will be looking at the fps Fury can provide on higher-resolution / higher-refresh displays.
Yojimbo - Thursday, June 18, 2015 - link
I don't think they will beat Maxwell's efficiency, but I think between the architecture changes they've made and the efficiency benefits of HBM, they will approach it. It looks like with Fiji, AMD may have reduced double precision performance in exchange for die space/efficiency.
xthetenth - Thursday, June 18, 2015 - link
Or we could judge things that you buy to make frames quickly on the basis of how fast they make frames relative to the money you spend on them, and if we want to save some watts, go to the places where saving huge numbers of watts is trivial, like lightbulbs.
Or are we just going to be entirely about performance/watt, pack the entire desktop segment up, and play games on our phones?
Yojimbo - Thursday, June 18, 2015 - link
Saving watts with your GPU is also trivial.
tomc100 - Thursday, June 18, 2015 - link
All of these refresh cards should have been released months ago to compete against Nvidia's GTX 900 series, instead of leaving a giant vacuum for Nvidia to go uncontested and gobble up market share. Also, since none of them uses the new HBM, there's no excuse for not releasing them for so long. AMD's marketing strategy is the worst. This is how you go bankrupt and run your ATI division into the ground.
Qwertilot - Thursday, June 18, 2015 - link
Fairly well forced to run down their existing 2xx stock first, I'd think? They did that at very low prices of course, so it was probably about as good an effort to compete as this rebadge/refresh. They really need 14nm to come relatively soon, and a good range there.
Yojimbo - Thursday, June 18, 2015 - link
I think they just didn't have all the technology ready. They've really taken a beating the past 9 months. I am guessing an inventory write-down would have been preferable to the loss of market share they've recently incurred.
Cailin - Thursday, June 18, 2015 - link
Really want an AnandTech review of the 390 and 390X. There seem to be some great numbers coming from other sites that already have their reviews up. Very surprising, given they are rebrands with more memory and adjusted clocks. I don't know if it was just driver work since I last saw 290X numbers or what? Anyway, what I really want to know is whether the 290/290X actually performs the same as the 390 series (adjusted for clock speed). So please retest a 290X if you have one handy, not a crappy reference one obviously. If the 290(X) matches the current numbers I am seeing, I will be picking one up second hand/fire sale for sure. I really thought there was more of a gap between the GTX 980 and the Hawaii cards, but perhaps I was wrong.
Cailin - Thursday, June 18, 2015 - link
NM, this was already done at Hardocp.com with their MSI 390X review. At the same clock speed the rebrand 390X is the same speed as the 290X. Think I'll be picking up a cheap 290 or 2.
Black Obsidian - Thursday, June 18, 2015 - link
Check out a few different reviews. Guru3D shows the 390X beating the 290X by up to 15% at higher resolutions, and generally matching or running slightly faster than the GTX 980. The 390X is certainly no revelation in gaming, but at $50 less than the 980 with equal or better performance, maybe it and Fury will at least help edge nVidia's pricing down out of "we've got a monopoly and we know it" territory.
chizow - Thursday, June 18, 2015 - link
Did you bother to read what he wrote? Clock for clock, the chips are identical = rebrand. The 290X scores are either reference, custom cooled, or different stock clocks, which is no surprise given the 290X has gone through so many permutations over the years. In reality, the last 8GB 290X that hit the market is really not much different than the 390X except for a different BIOS, cooler shroud and box. Board shots of the XFX card have already confirmed this.
Nagorak - Thursday, June 18, 2015 - link
The 390X clearly has faster memory, so you can't just overclock the 290X to equal it. The 290X might still be a better buy, but they're not entirely identical cards.
chizow - Friday, June 19, 2015 - link
They're the same ICs as the 8GB 290X, so yes, you can just over- or underclock them. The 290X has been on the market for nearly 2 years, so of course it has gone through gradual improvements in yield for both the GPU and the memory; if you compare a launch reference throttling 290X to a 390X you're going to see a big difference. But if you compare an 8GB 290X that launched last year, you're going to see minimal difference.
http://www.hardocp.com/article/2015/06/18/msi_r9_3...
Rebrands. Hopefully AT does a similar clock-for-clock, apples-to-apples comparison, but from the tone of Ryan's intro and his reluctance to call them rebrands, I highly doubt he wants to go in this direction. :)
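The clock-for-clock check being argued about here is just linear scaling arithmetic: if two cards are the same silicon, performance should track core clock almost exactly. A minimal sketch (the fps and clock numbers are hypothetical, purely for illustration):

```python
def predicted_fps(base_fps: float, base_clock_mhz: float, new_clock_mhz: float) -> float:
    """If two cards are the same silicon, GPU-bound performance should scale
    roughly linearly with core clock (memory and CPU limits aside)."""
    return base_fps * (new_clock_mhz / base_clock_mhz)

# Hypothetical example: a 290X at 1000 MHz averaging 60 fps predicts
# ~63 fps for a 390X at its 1050 MHz stock clock. If measured results
# match this prediction, the chips are effectively identical.
print(round(predicted_fps(60.0, 1000.0, 1050.0), 1))  # 63.0
```

A measured result well above the prediction would suggest real silicon or memory changes; matching it suggests a rebrand.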
silverblue - Friday, June 19, 2015 - link
Aren't there one or two other changes within Hawaii Refresh? I admit that rebrands leave a nasty taste, but if there are tweaks to reduce power and add one or two functions, it's not as bad as it could've been. I say "if", of course.
chizow - Friday, June 19, 2015 - link
Well, Ryan is claiming other tweaks; I've read improved tessellation and hints of VSR, but other sites have shown these are just AMD working on their drivers as they should've been all along. We will see if there is validity to the claims of actual silicon tweaks, or if Ryan is just going by the script to make sure he gets a Fury X to review.
der - Thursday, June 18, 2015 - link
yaaaaaaaaaaa!
jjj - Thursday, June 18, 2015 - link
Isn't the die 596mm², not 594? Seen a slide somewhere with 596 for the die and 1011mm² for the interposer.
Ryan Smith - Thursday, June 18, 2015 - link
Right you are. Thanks for pointing that out. I have 594 in my notes from my time with AMD, so it's possible I transcribed that wrong (or they changed something on me!).
jjj - Thursday, June 18, 2015 - link
it just moves with the temps :P
T1beriu - Thursday, June 18, 2015 - link
Disappointed that you don't have the 300 series reviews to post on launch date, considering that more than 5 review sites got theirs up today. :(
Ryan Smith - Thursday, June 18, 2015 - link
Unfortunately I don't have a card quite yet. Even if I did, I just got back from the AMD briefing yesterday evening.
creed3020 - Thursday, June 18, 2015 - link
Surprised that AT doesn't even have a card yet when numerous other publications already have cards and results...
BillyONeal - Thursday, June 18, 2015 - link
Other publications are willing to repost numbers from press release "leaks". AT is not.
extide - Thursday, June 18, 2015 - link
No, they have cards in hand ...
xthetenth - Thursday, June 18, 2015 - link
On the other hand, the benchmarks are pretty interchangeable, and the insight and analysis here is excellent as usual. This is the only coverage to really add value beyond "well it performs like an aftermarket 290 or a bit better".
chizow - Thursday, June 18, 2015 - link
Wow, really? It seems hard to believe AMD would launch these cards without seeding the press properly; even Best Buy had these cards last week.
dragonsqrrl - Thursday, June 18, 2015 - link
There probably won't be a whole lot to review. These cards basically already exist, their performance is known, and they're available at lower price points than their respective rebrands in the 300 series.
just4U - Thursday, June 18, 2015 - link
While some of the changes have been minor, there's enough going on with the 390s to warrant an extensive review of how they stack up against the 970/980. Plus everyone likes "updated" benchmarks to reflect what's currently on the market... or we'd never get the call to throw in older cards that have been reviewed to death.
dragonsqrrl - Thursday, June 18, 2015 - link
I didn't say a review wasn't necessary, I said there probably won't be much to review. As I said in my previous comment, the performance of these cards is known, literally. There's no guesswork involved, and there are no surprises coming. The only question in my mind was whether the 300 series uses new revisions of these existing GPUs, but it doesn't look like that's the case.
just4U - Thursday, June 18, 2015 - link
I disagree.. looking at benchmarks from early reviews there's certainly some unknowns, as the 390X (performance wise..) is a much needed improvement over the 290X and can compete with the 980. Its vanilla variant (the 390.. which competes with the 970..) might even be a better solution in Crossfire than two 980s in SLI for 4K gaming, which has my interest peaked... what about yours?
dragonsqrrl - Thursday, June 18, 2015 - link
No, the performance of the 290X OC'd to ~1.05-1.1 GHz, which has been done many times in the annals of benchmarking history, is well known. And thus you essentially know the performance of the 390X. I guess I could just repeat what I said in my previous 2 comments again (I basically reiterated the same thing 3 times), but they seem easy enough to go back to. And I wouldn't quite call it performance competitive with the 980. To me performance competitive implies trading blows, maybe ~3% behind on avg. Most of the time the 390X seems to fall evenly between the 290X and the 980.
"what about yours?"
No. If your interest wasn't already peaked with the 290/290X, which had a better price/perf ratio for the past ~6 months, why does the 390/390X with an inferior price/perf ratio peak your interest now?
Black Obsidian - Thursday, June 18, 2015 - link
Whether you see the 390X as competitive with the 980 seems to depend on which reviews you read. Guru3D and Hardware Canucks both have the 390X ahead of the 980 in some instances, with the former showing it overall being ~3% slower, and the latter showing it at ~3% faster. At an MSRP $50 lower (which includes custom coolers that are quieter than the nVidia reference blower you're getting at the 980's MSRP), I'd say either of those sites' results qualify as competitive.
dragonsqrrl - Thursday, June 18, 2015 - link
@Black Obsidian I think the 390X is totally competitive at its price point, but based on the reviews I've seen I wouldn't call it performance competitive with the 980. The Guru3D results are interesting, however. And I just realized that the MSI card being tested in all of these reviews uses a triple-slot cooler. Is it even relevant to draw comparisons to this anymore? It's becoming borderline absurd what AMD is having to do just to strive for performance competitiveness at certain price points.
just4U - Thursday, June 18, 2015 - link
The HardOCP review suggests competitive as well.. right in line with what anyone would expect from AMD vs Nvidia: some titles favor one while others favor the other, which puts the cards on equal footing... until we hit 4K. Standalone, the 980 and 390 are adequate (at best, depending on your comfort levels), but once you go into 2-card setups the 390 has an edge where its memory can finally be utilized (in theory..), so even the vanilla 390s in Crossfire might be a better buy over 2 980s.
dragonsqrrl - Thursday, June 18, 2015 - link
@just4U Well, I realized why some of these reviews are showing the 390X getting so close to the 980. They're using the highest clocked version of the 390X currently available. In fact it's the only version available that doesn't hover around the 1050 stock clock, and it's being measured against stock 980s. The review I referenced on Tom's used an OC'd 980.
chizow - Thursday, June 18, 2015 - link
290/X was always competitive with the 970/980 too after AMD slashed prices, people just didn't want old, hot chips with dated feature sets.
silverblue - Friday, June 19, 2015 - link
I do love how you keep bringing the heat argument up regardless of the fact that most 290/X cards have aftermarket cooling solutions. There are aspects of DX12 that AMD stands to do better at than NVIDIA. In fact, let's examine this further. In the time that AMD has released one new version of GCN, NVIDIA has launched two versions of Maxwell with differing levels of support: Maxwell 1 is 12_0 compliant, Maxwell 2 is 12_1 compliant. GCN 1.1 (you know, Hawaii) is 12_0 compliant, and came out before Maxwell 1 (about four months, actually).
Resource Binding: Hawaii supports Tier 3, Maxwell 1 supports Tier 2
Tiled Resources: Both support Tier 2
Typed UAV Formats: Hawaii supports Tier 2, Maxwell 1 supports Tier 1
Now, let's compare GCN1.0 and Kepler, both of which are 11_1 compliant (again, GCN1.0 came out before Kepler):
Resource Binding: Tahiti supports Tier 3, Kepler supports Tier 2
Tiled Resources: Tahiti supports Tier 1, Kepler supports Tier 2 (Kepler is therefore ahead here)
Typed UAV Formats: Both support Tier 1
And what about asynchronous compute engines? Here, I'll let AnandTech explain:
http://www.anandtech.com/show/9124/amd-dives-deep-...
Now, you might want to clarify "dated feature sets", especially considering that, Maxwell 2 excepted, AMD has beaten NVIDIA to the punch on DirectX compliance for the past two generations. Moreover, we don't actually know what Fiji's compliance level is yet. Additionally, how many people will have bought the 970/980 for its compliance level?
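The tier comparison above boils down to a simple lookup table. Here is a minimal sketch encoding the tiers exactly as listed in the comment; the dictionary layout and key names are my own shorthand for illustration, not a real D3D12 API:

```python
# Feature tiers as listed in the comment above (higher number = more capable).
feature_tiers = {
    "Hawaii (GCN 1.1)": {"resource_binding": 3, "tiled_resources": 2, "typed_uav": 2},
    "Maxwell 1":        {"resource_binding": 2, "tiled_resources": 2, "typed_uav": 1},
    "Tahiti (GCN 1.0)": {"resource_binding": 3, "tiled_resources": 1, "typed_uav": 1},
    "Kepler":           {"resource_binding": 2, "tiled_resources": 2, "typed_uav": 1},
}

def compare(gpu_a: str, gpu_b: str) -> dict:
    """Return, per feature, which GPU supports the higher tier (or 'tie')."""
    a, b = feature_tiers[gpu_a], feature_tiers[gpu_b]
    return {f: (gpu_a if a[f] > b[f] else gpu_b if b[f] > a[f] else "tie")
            for f in a}

print(compare("Hawaii (GCN 1.1)", "Maxwell 1"))
# {'resource_binding': 'Hawaii (GCN 1.1)', 'tiled_resources': 'tie', 'typed_uav': 'Hawaii (GCN 1.1)'}
```

As the comment notes, Hawaii leads Maxwell 1 on two of the three features, while the Tahiti/Kepler comparison is more mixed (Kepler leads on tiled resources).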
sabrewings - Friday, June 19, 2015 - link
"I do love how you keep bringing the heat argument up regardless of the fact that most 290/X cards have aftermarket cooling solutions."
It doesn't matter to me if they have aftermarket cooling solutions or not. Heat is heat and it has to go somewhere, usually out into the room I'm sitting in. A difference of 50-60W over the course of several hours is significant. Personally, if I'm going to buy a high power card, how much performance I'm getting to deal with that many watts is important to me. I debated getting the 980 Ti over the 980 because I game at 1080p on my 55" TV and it's 85W more at full chat. However, I do plan on having a VR headset (or various headsets, probably) and the 980 Ti is a great card for that.
Why didn't I wait until later? My last PC was built in early 2008 and its last video card upgrade was a GTX 275. It was time and I was tired of waiting. No regrets on my purchase yet. I make custom water cooling loops and the AMD CLC looks like a child's toy to me.
chizow - Friday, June 19, 2015 - link
Briefly read all that; none of what you wrote refutes the fact Maxwell sold incredibly well on the platform of efficiency and updated feature sets.
FL DX12_1 > FL DX12_0
HDMI 2.0 > HDMI 1.4
HEVC encode > N/A
Full DSR > hodgepodge VSR
Maxwell ACEs > GCN 1.0 ACEs
I think quite a few would buy Maxwell over AMD when they've said since September they would have full DX12 support with demos of games with Microsoft to back their claims. AMD's message about DX12 support however, has been much less clear.
fingerbob69 - Friday, June 19, 2015 - link
RIP gtx480 :/
asmian - Saturday, June 20, 2015 - link
Should be "pique", not "peak", one's interest. Curse those pesky homonyms. ;)
T1beriu - Thursday, June 18, 2015 - link
Vanilla Fury is missing from your price comparison table at $549. The price is official and it was announced by Dr. Su at AMD's E3 keynote.
happy medium - Thursday, June 18, 2015 - link
"However it also puts AMD in an interesting position versus NVIDIA, who has stuck to a flat 2GB on their competing GTX 960"
The GTX 960 has had a 4GB card on Newegg for months, for about $229.
just4U - Thursday, June 18, 2015 - link
From my perspective the good news comes in the form of the vanilla 390 card.. It's on par with the 290X according to early benchmarks (1% faster.. lol..), comes with 8GB of RAM, and is $100 less.. in Canada, with some actual stock. Since the Bitcoin craze is over on the Radeons it will also hold its resale value.. not bad.. Prices look to be slightly cheaper than the 970, which it is comparable to. Not bad overall... but I'd have preferred the 390X to have come in at that price point. Oh well.
strike101 - Thursday, June 18, 2015 - link
Hope both Nvidia and AMD release mid-to-high range power efficient GPUs.... less heat and more compact for us on ITX form factors... still stuck on a 750 Ti.... no alternatives
just4U - Thursday, June 18, 2015 - link
Hi Strike, can't really speak for either side there, as what you see is what you get for the next year or two I'm guessing.. but mid-to-high range? Nvidia has the 960 and 970 in mini-ITX type solutions.. I've been eyeing the MSI and Asus variants for some time, especially on the 970.
FMinus - Thursday, June 18, 2015 - link
I wish; a few years ago you had a lot more choice in that department, right now not that much. I'm still looking for a single slot solution in the mid-range segment.
AS118 - Thursday, June 18, 2015 - link
Interesting stuff. I hope that the 300 series has more tweaks than just better binning and higher clocks at lower voltages. Based on what they've said about the 390s, I'm guessing they've made at least some slight architectural tweaks. That said, I feel that the Fiji/Fury line is the real stuff to watch with this launch, and to a lesser extent, the 4GB version of Tonga in the 380. If they release a 380X with 384-bit VRAM, that might be quite interesting too.
The fact that the 380 / Tonga has full support for Freesync and good DX12 support makes it the one I'm most interested in. I hope AMD can turn it around with this launch and get Nvidia down from its 75% marketshare dominance, which is not great for competition at all.
I like NVidia (especially EVGA cards), but I'm going to try my best to buy and promote AMD until the marketshare's more even.
just4U - Thursday, June 18, 2015 - link
I was looking at the 4G variant of the 380 released by Asus/MSI today.. I'd be impressed if the price wasn't $80 CAD higher than the 2G variants.. which pushes me away from it and toward looking at the 390 instead.
Leyawiin - Thursday, June 18, 2015 - link
HardOCP's review of the R9 390 and R9 390X shows them to be a straight up rebrand of the 290 and 290X with 8GB of VRAM a single card can't use. No one should be surprised though; it's been known that the R9 and R7 series would be nothing but two to three year old cards with slightly different clocks and more memory. But yeah, the niche high end Fiji cards - woo hoo and all. Two new cards with new architecture after two to three years of circling the airport. Bravo AMD.
just4U - Thursday, June 18, 2015 - link
What review were you reading? I read HardOCP's review as well.. and all they did was call AMD out on its 4K push, as they stated that it will take two cards to fully utilize the additional memory (no surprise there.. as neither the 980 nor the 290X are great cards for that, but the 390X may be once you factor in Crossfire)
revanchrist - Thursday, June 18, 2015 - link
Learn more before you post. There are 4 Fury cards getting launched.
just4U - Thursday, June 18, 2015 - link
From their bottom line:
"As it is, we will leave it there for you all to discuss the merits of Radeon R9 390X. At $429 the MSI R9 390X GAMING 8G is priced better than GeForce GTX 980, and finally gives the GTX 980 competition from AMD, which was lacking until now."
gorg_graggel - Thursday, June 18, 2015 - link
So, apparently AMD has allowed the press to publish AMD's internal benchmarks. Sorry, the article is in German, but scroll down for the slides (those are in English). One is benchmarks, one is benchmark settings. They are directly from AMD:
http://www.pcgameshardware.de/AMD-Radeon-Grafikkar...
Looking good so far i'd say. :)
While they are likely cherry picked and all, the provided settings should help put things more into perspective.
gorg_graggel - Thursday, June 18, 2015 - link
Oh yeah, I meant benchmarks for Fury ofc. :)
gorg_graggel - Thursday, June 18, 2015 - link
Another addendum (this comments section lacks an edit button :P): It's in 4K (and includes Shadow of Mordor).
chizow - Thursday, June 18, 2015 - link
What's more interesting is why AMD turned off AF in more than half of those games. Is something wrong with Fiji's TMUs? It doesn't make any sense; 16xAF has been trivial for years, since the 9700 Pro (typically a 10% perf hit or less), and a staple of any benchmark since at least a few generations before that.
fivefeet8 - Friday, June 19, 2015 - link
Probably because a 10% deficit from AF puts it within the same performance or less than a 980 Ti. Nvidia's AF performance has been pretty stellar for the past many generations. We'll know for sure soon I guess.
chizow - Friday, June 19, 2015 - link
Yep, that's what I'm guessing as well; glad others are thinking the same thing with such a glaring omission of what really should be a standard checkbox setting at the driver level.
fralexandr - Thursday, June 18, 2015 - link
Are all of the memory clocks supposed to be Gbps instead of GHz?
Ryan Smith - Thursday, June 18, 2015 - link
Yes. Technically GHz was always wrong due to how DDR works, but it wasn't a real problem until HBM came along.
TallestJon96 - Thursday, June 18, 2015 - link
Some of these seem pretty compelling, but others are really lame. A full Tonga card with 4GB would be pretty cool; it would fill the gap between Nvidia's 960 and 970. The 390 and 390X are both really solid, except for that damn TDP. The fact that they beat the 980 Ti in memory capacity and bandwidth is great, and they work well as either 1440p or entry level 4K cards, but the GPU performance compared to the much cooler and lower power 970 and 980 is only slightly better.
Frankly, none of these cards give anyone a reason to upgrade like the 200 series did. The 290 was all the rage for quite some time. On the other hand, prices are very reasonable, and will only go down.
ExarKun333 - Thursday, June 18, 2015 - link
Still strange we don't have HDMI 2.0 on these or Fury. I wonder if that will hamper 4K adoption for AMD on this new gen?
jstabb - Thursday, June 18, 2015 - link
What's the word on HDMI 2.0 support? Has either the 300 or Fury series added it?
jwcalla - Thursday, June 18, 2015 - link
The 300 series does not, and word is Fury does not either. But there is DP.
amilayajr - Thursday, June 18, 2015 - link
The AMD 390X just kicks the Nvidia GTX 980. See benchmarks: http://www.forbes.com/sites/jasonevangelho/2015/06...
I let you boys decide..... I hope nvidia drops prices.......
AMD 390x sells for $429...... Nvidia GTX 980 sells for $499
just4U - Thursday, June 18, 2015 - link
It doesn't beat up on it.. It's comparable, and dependent upon the games you play.. It's trading off higher power consumption for a lower price.. It should shine over the 980 for 4K gaming.. but I'd be more inclined to go for the vanilla 390s instead.. lower price, lower watts.. and it may actually beat the 980s in SLI as well.
just4U - Thursday, June 18, 2015 - link
when paired in Crossfire of course.. (sigh..) wtb edit.
chizow - Thursday, June 18, 2015 - link
lol AMD internal benchmarks? gg
Once upon a time AT would've torn AMD a new one for this kind of stunt.
just4U - Thursday, June 18, 2015 - link
Forbes isn't really what any of us would call a tech site..
nandnandnand - Thursday, June 18, 2015 - link
Table says R9 instead of R7 on page 2.Ryan Smith - Thursday, June 18, 2015 - link
Could you please be a bit more specific? As far as I can see everything is as it should be.
LoccOtHaN - Thursday, June 18, 2015 - link
Same thing happened when Hawaii debuted... a new GPU line (Hawaii) and older Tahiti with a new name and a little better performance (better fabs), at a ~$549 launch price. Now we have a new GCN, high-tech GPU with HBM named Fiji, with a price of $549 for the air-cooled Fiji Pro (3x fan), plus the Fiji XT WC for OC at $649... Great job AMD.
LoccOtHaN - Thursday, June 18, 2015 - link
And then in summer follows Fiji Nano + later September-October dual Fiji VR -> Now I'm waiting for delivery of the AMD Fiji X WC, or maybe the Pro 3x fan...
JimmiG - Thursday, June 18, 2015 - link
Well, for me, getting the 390 or 390X over a GTX 970 / 980 would require a new PSU, adding at least $100 - $120 to the cost. The GTX 970 and 980 on the other hand actually use less power than my old GTX 670. So those poor power consumption figures really do matter.
just4U - Thursday, June 18, 2015 - link
The GTX 670 was drawing what... about 250W at load if I remember right.. So chances are you're using a 600W PSU already just to be on the safe side.. which could easily handle a 390/X
jabber - Thursday, June 18, 2015 - link
Some folks commenting here are desperately in need of a BJ! If this is all that's really important to some of you guys... wow!
soldier45 - Thursday, June 18, 2015 - link
AMD fanboys as bad as Apple ones...Michael Bay - Friday, June 19, 2015 - link
Just more desperate.
chizow - Friday, June 19, 2015 - link
And fans of inferior products. At least Apple products excel in end-user experience and functionality even if they tend to skimp on pure hardware.
slickr - Thursday, June 18, 2015 - link
The Fury line is amazing, no doubt about it. Fury X is the graphics card to get for high end gaming: it's a small form factor, and it's water-cooled, which means it's really quiet, runs really cool, and keeps the inside case temperatures low as well. That said, the rest of their lineup is garbage, actual garbage! All of the 300 series are rebadged 200 series cards with absolutely no optimizations either. I thought that they would at least update the feature set and introduce new stuff, but no.
The expected price cuts are nowhere to be seen either. From the low end to mid range at $150, to the high end at $330 and $430, these are all high prices. I can find a 290 for $240 these days; I can find a 290X for $350 these days. Why are the rebadged turds more expensive than the 200 series turds?
AMD are done. I expected a new line, a new architecture, or at least significant changes to the point it's almost a new architecture, but no, we got the same old shit cards from 4 years ago, and the 200 series are rebadged turds from the 7000 series.
Same fucking price, same performance, same power consumption, same crap features as 4 years ago, and higher prices. Bye bye AMD, I'm going to Nvidia, you morons! I was waiting for AMD to release their "new" line to upgrade, but no, they are morons and they release 4-year-old turds that can't even run the Windows OS!
FMinus - Thursday, June 18, 2015 - link
If the re-branded line stays competitive, the general public does not care, and neither do I. If we go by the benches, the R9 390X is on par with the GTX 980 or slightly above it at certain resolutions, for $70 cheaper; that's a deal, re-branding or not.
redcloudsk - Thursday, June 18, 2015 - link
Why.... why..... why..... no HDMI 2.0.............. huge disappointment for people who use a 4K TV as a monitor......
chizow - Thursday, June 18, 2015 - link
Welp. The Rebrandeon 300 series happened contrary to what many of you said months ago after the AMD Financial Analysts Day, so I guess I told you so. :)
Fury looks to be a solid part though; good thing AMD priced it accordingly. Those early pricing rumors wouldn't have held up well in the marketplace, I don't think.
Still some unknowns going forward, however: how badly 4GB will impact Fury, how much HBM will benefit, and exactly what features AMD GCN can and cannot do in DX12. We'll see soon enough I am sure; hopefully AMD doesn't forget to send out some Furys to AT in the next few weeks! :)
JDG1980 - Thursday, June 18, 2015 - link
I just don't see these rebrands as being at all competitive. The Hawaii rebadges, in terms of pure performance, are roughly on par with the GTX 970 and GTX 980 respectively, but they use about twice as much power and have a far more outdated feature set (to name a few examples: GM204 has HDMI 2.0, hybrid HEVC decoding, better support for DirectX 12 features, and DSR is superior to AMD's VSR except perhaps on GCN 1.2 cards). Given that, pricing the Hawaii rebadges so close to the GM204 offerings just isn't realistic. Worse for AMD, Nvidia has a lot more room to drop prices (the GTX 980 should really be quite a bit lower; the big price gap between it and the GTX 970 only made sense when it was a flagship card). Because GM204 has a smaller die, a memory bus half as wide, and much lower power requirements, it's much cheaper to produce GM204 cards than Hawaii cards. So AMD can't gain profits if they try to compete on price.
What AMD really should have done was release Tonga as the R9 380 (instead of the R9 285) in the first place. They could then have rebranded Hawaii to R9 390/R9 390X at the same time (last September). If done as a "virtual release" (no reference cards), this would serve the purpose of getting the terrible reference Hawaii benchmarks off the charts and replaced with more representative figures from AIB cards. AMD could have stuck with the old 200-series branding for everything below Tonga, and just discontinued the Tahiti cards. This would have saved AMD the humiliation of having to rebrand the over three-year-old Pitcairn chip yet again. The impact of rebadging would have been reduced, since there would have been one truly new chip (Tonga) and only one rebranded chip (Hawaii). And when the Fury release came around, it wouldn't be marred by having to share the stage with a bunch of shoddy rebadges.
One thing is for sure, AMD really needs to have a whole new lineup for 2016 when the FinFET process finally rolls around. The fact that they were only able to afford two new designs for all of 2015 (Fiji and Carrizo) is worrisome. They're going to be bringing out the server/HEDT version of Zen, plus a 28nm desktop Excavator APU, in 2016. Can they afford to spin three or more FinFET GPUs on top of that? Southern Islands (7000 series) had 3 new chips released in the first wave, so I'd consider that a minimum requirement for a viable launch of a new generation. If AMD releases only one FinFET chip and rebadges everything else yet again, I think even their remaining die-hard fans are going to desert them.
chizow - Thursday, June 18, 2015 - link
Fully agree with the first paragraph, but it is obvious Nvidia will have to adjust the 980 price again, not so much against the 290X, but more against pressure from Fury Pro and Nano. They have some time before this happens. The 970 still has no peer though; it's pretty amazing AMD didn't try to be more competitive here.
On the 2nd paragraph, I'd disagree slightly. AMD was clearly waiting for Fiji to be ready to combat Nvidia's Maxwell series, but I guess HBM growing pains and their biggest die ever delayed that process. I still think AMD was caught unprepared on 28nm pt. 2 and they just didn't think Nvidia would launch a whole new generation on 28nm. Once Nvidia came out with the 970/980, they had to scramble and go forward with Fiji and just HBM1.
Personally, I think they should've just gone with their old series designations. Fury X/Pro/Nano just aren't fast enough or priced high enough to justify a different nomenclature. 390X WCE, 390X, 390 would've been just fine, which would have allowed them to sell Hawaii rebrands as 380/X, Tonga rebrands as 370, Bonaire as 360. No Rebrandeon chuckles. :)
They'll certainly have a whole new lineup for 14/16nm FinFET, but how they release will be a telling sign on how far behind their R&D has fallen. They can get a pass for expecting 20nm to be ready and getting caught offguard with 28nm redux, but they won't get a pass for 14/16nm.
Qwertilot - Friday, June 19, 2015 - link
Biggest pressure on the 980 is probably from the Ti ;)
If the rumours about (really very limited to start with) availability for Fury are true, they couldn't really have put it in the stack as a 390 on those grounds alone. Feels like they had to launch it a little earlier than really ideal (the 4GB too, of course), but I suppose it's more about getting some mindshare at this point anyway.
You can, I think, see a good chunk of their future FinFET line-up. Just die-shrink Fury, halve its TDP, and there you go for the mid-range line :) Might be quite effective if doing it that way lets them get there a bit ahead of NV.
Top end less clear, but that'll probably need HBM2 which seems like it might be a hold up.
chizow - Friday, June 19, 2015 - link
I'd agree there is pressure from the 980 Ti, but it won't be as immediate as a $500 Fury Nano or $550 Fury Pro right on top of it, depending on performance of course. But yeah, good points about availability and mindshare. I guess it does make some sense if they think it will be limited quantities, but at the same time, I think sell-outs of Fiji even as a 390X would go a lot further than the readily available Add to Cart we see now at Newegg.
sorten - Thursday, June 18, 2015 - link
275 watts? Um, no.
beck2050 - Thursday, June 18, 2015 - link
These rebrands won't put significant pressure on anyone.
amd_furion - Thursday, June 18, 2015 - link
AMD Fury is a new dawn in graphics cards, setting a new standard in PC graphics: the first to use HBM, and the one with the most to benefit from DX12. The Fury massively obliterates Nvidia, and Nvidia fanboys are going to say "well Nvidia is coming out with Pascal soon". Sorry, but by the time Pascal comes out, AMD Arctic Islands will be out to smash Pascal.
agentbb007 - Thursday, June 18, 2015 - link
I am fully invested in Nvidia with 2 Titan X's and 3 ROG Swift monitors, but for competition's sake I hope Fury X lives up to the hype.
Xpl1c1t - Thursday, June 18, 2015 - link
Zzzzzzz.... 28nm and no architectural optimization for lower power consumption. This is not worth the press. Come 16/14nm FinFET+ next year, we'll have something really worth talking about.
Gothmoth - Friday, June 19, 2015 - link
These "new" AMD 300 cards are basically renamed old cards with a higher TDP.. why all the fuss? 275 watts, are you joking me? 100 watts more to get 10% more performance than Nvidia.. are people retarded to buy this AMD stuff?
silverblue - Friday, June 19, 2015 - link
I won't apologise for them, as 100W is a lot; however, you need to ask whether the money you save buying the card will translate into more cost in the long run. We're still waiting to hear why Pitcairn's TDP appears to have dropped a significant amount - perhaps not every part is suffering from higher power consumption.
JDG1980 - Friday, June 19, 2015 - link
Because they didn't feel the need to overclock the balls off Pitcairn.
lokee999 - Friday, June 19, 2015 - link
"Along these lines, because AMD is not releasing new GPUs in this range, the company is also forgoing releasing reference cards. Reference cards were built for testing/promotional purposes"
So there will be no reference cards??? I'd really like to have one, because I can't afford an R9 Fury/Nano :(
llMattll - Friday, June 19, 2015 - link
Titan X > Fury X > 980ti.
NvidiaWins - Friday, June 19, 2015 - link
Metal > Titan X > Fury > Ti
NvidiaWins - Friday, June 19, 2015 - link
@AMD pulling samples from sites that are posting negative reviews..... GJ AMD
http://www.eteknix.com/things-go-from-bad-to-worse...
FMinus - Friday, June 19, 2015 - link
KitGuru getting their sample pulled is probably due to their YouTube guy constantly puking over AMD with rumors and wrong data. There will be enough samples out there that we'll get a proper picture of Fiji.
JDG1980 - Friday, June 19, 2015 - link
It's an attempt to intimidate journalists into not saying negative stuff about AMD's products. Whether it succeeds or not, it needs to be condemned.
chizow - Friday, June 19, 2015 - link
Proper picture? Sounds like AMD is trying to punish anyone who doesn't go according to script, an awful lot like those canned FreeSync "reviews". What is even more amazing is that major tech sites are going along with it all, lol. And you can see it's not just KitGuru, it's also Eteknix. Shame too; lots of readers are going to be misled by AMD's attempt to cover up these rebrands because major sites like AT are too worried about losing out on press samples and making those Day 1 ad-revenue-generating reviews.
FMinus - Tuesday, June 23, 2015 - link
From what I've read, Eteknix is not being pulled; it's just that there are 10 samples for Europe and they are not on the priority list, but they will eventually get a sample to test. Frankly, I can understand AMD pulling samples from people who trash-talk them and run the rumor mill with false data. It's simple: you wouldn't shake hands with someone who trash-talks you constantly. As for the low sample number, afaik Nvidia had only one sample of one Titan card - think it was the Z - and there was no such outcry.
NvidiaWins - Friday, June 19, 2015 - link
Nvidia's releasing a new version of the 980, codename "Metal". This 980 will bury the Fury, just saying folks.... releasing in 2-3 months.
tomc100 - Friday, June 19, 2015 - link
Which goes to show that Nvidia has better products than the ones they sell to their fanboy club. When they see their competitors releasing a faster GPU, they just pull it off the shelf and say "hey, look here" - a faster GPU magically appears and people continue to praise Nvidia. Without AMD, they would probably continue to rebrand the GTX 700 series. Also, the Fury X2 will bury anything from Nvidia.
FMinus - Wednesday, June 24, 2015 - link
Will it come 2 months after the 980Ti at a cheaper price so it can screw over people who bought the 980Ti, like they screwed those who bought the Titan X?
gscindian - Friday, June 19, 2015 - link
How does AMD justify the R9 390X at $429 with just about half the stream processors and a memory bus one-eighth the width of the Fury X's? I understand it has 8GB of RAM, expensive as that is. Maybe applying the HBM process to that card would have been better. In fact, if I'm guessing correctly, the 390X may only need 2GB of HBM and would still have higher bandwidth than the 512-bit bus it's working with. My concern is whether it will be able to use the 8GB of RAM before it hits a bottleneck on the bus.
NvidiaWins - Friday, June 19, 2015 - link
Double-stacked memory cards very rarely perform better than their 4GB counterparts; you might see an fps or 2 increase, if any at all. At 1080p resolution, no game at full ultra will utilize more than 3GB of VRAM; it just makes AMD look silly.
FMinus - Friday, June 19, 2015 - link
Why don't you look at this from another perspective: AMD is bringing out 4 Fiji cards by the end of the year - Fury X2, Fury X, Fury, Nano - that's 4 cards on the new architecture, and the rest of the cards are refreshes. Now what makes this so different from Nvidia? They have Titan X, 980ti, 980, 970 and 960. That's one more card in the mix than AMD; if you want to go lower with Nvidia, you have to go a generation back.
The difference being the R9 390X is performing around the GTX 980 for a lot cheaper. The GTX 970 and GTX 960, being Maxwell, look rather silly in this big picture. So by the end of the year AMD will have 5 cards (4 new + 1 rebrand) performing on the high end.
chizow - Friday, June 19, 2015 - link
The difference is Nvidia didn't slice and permute GM200 4x and then rebrand Kepler 3x as if they were new parts. You do realize the difference, right? The whole point of new generations and new chips is to support new features and improvements across the entire range, vs. the highly stratified and, in many ways, deficient limitations of even Fiji (no HDMI 2.0, no HEVC, no DX12_1, etc.).
JDG1980 - Saturday, June 20, 2015 - link
According to TechReport, Fiji does indeed support HEVC. Agreed, though, the lack of HDMI 2.0 is a big deal, especially since Project Quantum tries to position Fiji as a set-top gaming option. And there were quite a few people looking forward to the Fury Nano for set-top HTPC/gaming systems... until they learned that they wouldn't be able to hook it up to their 4K TVs at anything above 30 Hz.
As for DX12_1, it won't really matter until long after the card is obsolete anyway.
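For anyone wondering why the 30 Hz cap above isn't arbitrary: a rough back-of-envelope sketch (using the standard CTA-861 total timing for 3840x2160 and the published TMDS clock limits; these figures are my assumption, not from the comments) shows 4K60 simply doesn't fit in HDMI 1.4's pixel clock budget:

```python
# Why HDMI 1.4 tops out around 4K@30 Hz while HDMI 2.0 allows 4K@60 Hz.
# Standard CTA-861 timing for 3840x2160 is 4400x2250 total (incl. blanking).
H_TOTAL, V_TOTAL = 4400, 2250
TMDS_LIMIT_MHZ = {"HDMI 1.4": 340, "HDMI 2.0": 600}  # max pixel clock

def pixel_clock_mhz(refresh_hz):
    """Required pixel clock in MHz for 8-bit RGB at this refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (30, 60):
    clock = pixel_clock_mhz(hz)
    ok = [name for name, limit in TMDS_LIMIT_MHZ.items() if clock <= limit]
    print(f"4K@{hz} Hz needs ~{clock:.0f} MHz -> fits on: {ok}")
```

4K30 needs ~297 MHz, under HDMI 1.4's 340 MHz limit; 4K60 needs ~594 MHz, which only HDMI 2.0's 600 MHz budget can carry.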
gscindian - Friday, June 19, 2015 - link
I would like to see the 390X with the HBM architecture. Soon 4K will be the standard, and if bandwidth holds back the performance of the card, I think we should move to the new tech. Also, I'm curious to see how a part with half the processors of the high-end card but high bandwidth capabilities would work. Will it be able to use the bandwidth, or will the processors just max out and not fill/use it up?
milkod2001 - Monday, June 22, 2015 - link
Soon 4K will be standard... Well, 1080p has been around for more than a decade and is still not standard :( I don't think 4K will make it anytime soon, even with prices falling to more affordable levels. We are still far, far away from, let's say, an affordable $350 4K decent-quality gaming display and a $350 GPU which would drive 4K games with ease. Even if the tech was there today, I presume vendors prefer to give us all this in very small steps... so they make more money.
To the article: I'd prefer AMD to come out with new tech first, even if it was delayed a few extra weeks, and have everything ready (drivers, benchmarks, GPU available from day 1), rather than these pathetic, more expensive rebrands :(
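On the bus-width comparison a few comments up: raw width overstates the gap. A quick sketch using the publicly listed specs (384 GB/s for the 390X, 512 GB/s for Fury X; figures are my own, not from the thread) shows an 8x wider bus yields only about 1.33x the bandwidth, because HBM1 runs each pin much slower than GDDR5:

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate / 8 bits-per-byte.
def bandwidth_gbps(bus_width_bits, data_rate_gbps_per_pin):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps_per_pin / 8

r9_390x = bandwidth_gbps(512, 6.0)   # 512-bit GDDR5 at 6 Gbps/pin
fury_x = bandwidth_gbps(4096, 1.0)   # 4096-bit HBM1 at 1 Gbps/pin
print(f"R9 390X: {r9_390x:.0f} GB/s, Fury X: {fury_x:.0f} GB/s "
      f"({fury_x / r9_390x:.2f}x)")
```

So the "one-eighth the bus" framing is about width only; the effective bandwidth difference between the two cards is roughly a third, not 8x.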
toyotabedzrock - Friday, June 19, 2015 - link
Disappointed in AMD and this article, which reads like an advertisement. Why haven't they switched to 20nm at the very least by now? Has their usual fab really been unable to get its 16nm line up and running? All this PR and rebranding money should be used to buy access to 20nm.
jwcalla - Friday, June 19, 2015 - link
It's simple:
- There is no 20nm HP node.
- TSMC is working on 16nm FF+ but it won't be available until next year.
There isn't much AMD and Nvidia can do in this regard.
JDG1980 - Friday, June 19, 2015 - link
To clarify things a bit, AMD won't be going with TSMC for FinFET. Instead, they will be using Global Foundries, which is licensing the 14nm FinFET process from Samsung.
chizow - Friday, June 19, 2015 - link
Sorry, not a completely valid excuse. Nvidia surprisingly did release 4 entirely new ASICs even though we stalled on 28nm, with impressive gains for each over its predecessor (GM107, GM204, GM206, GM200). Nvidia did plenty: they realized they wouldn't get 20nm and went forward with impressive results on 28nm. AMD probably wasn't as prepared, and ended up with 1 seemingly impressive GPU but was forced to rebrand the rest.
Macpoedel - Friday, June 19, 2015 - link
The rumor has been for a while that the 20nm node process at TSMC isn't suitable for high-power chips like desktop GPUs (http://www.extremetech.com/computing/199101-amd-nv... That's why Apple can make a 20nm mobile SoC, but AMD and Nvidia can't build a powerful GPU with it. I don't know this for a fact, but if true, building a desktop GPU on that node could prove disastrous. So it's not about "buying in". And if I might point you to an already famous 20nm product: the Snapdragon 810, which is prone to overheating. Both AMD and Nvidia will be skipping 20nm and going for 16nm FinFET immediately (the fact that TSMC's 20nm isn't FinFET probably has something to do with the overheating).
chizow - Friday, June 19, 2015 - link
It's true and has been known for some time; the next GPUs from Nvidia/AMD will be 14/16nm FinFET (3D transistors).
JDG1980 - Saturday, June 20, 2015 - link
Yes, during Financial Analyst Day, Lisa Su specifically said that 20nm was not viable and that the upcoming AMD products in 2016 would be on FinFET.
NvidiaWins - Saturday, June 20, 2015 - link
No real performance increase in the 300 series cards......... good job AMD, another star-studded failure of a rebrand.....
ES_Revenge - Sunday, June 21, 2015 - link
The rebranding was known/expected, but what happened to the 3xx series being OEM-only? Now they've gone full retail with the rebranding nonsense and only added the Fury as something new (which isn't even out yet) and more VRAM to Hawaii (which probably won't improve much except the price)? WTF?
Sad part is there's no 380X. 'Nice' that they removed the venerable Tahiti (given it is "old" and missing features like TrueAudio, HEVC, etc.), but why doesn't Tonga get an XT variant? The R9 285 is Tonga Pro; the R9 380 is Tonga Pro. No XT or "full" Tonga? Why NOT?! I mean, if you're going to take rebranding to this degree (and you did nearly the same from HD 7xxx/8xxx to the R9 cards), you'd expect they'd put a *little* effort into doing a few things differently... at the very least.
But nope, we get Fury (Fiji) and some more VRAM on Hawaii, and that's all she wrote. Fiji, impressive as it may end up being, is going to be beyond what most people can either afford or justify spending. 8GB of RAM on Hawaii does nothing except maybe benefit a few games at 4K+ resolutions, while "justifying" keeping the prices high on Hawaii-based cards, no doubt.
Then they take out Tahiti, a GPU that was still more than competent despite its age, and replace it with something in between its Pro and XT variants: Tonga Pro. They don't offer something to fill the gap left by removing Tahiti XT; they just make people jump to Hawaii Pro or buy into what I'm sure will still be an overpriced Tonga card. Yay, they've raised the clock a bit, so what? It's not like that couldn't be achieved with a mild OC anyway. If Tonga is anything like Tahiti, 1GHz is an easy/mild/no-sweat OC, so going from 918 to 970 on it is trivial. What is needed is a Tonga GPU with over 2000 shaders and not a penny more than $199.
This is a mess, that's all I can say. I'm glad for Fury to be coming out sometime soon but they could have at least done the rebranding part a little more elegantly and offered a few more things than new names and similar pricing. I mean there's rebranding and then there's just repackaging and this seems more like the latter.
At the very least I thought there would be a Tonga Pro card at $150-180 and a Tonga XT at $200-ish. But nope, forget that, just new names and Fury hype. And that's all this is anyway... hype leading up to Fury. Same cards, very close to zero tweaks/improvements, new names, and a pathway for Fury to launch and blow people's socks off... at prices of $600+. Good grief.
neonisin - Wednesday, June 24, 2015 - link
Wow, Anandtech really dropped the ball on the review of the Fury X. Bye guys.
epuigvros - Thursday, June 25, 2015 - link
I understand that AMD is maybe stuck in making progress in performance on 28nm. What I cannot understand, and what I think is a complete insult to us, the customers, is all this bullshit about "the new era of PC gaming" when your 300 series is a complete rebrand of the 200 series.
mapesdhs - Thursday, June 25, 2015 - link
So I see now that the 'new era in PC gaming' was just the same or less speed than a 980 Ti, while having to deal with the issues of fitting a watercooler into a case which more than likely already has one, sold at a price which isn't usefully less than a 980 Ti. *sigh*
And really AT, no Fury X review? Wow...
mapesdhs - Thursday, June 25, 2015 - link
ES_Revenge, I should add, sadly my socks are very much still in place, and that was after reading one of the more positive-leaning reviews of the Fury X (comments suggest other sites' reviews are less favourable).
ES_Revenge - Thursday, July 2, 2015 - link
Well, it's not actually less than a 980Ti, but it is "effectively" less in a way. For one, you're getting a CLC water cooler, so that's about $60-80 of value; for another, you're getting HBM, which is some pricey tech as well. Without those things it would (or rather should) have cost like $200 less. But then again, maybe it would then just have had Hawaii-level performance without the HBM.
Oh, and on another note, the Fury X doesn't even have HDMI 2.0, HDCP 2.2, or [full] HEVC decode, from the looks of it.
I agree, it's pretty underwhelming. Really bad handling of the rebrands, and then Fury is really nothing spectacular either. At (and often below) a 980Ti is certainly not what the AMD fanboys were expecting, and nothing, as you put it, to blow anyone's socks off.
Looks like Nvidia has squarely won this round and I think if they *really* wanted to put the squeeze on AMD, they'd only need to drop prices of the 970, 980, and 980 Ti a little bit. I mean Fury X still has the above "advantages" where one might still consider purchasing one over a 980 Ti. But what if the 980 Ti sold for $50-100 *less*? AMD would be in some hot water then!
Despite all this though, my disappointment still comes from the fact that we're not going to see any advances in the mid-range with either price or performance, until at least 2016 (by the looks of it). R9 280X, R9 380, and GTX 960 will remain the only choices...at high prices. Only one of those can do HDMI 2.0 and HEVC though, and again it ain't AMD.
Due to the midrange blahs, I don't see myself getting rid of my R9 280 any time soon, but if I were, the replacement certainly would *not* be an AMD card, that's for sure. And this is probably the first time I've had to say this since... well, actually this is the first time I've had to say that. Pretty bad picture for AMD right now, IMO.
Nfarce - Thursday, June 25, 2015 - link
We are going into day two now and still no Fury X review, when every single one of the other major tech sites has had their review up, including AT's sister site, Tom's Hardware. As a member here since circa 2000, it's a real shame that my time spent here continues to diminish and the gaps get filled by competitors. I knew it wouldn't be the same after Anand sold and split.
CiccioB - Friday, June 26, 2015 - link
Where is the Fury X review? Are you still completing it (in which case the delay is more than acceptable, given your usual review quality), or are you not going to publish it at all? Did you have some problems? Is the card too fast, and Nvidia asked you to change tests and settings? Or is it too slow, and AMD is threatening not to give you any future samples to review?
I'm just curious :D
@DoUL - Saturday, June 27, 2015 - link
I think a simple apology in a pipeline mini-article, or a podcast with a brief description of what's causing the delay, would have made you look so much better. There are those among us who are not following closely and don't know that the guy responsible for GPU reviews, Ryan Smith, is/was sick; to those guys, you look so far behind!
The problem with this particular launch is that it's sooooo hyped. If it were a mid-range card launch, or an OEM-card launch, or one of those silent refresh launches, then no one would have really cared. But with a card this anticipated and with so many what-if questions, you really can't be that silent about the cause of your delay!
I, and I think the majority who hang out here, hold too many good memories of this place. I've gained almost all of my technical knowledge from this place; I hold it dear to me. But there have been some nasty mistakes in the past couple of weeks that made me really worried about the place: first, the mis-titling of Fury's webcast as a "paper launch"; second, falling behind and not reviewing any of the 3xx Radeons that were already available at launch; third, this stretching delay of the Fury X review, without any apology.
Mistakes have to be made so that we can learn from them. I still have faith in this place, and I hope it won't turn into another once-prime site that mismanagement has turned into a dud.
PS. You already have most of the tasks lifted off your shoulders; making so many mistakes in a short time while your primary task is the editorial stuff makes me inclined to whisper in your ear to take some time off and re-examine how Anand managed all these things at once. I still believe in you!
neonisin - Tuesday, June 30, 2015 - link
Like.
oppositelock84 - Friday, July 3, 2015 - link
Well said, sir.
J.W.M. - Saturday, June 27, 2015 - link
Unless you live in International Falls, Minnesota, you don't need a 275W furnace to unnecessarily heat up the house/apt!
scanex - Friday, October 30, 2015 - link
I want an R7 370 with FreeSync or I will buy an Nvidia card!
JMarlowe - Thursday, February 18, 2016 - link
So I can return my graphics card on grounds of false advertising? I purchased an R7 370 Nitro (Sapphire), and it states "latest version of GCN" when, in fact, it's actually the 2012 version, GCN 1.0? Screw that. I'm going to return it for a 380. They can't be doing this to people. I just bought a 7850 is all. What was also misleading was that I assumed it was the direct upgrade of the 270X. How the heck did the 370 become a downgrade? Actually, I'm just going to take it back, go Nvidia, and get a 970. They don't do this crap to people.
JMarlowe - Thursday, February 18, 2016 - link
...sorry, I meant I thought it was an upgrade to the 270, not 270x.