I kind of lost interest in Nvidia, apart from their new multiprojection technique, when they made the efficiency graphs look insane but released a graphics card with chopped-off cores and ROPs, evened up the difference by increasing the MHz, and price gouged on a reference cooler design that should cost at most $50 more but instead costs $100 more.
I have to stay with Nvidia because of G-Sync and multiprojection. I think AMD is targeting the mid-range, not the super high end. Most of the market in the world is low/mid-range. Look at the console gamers. A "PC master race" build costs something like 500% more than a console, plus most people don't know how to build and run one properly. Some settings remain unchanged; all they use is presets.
His situation is exactly why I haven't bought NVidia in a while; stumbling into pitfalls like that isn't something I like to do. Having to decide between forfeiting some of the value of a monitor I purchased and going with a competitor's card, even if it's the more sensible choice, is not a position I'm willing to put myself in. There are probably many more in his shoes.
nVidia has been working hard to create vendor lock-in with all of their proprietary feature-adds, just so they'd have the upper hand when AMD or Intel became competitive.
G-Sync is a poor reason. Just lock your frame rates. Fluctuating frame rates are just as bad as sync issues. Smooth gameplay comes from a steady frame rate. And since the human eye can't detect more than 30 FPS, I'm still at odds as to why people bother. More is not always better, or useful at all.
Because tearing can still happen below the refresh rate too. If it were as simple as locking my games to 60fps without vsync, I wouldn't even consider buying a G-Sync/FreeSync monitor. And tearing bugs the hell out of me. What's the point of getting a card able to play the latest high-end graphics when those wonderful graphics are tearing?
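To make the mechanism concrete, here is a minimal Python sketch; all the numbers are assumptions for illustration (a 60Hz display, a steady 45fps game, a rough vblank window). Without vsync the buffer swaps the instant a frame finishes, and any swap landing outside the vertical blank splits the displayed image, even though 45fps is below the refresh rate:

```python
# Minimal tearing sketch: vsync off, fps below refresh rate.
# Assumed numbers: 60 Hz display, constant 45 fps frame times, ~0.45 ms vblank.
refresh_hz = 60.0
scanout_ms = 1000.0 / refresh_hz   # one full screen redraw takes ~16.7 ms
vblank_ms = 0.45                   # rough vertical-blank window per refresh

frame_ms = 1000.0 / 45.0           # a steady 45 fps game
t, tears, frames = 0.0, 0, 300
for _ in range(frames):
    t += frame_ms                  # frame finishes; buffer swaps immediately
    phase = t % scanout_ms         # where in the scanout the swap lands
    if phase > vblank_ms:          # swap outside vblank -> visible tear
        tears += 1
print(f"{tears}/{frames} frames tore at 45 fps on a 60 Hz display")
```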
But human eyes can see 50-60fps if you have normal vision. Some people with extremely sensitive eyes can get closer to 70. 30 is laggy as hell in action sequences.
Nonsense. Human eyes can recognize the difference in fluidity all the way up to 175 FPS. Just pull out your iPhone, put it in video mode, select 120 or 240 FPS capture, and notice how much more fluid the movements are.
Humans don't see in "FPS", but we tend to normalize to FPS, which has some issues since we perceive different aspects of motion differently.
Human vision needs a minimum of around 24 FPS to see "motion"; below that we start to see a slide show of rapidly succeeding images. Humans process images at a rate of about 30 updates per second. It can take up to 100ms to integrate an image into consciousness. While 10ms is considered instant for delay purposes, the brain can detect timing anomalies close to 1ms. We can visually recognize unexpected differences in images at rates as high as 300fps.
The best way to describe human vision in a nutshell is we process about 30fps, but we continuously integrate and can detect visual anomalies at a rate of around 300fps.
I myself was reliably able to tell the difference between 70fps and 85fps in my Counter-Strike heyday, and I could eventually notice, after several seconds, if someone was using a 120Hz screen because of the higher real FPS. Fast-paced motions were noticeably smoother and easier to predict, instead of the strobe-effect slide show of 85Hz. In both situations the rendering FPS was about 150-200, but the refresh was either 85Hz or 120Hz, and it was noticeable.
I suggest reading some scientific facts about this human-eye FPS issue. For one, I recommend Michael Duggan's "The Official Guide to 3D Game Studio"; the most interesting part is here: https://books.google.cz/books?id=weMLAAAAQBAJ&...
I am by no means a competitive gamer, but I can perceive the drop from 120 to 100 fps on my 144Hz monitor in games as simple as Heroes of the Storm. Even my wife notices a reduction in smoothness of gameplay in Borderlands when dropping from 72 to sub-60fps.
You're stupid... your eyes can easily see past 30 frames. I can sure tell with any weak game on a console in comparison. I also have issues with some games dipping from 60 down to 30 when I play, and I notice the slowdown, so yeah, it is a big deal having plenty of frames when the action picks up in a game, because they drop. So you really don't know what you're talking about. Your eyes don't work like that, Mr Doc Narg. By the way, it's easy to see the difference in 4K video, but guess what, the average is 72 frames during video. Just stick to your console where you belong.
How do you lock your frame rates if, let's say, your video card cannot sustain the frame rate you are supposedly locking to? That's the whole point of G-Sync: to give you smooth gameplay not only at high fps but also at LOW fps. So unless you plan on running a very high-end video setup and thus locking to that 60, 75, or 100 refresh rate in absolutely all games based on your display, G-Sync will sound like an attractive option. Besides, vsync introduces significant lag, which may matter to you if you play twitch shooters, so oftentimes even with a high-end setup you will have to choose between tearing and increased lag. TBH I think nowadays if you purchase a nice gaming monitor that comes with a high refresh rate, it will most likely have either FreeSync or G-Sync.
It's the variance (standard deviation) of the frame time that is the problem. A consistent 24fps with zero deviation, as in a movie, is not actually a problem for the eyes; our brain is very good at interpolation. Except there will be more input latency in a computer game at low fps. That's why going to the cinema is fine.
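A small Python sketch of that point, with made-up frame times: two runs with the same average fps but very different frame-time deviation, which is the thing you actually feel:

```python
# Same average fps, different frame-time variance (numbers are illustrative).
import statistics

steady = [16.7] * 60             # locked ~60 fps
jittery = [10.0, 23.4] * 30      # alternating spikes, same mean frame time

for name, times in (("steady", steady), ("jittery", jittery)):
    avg_fps = 1000 / statistics.mean(times)
    dev = statistics.pstdev(times)   # standard deviation of frame time (ms)
    print(f"{name}: {avg_fps:.0f} fps average, stddev {dev:.1f} ms")
```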
The other reason going to the cinema is fine is because movie directors plan their shots around 24 fps. Certain types of shots, such as horizontal pans, look like crap at 24 fps. But directors know that, and avoid it.
Sports, on the other hand, is generally shot and broadcast at 60 fps, because it wouldn't look good at 24 fps. The shots aren't planned ahead of time, or edited before being seen.
Citation please. You are full of BS; the 970 was not the best-selling individual discrete GPU of the last generation. Where in the world did you pull that nugget from? (I have an idea, and it's a dark, smelly place.)
Enjoy it while it lasts, because rumour says Kryo will be the last custom core from Qualcomm. If ARM were to release a new arch to replace v8-A, or the newer Cortex cores were much better than Kryo, then we're back to many-core madness.
People bitch if they take their time to give thorough info on products; then there's the opposite end of the spectrum, bitching like yours, where it's a problem if they talk about stuff early on like everyone else.
It should have been called the "RX 460", leaving room for faster products; 460 is a bigger number than 380 anyway. AMD on 14nm up against Nvidia's 16nm should have given AMD an advantage. The 14nm process being used is obviously not fully optimised and not as mature as the 16nm used elsewhere. AMD needs to increase the clocks by 30% to 60% and release the enthusiast model with at least 80% more GPU resources (geometry, texture, shader, ROP, memory) and a 30% higher clock rate: more than double the previous performance in the same form factor. The old pattern of improvement applied to the 28nm-to-14nm shift would give an ideal of 4x improvement (wishful thinking of the good old days).
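For what it's worth, here's the arithmetic behind that "more than double" figure, assuming (optimistically) perfect scaling with both unit count and clock, which real GPUs never quite achieve:

```python
# Ideal-scaling arithmetic for the hypothetical enthusiast part above.
resources = 1.80   # "at least 80% more GPU resources"
clock = 1.30       # "30% higher clock rate"
print(f"ideal speedup: {resources * clock:.2f}x")  # ~2.34x, i.e. more than double
```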
I know they have more to come and this isn't the fastest model they will release in the family. Whatever they are planning for the next 6 to 12 months needs to be brought forward as quickly as possible. Having a better lead could be worth an extra $200M revenue a quarter (and 40% of that as profit). If you think leading the market costs a lot, being the follower (delaying and limiting R&D) leads to less revenue and considerably greater losses.
What are you talking about? Just from the raw numbers this seems at least 40% faster than the R9 380, and both cost $200. If the architecture improvements are even semi-decent we could be seeing close to 50% faster speeds than the R9 380 at lower wattage and the same price point.
Considering Nvidia only has the 1070, which goes for at least $400, the RX 480 coming close to R9 390X performance for half the price is very good.
I mean, let's take Far Cry Primal, a new game running on an engine equally optimized for both Nvidia and AMD, with historically very consistent results for both companies: the 390X comes in at about 50fps at 1440p, the 1070 at about 65fps.
So basically for HALF the price of the 1070 you are getting about 77% (50/65) of its performance.
Does that really make sense? The difference is 120W worst case. But how long is the computer going to be on? And of that time, how much will be spent gaming? And if the card is running at 60% capability, will it really draw 150W? Step by step, the difference actually means less and less.
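To put a rough number on it, a back-of-the-envelope sketch in Python; the usage hours and electricity price are assumptions, so plug in your own:

```python
# Rough yearly cost of a 120 W draw difference (assumed usage and price).
watts_extra = 120                 # worst-case difference from the post above
hours_per_day = 2                 # assumed daily gaming hours
price_per_kwh = 0.12              # assumed price in USD per kWh

kwh_year = watts_extra / 1000 * hours_per_day * 365
print(f"~{kwh_year:.0f} kWh/year, roughly ${kwh_year * price_per_kwh:.0f}/year")
```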
So much fucking this. My power bill is ~120EUR/month; when I look at the breakdown of that, the actual power is 45EUR, the rest is just fucking taxes. It's just so goddamn wrong.
Most power "generation" added in the future will come in the form of energy conservation. It has to be incentivized in some way, otherwise people will keep wasting power.
150W is the limit on that reference board, I'd wager the whole card runs at like 120-130W at worst under load. The GTX 1080 is pushing past 180W quite often.
AMD has been very open about the roll-out of the 14nm parts and which market segments they would address and when they would address them - that they would be hitting the meaty mid-range part of the market first (which is not a bad move, IMO) has been known for a while.
Comments like this are infuriating to me. "Good luck finding money there"
So. Fking. What. If AMD puts out a 480 (or 490) at a loss how do I, the consumer, lose? Are we all financial investors playing the market?
If some unknown 3rd party put out a card at $25 with open source drivers that delivered 5 tflops and bare minimum gaming support/features, I'd buy. When that company blows through $10B of investor money killing countless hedge funds through an unsustainable market presence; will I cry about their poor ROI?
I'm here as a consumer. Give me it all for free if you want. Lots of high margin items out there for you to buy and help those corporate bottom lines. Better hurry before there's a price drop!
You don't have to be an investor to know AMD needs to regain some market share. Nvidia needs a competitor. Same with the AMD vs Intel battle. If AMD goes down, you'll be paying more for CPUs and GPUs.
Wrong. Intel and Nvidia (or any company that sells non-perishable/non-consumable goods) can only raise the price to a point where it is still viable for someone who already has an older model. Otherwise, they wouldn't make any new sales.
If AMD were dead now, do you think the GTX 1080 would have had an MSRP of $2,000? How would they convince someone who already has a 980Ti, or even a 780Ti for that matter, to part with two grand?
When AMD goes down, nothing would change in the world, except perhaps eventually normal people would not have to put up with AMD's fanbase anymore.
A consumer GPU is an entertainment product, or rather a mere component of one, which not only has to compete with other brands in the same line of products, but also with the value provided by consoles/handheld/mobile, and even other forms of entertainment, like TV and movies.
@D. Lister You don't really get how competition works, do you? Monopoly = bad, simple as that. In fact Intel doesn't want AMD to fail, simply to be noncompetitive. If AMD dies then Intel faces being broken into smaller companies to remove its monopoly. As long as AMD exists, Intel can claim it's a free market and there is no monopoly, even when their high-end product prices reveal otherwise.
I am not attacking you, just pointing out the truth; this is not Intel bashing or nVidia bashing or AMD bashing. It is simply the fact that where markets are competitive, prices fall and innovation increases, while where there is little or no competition the opposite is true.
I currently have an Intel Core i7 4790K and a Radeon R9 290X. My last CPU was a Core i7 920 and my last GPU was an nVidia GTX 580.
I don't care which company makes my hardware, I simply care to get the best product for the best price that is suitable for purpose.
I will consider buying AMD Zen if it has benefits over Intel, even if it's slower than the competing Intel CPU, as long as the price reflects this and the CPU is up to the tasks I require it for. The same goes for my next GPU: nVidia or AMD doesn't matter; price and performance relative to my needs are what's important, not brand loyalty.
"You don't really get how competition works do you? Monopoly = Bad simple as that. "
And you obviously don't understand the word "monopoly" beyond it being some sort of scary monetary bogeyman. The state has a monopoly on law enforcement and the military, monopolies that are necessary and practical. Then there are natural monopolies that are actually beneficial. The point is not a direct comparison, but to make you realize the world isn't quite as black and white as you may like to think.
Secondly, in the grander scheme of things, a duopoly isn't really an immense improvement over a monopoly anyway.
"If AMD dies then intel faces being broken into smaller companies to remove their monopoly."
For heaven's sake, there is a big difference between having monopoly status and actually being charged with abuse of monopoly. Intel could only be broken up at the demise of AMD if it were proven in court that Intel was somehow directly responsible for AMD's bankruptcy.
Nvidia's current market share and stock position suggest you may have to wait a bit for those good times. Don't lose hope though, it could very well happen in your lifetime.
There is a huge difference between an "nvidiot" (aka someone who defends the company even as it fists them hard on the butt) and "educated gamers". Educated gamers have no brand loyalty and switch to whatever brand gives better performance for the money with fewer problems. The ball is in Nvidia's court right now.
Especially on the higher-end market. I have a 144Hz 1440p IPS G-sync monitor (best thing I ever did to improve my gaming experience) and the only card I can see getting close to pushing my monitor on demanding games is the 1080, or even the 1080ti if I am patient enough
Since competition has no bearing on price, I'd like to offer you my services at no charge. Simply agree to route all your online purchases through me and I guarantee that I get you the absolute best price while dealing with all the order entry, tracking, etc. Win-Win.
If there was no AMD we wouldn't be talking about the 1080 or 1070 as they wouldn't likely exist. Market stagnation and rising prices WOULD BE and ARE a result of a monopoly.
Just look at the prices of Intel's high-end offerings: they have been steadily rising, and the benefits between generations are incremental at best. They are milking the market with high prices and less innovation, and why not? AMD doesn't currently have anything on the market that competes beyond the Core i5. This is not a knock on AMD, simply a fact of the market.
With Zen and Vega AMD look set to return to the high end in both CPU and GPU. This is a good thing for all. No one is saying you have to buy either, but Intel and nVidia will be forced to up their game by dropping prices and increasing innovation. It's a win-win for all.
"If there was no AMD we wouldn't be talking about the 1080 or 1070 as they wouldn't likely exist."
Nvidia (or Intel, for that matter) doesn't make their products just to be in some sort of a technological pissing contest with AMD. Companies cater to the needs of the market. If there is demand for more power, then there is money in making more powerful hardware. Sure, without a race, the technology may grow slower, but that means software developers would have more time to optimize for every arch (a la consoles), which means that at the end of the day, we as consumers, would see the same growth in graphic quality, but only with lesser bugs.
"Just look at the prices of intel's high end offerings, they have been steadily rising and the benefits seen between generations are incremental at best."
You are SO right, it's not like Intel with its dominant monopolistic position on the high end just increased the price of their top CPU by 80% or anything. Oh wait.....
The GTX 1070 is $379.00 and offers Titan X/980Ti performance. So in this case please tell us all which AMD product forced Nvidia to offer a less expensive, more powerful, more efficient product this round?
Well, last round AMD forced Nvidia to release gtx980ti. You were not gonna get gtx980ti which performs close to titan without Fury X coming out. I reckon it'll repeat next year with 1080ti........ They had no competition this round so they increased the prices of gtx1080 and gtx1070 by 'a small amount'. Monopoly doesn't come in a day, nor does it mean that prices rise up suddenly like inflation. Nvidia can offer a lot more with competition. Now they will be forced to unveil a good gtx1060 which might not have come had there been no rx480 in my opinion. Talk about Intel....just check the prices of i7-6950x.......
Competition in the mass-market segment is very important. Look at the difference between the gtx950, gtx960 and gtx970. It made everyone recommend the latter over the former two cards for the mid-level segment. I remember a few years ago the most common recommendations were the gtx560ti and gtx660ti for the same. This is how you play market games.
All of the GTX 900 series? Once new tech is out, almost nobody wants the obsolete stuff. There are tons of those chips on the market. Other than that, I don't expect NVIDIA to sell the 1080 and 1070 at volume; it's early yields, meaning they will still have to sell Maxwell until the switch to the new node is fully complete. In the meantime the market will start to be flooded with small-die-size AMD cards that will certainly come in higher volume.
That is the MSRP on the card, but since it will only be a paper launch, and Nvidia is the one selling the first "Founders" edition cards at over $450 (in fact market price is close to $480), I wouldn't be surprised if market prices for the 1070 sit at $420-430 for 3-4 months before dropping to $400, and I'd expect some competition from AMD to get the price down to something more reasonable and sensible such as $330-$350.
In the UK the price of the cheapest 1080 is £525 and that has a really horrid cheap plastic shroud over the heatsink and the blurb makes no mention of a Vapor Chamber cooling system which comes on the 'Founders Edition' for £619. The decent 3rd party cooler 1080 cards range in price from £580 to £690 and the hybrid and water cooled versions go up past £700.
This is an expensive card, a really good card, but an expensive one. When AMD releases their competing product these prices will fall.
Again this is not against AMD or nVidia, just stating the facts.
If tomorrow AMD launched the RXXX Wizzbang Wallop x2 that was $500 and was the same performance as the GTX 1080 then nVidia would be forced to....oh that's right, COMPETE!
You would soon see the price of the 1080 drop and maybe even see the introduction of the 1080 Ti to fill the highest-end void and appease the fanboys who need their brand to be the best; even if the top-end cards are way out of their league, they can aspire to own them. This creates brand loyalty at an almost cultist level, where the hardcore fans will argue against fact, logic and truth to defend their brand.
This is not a bash on nVidia; I'm quite sure that if AMD were in nVidia's position they would act the same way. It is how business works, like it or not.
LOL! Let me guess: you live in the USA or Canada, have never been outside your state, and you're going to tell us how people are in the rest of the world?
Have you been to Russia, India, Slovenia, Morocco, Brazil, etc.? There is a major market out there for $200-and-cheaper graphics cards. If AMD can fill the price points from $100 to $200 with amazing graphics solutions that make sense and offer great value, you'll see them selling tens of millions of graphics cards and making big profits!
Yeah, I doubt Nvidia is expecting to pay all the bills from sales of the 1080. Mercedes doesn't make its profits from the top S-Class with all the latest tech. The money comes from being able to push all that expensive, tested R&D out to the masses.
@pashhtk27 Exactly, I think people here seem to overestimate what budget gaming is.
£25-£100 = Budget gaming
£101-£250 = Mainstream
£250-£400 = High End
£400+ = Enthusiast
This is roughly how I define the levels. I consider myself High End that aspires to the Enthusiast level :-) This means that I always buy the best card for about £300-£400 but would be willing to spend more if the performance levels justified it. When I bought my R9 290X 4GB the only other card that I could consider was the 780 ti 3GB as that was the best nVidia had at that time but that was £100+ more expensive and had 1GB less framebuffer so I went with the R9 290X 4GB.
At $200 the RX 480 should perform similarly to the R9 290X and at that price I wouldn't hesitate to recommend it to a mainstream 1080P+ gamer. This may change when nVidia inevitably responds with their 1060 but that's a good thing as there's more competition and more choice.
"I predicted that AMD would abandon the high end and just be a budget brand. " Are you claiming victory? Because someone should tell AMD to cancel Vega SOON! Its seems to me AMD is about ready to make Nvidia slash the prices of their line of cards from top to bottom.
The standard is to measure memory by bandwidth per pin. Measuring by Hz stopped being sensible when GDDR5 came out, and GDDR5X makes this worse (there are several different frequencies you could measure).
Actually this site stopped using actual Hz ratings in the '90s; DDR made it problematic, and it has simply been compounded since.
Throwing out 256GB/sec for the actual bandwidth, compared to say the 320GB/sec the old 290 had or the 384GB/sec the 390 was offering, would likely be a bit more friendly for people looking for quick information and not an AMD sales brochure :)
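Those figures fall straight out of bus width and per-pin data rate; a quick sketch, with the card specs taken from the usual published numbers:

```python
# Bandwidth (GB/s) = bus width in bits / 8 bits-per-byte * effective Gbps per pin.
def bandwidth_gb_per_s(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gb_per_s(256, 8.0))  # RX 480: 256-bit GDDR5 @ 8 Gbps -> 256 GB/s
print(bandwidth_gb_per_s(512, 5.0))  # R9 290: 512-bit @ 5 Gbps -> 320 GB/s
print(bandwidth_gb_per_s(512, 6.0))  # R9 390: 512-bit @ 6 Gbps -> 384 GB/s
```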
If anything, AMD is being wasteful again with a 256-bit bus here. You can bet that Nvidia will go with less and save some area and power. Of course AMD will do better at higher res, and some might even buy this for 4K; it would struggle in 4K, but with 4K monitors even below $300, it's an option.
I can appreciate the price point, but it is really disappointing to see that AMD couldn't reach higher frequencies with such a huge jump in lithography and the switch to FinFET. This part would be positioned much higher, with the smaller Polaris (which they couldn't even mention at this point) actually introducing the VR-ready lineup. Instead we get pretty much the old 390 at a TDP that is merely acceptable. Wow.
If the performance claims are accurate and we get 390-390X performance with one 6-pin connector from a ~$200 card, that's a pretty decent offering from AMD IMHO; who really cares about clock speeds? I'd actually be more disappointed if AMD had to push the clocks really hard to reach their performance target. Maybe this way we'll get aftermarket cards with an 8-pin connector and a heap of overclocking headroom. Maybe not, of course. Nvidia did say they had to put a lot of work into their designs to hit the 1700+ MHz they have on the 1080.
But whichever way you shake it, this is good news for the gaming masses who are running 1080p @ 60Hz monitors. If (and that's still an "if" at this stage) performance, price and availability are as promised, you'd have a hard time recommending anything other than this card for a decent 1080p@60 gaming rig until Nvidia releases their mid-range or responds with significant price cuts.
Yup. To me it's either this RX 480 or the eventual 1060. I've bought in the x70 range for the past few generations, but since my monitor is still 1080p60, x70 level cards are not well "balanced" for this anymore. Example: I'm on a 770 that, while speedy enough, gets starved in modern games due to its 2GB VRAM (AC Unity is a great example). At this point in time, buying x60 level cards and renewing every couple years seems like the more sensible option.
35%? Based on what's been said thus far, you should be expecting a card that's starting to approach double the performance of your 7870. Even a 290 (with its awful stock cooler) approaches those kinds of gains: http://www.anandtech.com/bench/product/1034?vs=106...
Let's wait and see, of course, but I'd expect at least 80% or more from what we've seen.
What? Assuming conservative performance numbers based on all the data we have, it will perform like an R9 390, which is over 50% faster than your GPU. For only $200 and 150W, that is amazing.
Aside from the usual "don't knock it till we see the benchmarks" routine..
This thing is $130 cheaper than the 390, and even if the clocks aren't as high the architecture is supposed to be more efficient so I wouldn't count this out just yet. I was already considering getting the GTX 1070 but if this thing is comparable I might just get the 480.
It won't be comparable to the 1070. I say that not as a fanboi but from economics: they've announced the price target as $200, and they simply would not sell a card that trades blows with a 1070 for half the cost; that would literally be throwing away free money.
I have 15 years of economics under my belt, so I'd say "nonsense" Or if you're a Brit, "Pish and likewise Tosh" :)
From what I've read and seen, mostly deeply technical and geeky, not economic: what this is, is a land grab. In 4x terms this is a Rush. At a stroke, AMD will own the laptop segment with Polaris 11; the console segment, since Nvidia wasn't interested; Apple; and the consumer market. They also own the graphics subsystem with Mantle/Vulkan, so they get Android and iOS, etc. Since they have the consoles, and games port to the PC from consoles, it makes more sense to code to Vulkan.
At a stroke AMD now owns PC gaming into the future. MSFT continues to screw up, since Direct X12 doesn't support dual cards, which Vulkan does out of the box. I have Vulkan drivers on windows 7 right now.
I own an R9 390; before that I had a GTX 660. I bought the 660 as it was a 200+ Euro card at the time, the sweet spot of a gaming PC. I bought the R9 390 as the game I mostly play is Skyrim, and there isn't a card available that will excel at fully modded Skyrim, so I went for brute power and lots of VRAM at the expense of power consumption (who cares about performance per watt?). It cost me something like 360 Euro and plays everything else at max: modded Witcher 3, DA:I, that sort of thing. It plays Skyrim OK, gave me about 50% more fps than the 660 with 600+ mods, and lets me use 5GB of VRAM even with an ENB, which takes main memory into VRAM to stop the game crashing. This is on Windows 7; MSFT screwed the pooch for DX9 in Windows 10, which limits all cards to 4GB, even the GTX 980 Ti. But I digress...
What I really want is Skyrim in VR; for that I need 4K to get stereoscopic 1080p. Given that Oculus is actually running two screens, I figure the software must drive this. The problem is that there are no 4K screens, so I can probably use two 2K screens with a high PPI and drive them from separate cards. I say "probably" as until I get my hands on one or two, or I see in-depth reviews, I won't know. However, given that I have two monitors at work as a single view pane (under Linux), I figure it can't be that hard.
What this does is make Dual card setups possible for the mainstream, and AMD are making a push for VR.
tl;dr: Your economics argument is specious. AMD aren't going for the performance crown, as nobody who bought a 970 will buy AMD anyway; they'll buy the 1070 and lust after a 1080. What AMD are going for is market dominance, and market share. They will sell millions of these to OEMs looking to build mass consumer products, and to the console manufacturers & Apple. They sold it at $200 not to make excessive profit, but to own the market. With Intel retiring to the performance arena too (with the loss of 12K jobs recently), AMD may well end up owning the CPU market with Zen too.
This is not about performance, it's about price. This is not winner takes all, this is commodity pricing, you have the wrong economic model. :)
Do remember that at the high end (1070) the value is lower than in the midrange. People got excited about the 1070's value because they are shortsighted; it might be good value for the high end, but the best value is never there. AMD here is likely offering very solid value, making it hard for Nvidia to beat them by a large margin. If Nvidia decides to beat them, they can by a little, but they can't make AMD look bad, and AMD can slightly adjust prices if needed. Assuming this card is a bit faster than the 390, the pricing is just right. We'll see what the final power numbers are, and the die size, and only then can we figure out where AMD really is.
I think you're missing the point. In economic terms, people who buy the high-end cards are price-inelastic: they don't care how much it costs.
Back in the day I bought an ATI 9700Pro, it was, for about 4-6 months, the fastest graphic card in the world. I paid a lot of money for it, I would have paid more, because I wanted the best. I wanted to own the fastest graphic card in the world. It broke my heart when it died, a heat pad shrivelled and detached and the GPU fried itself, even when I installed an aftermarket cooler it stayed dead. :(
Nobody but geeks and gamers actually upgrades a PC; there are far more gamers who play on a console, simply because it's easier and cheaper than having to maintain a PC. Simply plug and play. This is why they write for the console and back-port to the PC these days.
Similarly, very few people actually build a PC from scratch, which is why Valve's survey is so useful: it shows what the many people who use Steam are gaming on, and hence what developers should aim for to gain the largest audience.
By our very nature, talking about hardware specs on a technology website, we are not "normal"; indeed, the normals are by definition the mainstream, people who buy at Best Buy or other big-box retail stores.
This is why I say that people who consciously go out and buy a new GPU as an upgrade are the minority, and likely already have a preference/prejudice as to what they will buy. The new 480 is aimed squarely at the sweet spot of price and performance for people who are limited in what they can spend (the price-sensitive), either having somebody else tell them about it, or having somebody else build it. This is "normal" for PC hardware these days.
The stuff you'll find in big box retail is always exploitative. They usually give you a good CPU, typically over specified for gaming which is GPU limited, and give you a passable GPU, that will likely need to be replaced within a year. On a recent trawl through a refitted consumer electronics big box I found that most of the PCs were offering an i7 6700 with a 970 for around 1000 Euro. That is a serious chunk of change for a price-sensitive buyer to drop on a PC, when you can buy a console for $400 or less.
PC Master Race on Reddit has builds from $331 to $901; all but one of them include an AMD GPU because of price. The two cheaper boxes use AMD Athlons; the two more expensive builds use an i5.
There is an article on the Verge today all about Intel realising that increasingly they are not going to be 'Inside' on what remains of the new PC market.
So I say again, this is a land grab, colonising the parts of the market that the previous incumbents have left behind, and doing so at a price that appeals directly to the price-sensitive consumer, and to OEMs that build for normals.
Whether people here like AMD or not, doesn't matter, this is not aimed at us.
What you seem to forget is that the 9700 PRO came out after the Radeon 8000 series. What happened there, precisely? Do you remember? They were offering GeForce 3 performance at close to half the price. The next two generations saw cards from both companies coming out at around that benchmark price (slightly higher, but not much).
They did the same thing with the 3800 series against those magical 8800s, and didn't quite match the performance but came close; by the time the 4000s came out they'd surpassed NVidia again and kept prices at that sweet spot.
Nvidia started raising prices again with the 500 series, even though AMD had comparable products (although late) at cheaper prices. Now we see those bumps again, just in time for AMD to bring them back down to reasonable levels.
Of course they're not doing it for us (forgot to add that, lol); it's for brand recognition and market share. AMD knows where it plays and what it can do, but it's been Nvidia pushing the pricing envelope all the way since the GeForce 2.
Do I remember? No :) I built my first computer from a kit in 1981, and went through many "home" computers, including a DEC Alpha UDB, (I'm a UNIX admin) before I owned a PC. The first PC I ever owned was powered by a P4 Northwood, and some crappy Nvidia card, a 240 or a 440, something like that. Then I spent more on the 9700 Pro than I had on the computer :)
That said after watching some videos last night I came to the conclusion that IMO, while AMD's products are "competitive" I believe that they have chosen not to compete for the performance crown, and are instead playing their own game. Namely occupying the low-middle ground, and selling millions of SKU's to OEM's, the console makers and Apple. We get to see discrete graphics cards, but we're not their audience.
praxis22 writes: > The stuff you'll find in big box retail is always exploitative. They usually give > you a good CPU, typically over specified for gaming which is GPU limited, > and give you a passable GPU, that will likely need to be replaced within a year.
(good posts btw!)
This happens with pro sales as well. I'm helping a guy who bought a system that was meant mostly for pro tasks, with some gaming too. It had a decent enough CPU (i7 4770) but a pretty naff GPU (GTX 645 1GB), and cost quite a lot. I'm building him a replacement with a 4.8GHz 3930K, a Quadro 6000 6GB and a GTX 580 3GB for extra CUDA.
Seems to be a common approach by generic builders, they overspec the CPU and then fit a lame GPU. The PC the guy bought also had a bizarre RAM config (12GB with 3 DIMMs). At least he had the sense to fit an SSD himself (I'm so tired of seeing OEM systems supplied with 1TB rust spinners).
I was with you all the way till the Zen bit. It was like when you talk to someone and have a really good conversation then they say "and have you ever accepted the Lord Jesus into your heart?"
I run an i5 like most sensible gamers; I'm in no way an AMD fanboy. I bought my GPU for one game, so I'm de facto not normal. I just happen to think that the evident long-running AMD strategy is very clever. I'm unlikely to buy a Zen/Xen (not sure how you spell it) but it will surely sell a lot if priced well.
My wife would like me to accept Jesus, but my heart belongs to Danu :)
Sadly, with a lot of "PC enthusiasts" these days, brand loyalty is getting close to religious worship: arguing against facts, logic and truth, as they can't imagine a situation where their beloved brand could be wrong, worse, or imperfect.
The reality is that AMD, nVidia and intel are all corporations out to make profits, if the market allows them to price gouge then that is what they will do, if the market allows them to rebrand old products as new ones then that is what they will do, if the market forces them to compete then that is what they will do, if the market forces them to innovate and release new products then that is what they will do.
None of these companies holds any true loyalty to any of you; they just want you to invest in their products and advertise them through word of mouth, which incidentally is not a regulated form of advertising and so can be far more effective than having to tell the truth about your products. Your fanboys will say whatever makes them feel they have won the argument, regardless of its basis in reality.
Very true. When I built my PC 4 years ago, I was very proud not to have two components from the same brand, except the RAM and PSU from Corsair. And with a few updates, I still have no two components or peripherals from the same brand except those two... and well, a gamepad and mouse from Logitech.
I love my PC, knowing that I invested in the best from all brands. Even though it's old and weak and... cheap in today's tech. :)
I don't understand people bitching. AMD is bringing VR to the masses. What's wrong with that ? Nvidia and Intel can sit this one out if they want, who cares ? PlayStation Neo, Xbox One VR and Nintendo will probably have a powerful APU with Zen cores and Polaris GPUs. Me ? I'm waiting for PlayStation Neo + VR headset. All for about a grand. This stuff is as good as a 35k Tesla. Yummy !!!
Because each camp expects to have the "bragging rights". It's always back and forth: ATI won the crown? They blasted Nvidia. Nvidia won? The same in the opposite direction.
"What this is, is a land grab. In 4x terms this is a Rush."
There is nothing stopping nvidia from reducing 1070's price and positioning the upcoming 1060 (and 1060 Ti?) to counter 480, if they feel their market share is threatened.
"AMD will own, the laptop segment, with Polaris 11"
How would you know that? It depends on how GP106, 107 and 108 turn out.
"the Console segment, since Nvidia wasn't interested. Apple, and the consumer market. They also own the graphic subsystem with Mantle/Vulkan, So they get Android and iOS, etc. Since they have the consoles, and games port to the PC from consoles, it makes more sense to code to Vulkan."
Don't be so sure. There are two main gaming segments that do not really overlap: consoles/computers and mobile. In the first segment PS, xbox and PC rule. PS comes with its own optimized APIs and xbox and PC use DX. It would take a lot to convince developers to abandon these powerful APIs.
"At a stroke AMD now owns PC gaming into the future. MSFT continues to screw up, since Direct X12 doesn't support dual cards."
No dual cards? What do you mean? DX12 supports multi-GPU setups.
"... skyrim ... dx9 ... 4gb ..."
What does this have to do with anything? Off-topic.
"What AMD are going for is market dominance, and market share"
Again, there is nothing stopping nvidia from countering all this.
The last time I bought a laptop, 4/5 years ago, I bought a hybrid netbook with Nvidia's Optimus architecture. Works well as a third-party Chromebook, runs Linux well too, but the battery life sucks.
That's one thing I have noticed about big-box retail, at least in Germany: they no longer sell Chromebooks, even as a paid installation like the rest, as it kills their margins. They're not the highest-selling "laptop" on Amazon for nothing. That is the future of the consumer laptop, and thus the cheaper the chips, the more margin the OEM makes. They do sell them in the UK however, that and very expensive laptops; there is seemingly no middle ground anymore. That is what I mean by owning the laptop market. Owning the rump of what's left of it. Globally there are only three makers of laptops, with Clevo being the biggest; everything else is just badged.
Sure, there are plenty of indie games, which rely more on Steam for their target platform. With Vulkan I was talking about the big AAA titles, which for at least the past 5 years have been console-first; they then port the console build to the PC. Skyrim, Witcher 3: go look at the howls of the faithful about the graphics downgrade of Witcher 3, because they chose to prioritise the console. The new consoles (except Nintendo's) will be running AMD, purely because Nvidia wasn't interested and Intel is incapable. Nintendo has gone for a Tegra chip by Nvidia, so it can port to Android, again using Vulkan. There were talks about optimising for Vulkan at the recent Google IO; I watched them.
There is a shift underway; DX12 is built on Vulkan too, go check. This is a shift back to a layer much closer to the metal, for better performance and memory management, essential for VR. This is a step change. It will take years, but it's already underway. Keep your hands inside the moving car, and watch out for changes :)
Have you used dual cards with DX12? I haven't, but I have read and watched a fair bit about people who have, and have thus taken to the interwebs to bitch about it. MSFT says they'll fix it. I haven't seen any comment about patching DX9 to use more VRAM. Though I have also seen and read a lot about that too. It's why I blocked the Windows 10 upgrade. There aren't that many DX12 games as yet, so it will take years to fully phase out the older platforms.
What does it have to do with anything? It has everything to do with graphics cards, and the drivers and games that run on them. Not everyone has your use case, or habits. :)
You're right, there is nothing stopping Nvidia, except a desire. They're already looking to get out of the low-end GPU business and into AI and automotive, as anyone who's watched a recent Nvidia keynote (like me) will have seen.
"That is what I mean by owning the laptop market. Owning the rump of what's left of it."
Umm, we still don't know anything about nvidia's upcoming mobile GPUs. You cannot be so sure when there is no comparison point. Nvidia will not just sit idle and lose market share.
"With Vulkan I was talking about the big AAA titles, which for at least the past 5 years have been console first. they then port the console build to the PC."
As was I. Again, there are only three main AAA game platforms, PS, xbox and windows, two of which use DX. Vulkan could become popular, but it doesn't mean that developers would abandon the currently known APIs. Vulkan could take a hold on android and maybe iOS (if it manages to convince developers to abandon Metal) but all of this is irrelevant because AAA games are not released on mobile.
DX12 based on vulkan? You mean completely? Source.
There might be a shift to low-level APIs, but it doesn't mean DX12 will not be popular. It's a well known, powerful API.
"I have read and watched a fair bit about people who have, and have thus taken to the interwebs to bitch about it. MSFT says they'll fix it. I haven't seen any comment about patching DX9 to use more VRAM. Though I have also seen a read a lot about that too."
Did you know that with DX12, game developers can take full control of multi-GPU setups and no longer rely on GPU drivers? Maybe the current games have not been optimized well for said setups. Just because MS haven't "patched" the old, obsolete DX9, doesn't mean they'd treat DX12 the same.
"What does it have to do with anything? It has everything to do with graphics cards, and the drivers and games that run on them. Not everyone has your use case, or habits."
Sure, but what does that have anything to do with AMD capturing the market?! Just because one old, dx9 game with tons of mods doesn't play well on windows 10, doesn't mean DX12 will fail. The market isn't decided by how skyrim plays.
"You're right, there is nothing stopping Nvidia, except a desire."
They are focusing on other areas because they already have a huge GPU market share. As soon as this share gets threatened, they will react. They will not just give up one of their main money making arms. Desire has nothing to do with it.
My point about DX9 is that DX12 is Windows10 (W10) only, and W10 is something MSFT needs the world to adopt. When even Paul Thurrott is shouting about MSFT's "indefensible" behaviour over forced W10 migrations then I think we can say that the adoption curve is behind schedule. The fewer W10 installs there are, the less demand there is for DX12-only games. For games and game engine companies, that matters. Which is why they're launching engines coded to Vulkan. Valve is also a lead cheerleader for Vulkan and the 600lb gorilla in the PC gaming space. Again, the PS4 and Xbone are AMD at the core. There is no alternative, as Nvidia were not interested, and AMD set up a company division to do the job. (Forbes article above) AMD built Mantle and gave it to the Khronos Group, who sucked the best bits into Vulkan. In a few years Vulkan will be the standard that everyone writes to for "gaming", be that PC, mobile or console.
DX12 will not fail, not like Mantle did, but it doesn't have to, if W10 doesn't attract enough users the money and development will go elsewhere, and the PC business is in secular decline, NVidia is just as much a victim of that as MSFT. Unless you're in the commodity business of making things cheaper for the platform that remains, this is where AMD, as a business, appear to be headed. They make the chips that power the consoles, and thus have some control over the low level drivers & dev kits. Sure Microsoft wants DX12 to succeed, but it wants its console to do the same.
Again, I'm not an AMD fanboy, I just happen to think that the evident business strategy is very clever, doubtless they were lucky too, but as a full stack vertical, they seem to be in a good place to prosper in the declining PC business environment.
VR is the would-be saviour of #PCMasterRace but if Google/Android provide a cheap working headset with real presence the need for the Rift/Vive goes away, and with it the need for the PC backend.
"Completely, no. More as a catchup and because that's the way the industry/weather was moving."
So it is NOT based on vulkan at all. It was just perhaps inspired by mantle and is simply similar to mantle and vulkan, in the sense that all three are low-level APIs.
"I still say that AAA will be Vulkan first, eventually"
Using the words "might" and "could" won't hurt in that sentence. Right now, there are more available and upcoming DX12 games than vulkan.
"DX12 is Windows10 (W10) only, and W10 is something MSFT needs the world to adopt. When even Paul Thurrot is shouting about MSFT's "indefensible" behaviour over forced W10 migrations then I think we can say that the adoption curve is behind schedule. The less W10 installs there are, the less demand there is for DX12 only games. For games and game engine companies, that matters."
Just recently, windows 10 became the no. 1 OS on steam and is still climbing. Gamers are clearly adopting it fast. The overall, worldwide share of windows 10 is still low, but a huge lot of PC users never play games to begin with, so the steam numbers are a better indication.
I don't doubt that more upcoming games will adopt vulkan, but I don't think we'll see many vulkan only games, except for games made by id Software.
"Which is why they're launching engines coded to Vulkan."
You don't know that for sure. There could be many reasons. Maybe they just don't want to rely on one API alone.
"Valve is also a lead cheerleader for Vulkan and the 600lb gorilla in the PC gaming space."
Of course they are. They are desperately trying to make SteamOS happen. They have an agenda, not that it's a bad thing.
"Again, the PS4 and Xbone are AMD at the core. There is no alternative as Nvidia were not interested, and AMD setup a company division to do the job. (Forbes article above) AMD built Mantle and gave it to the Kronos group, who sucked the best bits into Vulkan."
I don't see why it matters whether consoles use AMD chips or not. Their GPUs work just as well with DX12.
"in a few years Vulkan will be the standard that everyone writes to for "gaming" be that PC, mobile or console."
How come you never write "could" and "might"? You are so sure as if you went to the future and came back.
"if W10 doesn't attract enough users the money and development will go elsewhere, and the PC business is in secular decline"
Windows 10 is gaining among gamers, and is no. 1 on steam as I pointed out, and X1 is DX12 too.
"NVidia is just as much a victim of that as MSFT. Unless you're in the commodity business of making things cheaper for the platform that remains, this is where AMD, as a business, appear to be headed,"
NVidia still offers cheaper GPUs and will keep offering them. No change there. Gamers usually buy cards based on their price/performance, be it NVidia or AMD.
"They make the Chips that power the consoles, and thus have some control over the low level drivers & dev kits. Sure Microsoft wants DX12 to succeed, but it wants it's console to do the same."
Are you implying that AMD will cripple DX12 on xbox? MS would never allow that. Actually, MS themselves might be involved in writing the drivers. There is no way MS won't make DX12 run as fast as possible on their own console.
"Again, I'm not an AMD fanboy, I just happen think that the evident business strategy is very clever, doubtless they were lucky too, but as a full stack vertical, they seem to be in a good place to prosper in the declining PC business environment."
Never thought you were. The PC business still mainly relies on price/performance, not APIs. As long as NVidia keeps things competitive, they will keep selling.
"VR is the erstwhile saviour of #PCMasterRace but if Google/Android provide a cheap working headset with real presence the need for the Rift/Vive goes away, and with it the need for the PC backend. "
AAA games come out on consoles and PCs. As long as they keep their lead in performance over mobile, it'd stay that way, VR or no VR.
'How come you never write "could" and "might"? You are so sure as if you went to the future and came back.'
It's called a narrative argument (look it up), as opposed to rebutting statements, which stands in many cases as an opening gambit to a Straw Man or Reductio ad Absurdum.
"Are you implying that AMD will cripple DX12 on xbox?" No. Far from it, I think AMD have actually a vested interest in DX12 over and above Vulkan. They want as much support for their cards as possible, preferably support that doesn't require them to write the drivers. This is part of the whole push to Vulkan, offloading the driver support to the games writers, via the low level API.
I would argue that Price/Performance is only relevant at the low end. Nobody who buys an expensive card cares about that, they want speed and/or specs. Objectively by that measure my R9 390 is a lousy card due to the power requirements. My 850W PSU sounds like a hurricane when I have a game running, but I wanted the speed and VRAM, and I pay the electric bill, so.
I appreciate you disagree with what I say, but where is your argument?
FYI, I love AMD's push for fast but cheap cards. It'd be a win for consumers a.k.a me, but to think that this would suddenly tip the market share scale, would be wishful thinking.
With DX12 and Vulkan, there are two forms of multi-adapter support: implicit and explicit. Implicit multi-adapter is when the driver/OS does it for you, the way SLI and CrossFire have traditionally worked, and it generally wants matched cards. Explicit multi-adapter is enabled at the application level (by games) and comes in two flavours: linked-node, where identical cards are exposed as one device with multiple nodes, and unlinked, which can combine heterogeneous adapters (AMD + Nvidia, Nvidia + Intel, etc.).
It is my understanding that implicit multi-adapter is not really there on Windows 10 right now, though MSFT have promised to improve things. The DX12 games seen so far have used explicit multi-adapter when they have supported multiple cards at all.
It supports both. THIS VERY SITE has tested both configurations. You can do it yourself as well with software and hardware already on the market. Dude, while you may be an experienced economist, you don't really know much about the computer hardware/PC gaming world. I mean honestly, if the RX 480 competed with the 1070, AMD would be selling it for $299, not $199. It would STILL be an amazing deal and they would sell shitloads of them, plenty enough to gain market share even.
If the details of the card mean the benchmarks are what we expect, then for $200 we get a card that easily beats the outgoing GTX 970. The GTX 970 is currently the recommended minimum for VR. Reduce the price of entry to VR by $130 AND get a better experience? Hell yes, thank you.
" can appreciate the price point but it is really disappointing to see that AMD couldn't reach higher frequencies with such a huge jump in litography and the switch to FinFET"
Well, it's made at GloFo and it's GCN. GCN clocks lower than Nvidia's uArch; that has been the case for the last couple of years. GCN also has its performance/power sweet spot at much lower clocks. The 290/290X sucked because they were basically factory overclocked. If you reduce clock and voltage just a bit, efficiency skyrockets.
I suspect there will be AIB cards (or even from AMD?) with better components, better cooling and more power connectors that will allow a pretty significant OC. Such a card should be able to reach the GTX 1070, given GCN 4 really is a major overhaul.
Sorry to say, but AMD's clock speeds are much better than Nvidia's. AMD is just touching on the potential of 14nm with these conservative speeds in their first card, while Nvidia is already close to peak in its first generation, yet they are barely beating their last generation and their prices alienate most of their customer base. AMD has the reverse going on here: they just brought the performance of their most sought-after card down to $200, with wattage anyone can deal with, and possible VR performance (if you care) similar to that of their $500-range cards (Fury non-X). They have sane clock speeds, fewer CUs and fewer stream processors overall, and less memory bandwidth. If it delivers on the performance, it means that GCN 4 is a real killer, similar to how GCN 1.0 was when it came out in 2012. Nvidia's architecture is fat, inefficient and apparently went so far backwards that they needed 60% more clockspeed to achieve 10-30% more performance.
Inefficient? Who's the GPU maker that had to create massive, almost-300W cards that ran hotter than those of a certain other manufacturer, who could run their cards 200-300MHz higher, to try and compete?
AMD has had horribly inefficient GPUs and CPUs for years now.
Although I'm an AMD fan, I have to agree. All we can say for now is that Nvidia has managed to create the first GPU architecture that can clock well beyond 1GHz, even reaching 2GHz. Doing this while keeping power levels as low as they are shows that this is good engineering, not pushing clock speed to make up for bad design.
On the other hand, if CPUs are at all comparable, Apple has shown for years that slower, more efficient cores can outperform higher-clocked ones while consuming less power. It all comes down to architecture.
AMD seems to be making good progress in curtailing the inefficiencies of GCN: if this performs like an R9 390 but uses only 150W, that's 100% of the performance at 55% of the power. That's pretty great, and far beyond what the move from 28nm to 14nm accounts for. They still have a way to go to catch Nvidia, though, as this won't come close to the performance of the 1070, yet uses as much power.
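As arithmetic, assuming R9 390-level performance and roughly 275W board power for the 390 (both figures from this discussion rather than measurements):

```python
# Performance-per-watt improvement implied by "100% performance at 55% power".
perf_ratio = 1.00          # assumed: RX 480 performs like an R9 390
power_ratio = 150 / 275    # 150 W vs the 390's ~275 W board power -> ~0.55
print(f"perf/W improvement: {perf_ratio / power_ratio:.2f}x")  # ~1.83x
```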
TDP is different from power consumption. The RX 480 has a TDP of 150W, but that doesn't mean it consumes that amount of power. It consumes around 110-130W at peak; they rated it at 150W as the maximum power it can draw when overclocked. Check the reviews of the GTX 1080 being rated at 180W but spiking past 200W.
The GTX 1080 spikes at more than 300W, but for a few milliseconds at a time. It averages (even over very short time spans like 1s) at or below TDP; Tom's Hardware has an excellent part on this in their review. AMD does the same thing: the Fury X (which is what lives in my PC) spikes around 400-450W, but only for a few ms, and makes up for it by dropping to 100-150W, averaging out around 275W (again, check out Tom's review). On modern GPUs, TDP is roughly equal to average power consumption. After all, the whole point of the TDP is as a guideline for cooler designers: it's the amount of heat that their designs need to dissipate effectively. Sure, overbuilt coolers are good for silence, but bad for costs, and board partners sure don't want to be forced to make overpowered coolers unless they want to.
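A toy illustration of that averaging, with invented numbers merely shaped like those review traces: millisecond spikes well above TDP can still average out far below the peak.

```python
# Toy power trace: brief spikes above TDP still average out much lower.
# Numbers are invented for illustration, not measured from any card.
trace_w = [450] * 5 + [120] * 20   # 5 ms at 450 W, then 20 ms at 120 W
avg_w = sum(trace_w) / len(trace_w)
print(f"peak {max(trace_w)} W, average {avg_w:.0f} W")  # peak 450 W, avg ~186 W
```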
AMD got screwed when they were betting on the 20nm process to pan out. It didn't, and nVidia was able to out-engineer them on the 28nm process. Ultimately I think AMD has been just trying to get by and bide their time until they had access to 14nm.
CPU-wise? AMD finally came to their senses and realised the module design meant way lower IPC. In a perfect world every piece of software would be 100% designed for multiple cores, but the reality is much different. Games are just now getting to the point where having just 2 physical cores isn't enough.
"Nvidia's architecture is fat, inefficient and apparently went so far backwards that they needed 60% more clockspeed to achieve 10-30% more performance."
Did you notice that there were also far fewer cores in Nvidia's new arch? No? Well there you go.
"Nvidia's architecture is fat, inefficient and apparently went so far backwards that they needed 60% more clockspeed to achieve 10-30% more performance."
LoL! Since when is increasing the clock speed (particularly while lowering power consumption) a bad thing?
"To increase performance they increased performance!" - Jimster480
The fact is that nVidia currently has the best-performing consumer GPUs on the market. The 1080 is an incredible card that outperforms any other card in actual gaming benchmarks. This is impressive, no matter what side of the fence you sit on.
The RX 480 is also an impressive card; it redefines what is expected at this price point. However, until actual gaming benchmarks using off-the-shelf RX 480s are widely available after launch, you can't say for sure how it will stack up in real-world VR terms against the R9 Fury (non-X) or any other card. On paper the card looks efficient and likely to perform well, but until I see real-world gaming benchmarks I can't say for sure how good it actually is or will be.
The 1070 looks set to perform well at its price point too, but again, until it's launched and I see actual gaming benchmarks I can't say anything for sure.
The fact is that AMD has paper-launched a new card that is creating buzz at its target price point, and nVidia has physically launched one card and paper-launched another, which are doing the same.
Clocks are irrelevant; we don't know what this thing really is yet. Anyway, clocks seem to be 1266MHz. The real TDP is likely lower at stock, and 150W is the upper limit you can feed into it.
You are assuming that Polaris doesn't have a huge OC ceiling. They remember the backlash over the Fury X; perhaps they intentionally didn't clock their chip as high as it could go this time around.
We won't know until third parties can take a whack at Polaris.
Well the power of the card for the price seems nice. But the presentation of "You don't need faster GPUs any more, you just need them to be cheaper!" was a bit silly. But I guess that was their best attempt to positively spin their current lack of a high end GPU (something that we'll magically need again, apparently, in 6-9 months when Vega comes out).
That's absolutely correct. The only thing faster gets you is an even more stupidly high resolution. The *VAST* majority of people never buy $500+ halo cards, and are perfectly fine with their gaming experiences.
No it's not absolutely correct. Plenty of people buy cards for more than $200. The vast majority of people on this Earth don't buy any discrete GPUs at all, some of them game, and many of those are perfectly fine with their gaming experiences.
There's still a market for cards costing more than $200 just as there was last year and the year before.
It's true that it's under 10%. I help out a lot on a site where people ask for builds, and I'd say 85%+ of the builds use a 380/960, only some use a ~390, and basically none use a 980 Ti or Fury X.
Under 10% of volume, yes, but not of revenue. And the gross margins of the lower priced parts are lower. But it's not even really my point. My point is that the market still exists. In fact with the previous generation of cards it grew. VR only puts more demands on performance, so the market for high end cards will probably still grow with this generation as well.
The demand for more and more GPU computing power will be there for a long time, because there are still plenty of things for that power to be used for.
Steam says otherwise. Even when you calculate for revenue. Halo products really are just that. They basically advertise the rest of a manufacturer's range.
Maybe you're thinking of the price of the graphics card, not the price of the GPU. The GTX 970 was a significant factor in NVIDIA's financial success the past year. You might also be looking at the steam survey and comparing the number of GTX 970s to the number of all the other graphics cards on the list, instead of thinking about what percentage of those cards were bought in the time since the GTX 970 was on the market.
But in fact, if you look at the Steam survey it says the opposite of what you claim it says. The top card on the list is the GTX 970, with over 5% share. The GTX 980 has over 1%, the 980 Ti has 1%. That's over 7% on cards that still cost over $200 to this day, and it doesn't include any mobile offerings or any AMD cards, or integrated GPUs, which are included when figuring the percentages. It also doesn't include cards that cost over $200 when they were bought, like the 780. If we just look at DX12 cards, the 970 is at 7%, the 980 is 1.4% and the 980Ti is 1.3%. This is already almost 10%, and DX12 cards go all the way back to the 500 series, as well as including Intel 4000, 5000, and 6000 integrated graphics.
It's pretty clear that, going by the steam surveys, much more than 10% of all discrete graphics cards sold in the last year and used by steam users have cost the purchasers more than $200.
The 970 is the big seller, and it is also more than $200. If you go by the steam surveys, there are a little over 3 GTX 960s for every GTX 980, and there are 5 GTX 970s for every GTX 980.
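For what it's worth, summing the survey figures as quoted in this comment (percentages as cited, not independently verified):

```python
# Steam survey shares cited above, in percent of all surveyed GPUs.
all_gpus = {"GTX 970": 5.0, "GTX 980": 1.0, "GTX 980 Ti": 1.0}
dx12_only = {"GTX 970": 7.0, "GTX 980": 1.4, "GTX 980 Ti": 1.3}

print(f"all-GPU share of over-$200 Maxwell: {sum(all_gpus.values()):.1f}%")  # 7.0%
print(f"DX12-only share: {sum(dx12_only.values()):.1f}%")                    # 9.7%
```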
And if it's really slower than the 390X, it's going to be going toe-to-toe with NVIDIA's largest GP106 card if NVIDIA clocks it at 1.6GHz. But the TDP of the NVIDIA card is going to be less than 150W. NVIDIA would have to price that card at $220 or less, I think. It's not too far out of line with what their Pascal pricing has been so far. The GTX 780 was priced at $650, and the GTX 760 had half the cores (clocked higher) and was priced at $250. The GTX 980 was $550, and the GTX 960 had half the cores (clocked the same) and was priced at $200. The GTX 1080 is priced at $600. A GP106 card with half the cores, clocked the same and priced at $220, seems about par for the course.
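A rough sketch of that pricing pattern (launch prices and core counts as commonly reported for those cards; the GP106 entry is this comment's hypothetical, not an announced product):

```python
# (launch price $, CUDA cores) per card.
cards = {
    "GTX 780":  (650, 2304), "GTX 760": (250, 1152),
    "GTX 980":  (550, 2048), "GTX 960": (200, 1024),
    "GTX 1080": (600, 2560), "GP106 (hypothetical)": (220, 1280),
}
for name, (price, cores) in cards.items():
    print(f"{name}: {cores} cores at ${price} ({100 * price / cores:.1f} cents/core)")
```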
But the efficiency is still less than the GTX 1070's:
GTX 1070 - 6.5 TFLOPS at 150W TDP, with an 8-pin connector
RX 480 - ~5.5 TFLOPS at 150W TDP, with a 6-pin connector (absolutely no overclocking headroom)
Yeah. I do think the RX 480 is far less efficient, but TDP is a poor substitute for performance-per-watt efficiency, and I wouldn't directly compare real-world Pascal performance to real-world Polaris performance by theoretical peak performance. NVIDIA cards have been more efficient in terms of real-world performance relative to peak theoretical performance. That pushes the efficiency argument further in NVIDIA's favor, of course. GCN 4 may make more efficient use of its peak theoretical performance than previous generations of GCN, but I doubt it's caught up to Pascal.
If anything, 150W is an overestimate. Think of it this way: 75W max from the PCIe x16 slot, plus another 75W from a 6-pin, or another 150W from an 8-pin.
For the RX 480, the 6-pin just means they expect it to draw more than 75W, and that overclockers might want a little headroom on the reference cards.
For the 1070, it is probably running close to the 150W TDP at 100%, so NVIDIA opted to give another 75W to offset peak power throttling and to give headroom for factory overclocked cards (a la 980 Ti).
But of course, that's just peak power draw. I expect that Polaris averages closer to 100W and peaks near 125W.
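The power-budget arithmetic behind this reasoning, spelled out using the PCIe limits cited above:

```python
SLOT_W = 75    # PCIe x16 slot
PIN6_W = 75    # one 6-pin connector
PIN8_W = 150   # one 8-pin connector

print(f"RX 480 ceiling (slot + 6-pin): {SLOT_W + PIN6_W}W")    # 150W
print(f"GTX 1070 ceiling (slot + 8-pin): {SLOT_W + PIN8_W}W")  # 225W
```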
Tom's Hardware's reviews have great power measurements (detailed graphs of card-only power draw over time), but unfortunately they don't have that for the 1070 yet. However, if the 1080 is anything to go by, the 1070 averages very close to TDP (the 1080 in their measurements averages 173W while peaking at around 300W). The 8-pin connector is most likely there for several reasons: a) most importantly, Pascal, like Maxwell, spikes at far higher power than its TDP, but only for a few milliseconds at a time, and an 8-pin connector makes up for any out-of-spec power spikes; b) to add a modicum of OC potential (although that drives the power spikes far higher, if the 1080 is anything to go by); and c) to underscore that it's a high-end card - using a single 6-pin connector on their second-from-the-top card would make many enthusiasts wary and skeptical of its performance potential.
Considering that the Fury X shows pretty much the same behaviour as Nvidia's architectures in terms of average draws and power spikes, I'd be surprised if the 480 was far below 150W.
The truth is that you don't. If it weren't for scam modes like "Ultra" with fake "optimizations" like HairWorks, there would be no need for these pricey cards. Check around YouTube for people who do independent investigations into Ultra settings in most games, and you will notice stupid things such as millions of tessellated polygons on rocks for no reason - something that just kills GPU performance. I have found that I can still play most games at 1080p on mid/high settings @ 60fps with my 2013 7870 GHz Edition, which I bought for $125 on sale back then.
Pretty sure some variation of this card will be my next purchase due to price and my performance needs, but I'm still disappointed in the specs that were shown. Considering the clock speeds the GTX 1070 can reach at 150W, it's very surprising how low the RX 480's are at the same 150W. Either the shown specs are still concealing the real clock speeds and power draw and are better than they look, or the RX 480 is a low-end Polaris 10 part and there is a lot of headroom to release a faster 480 and higher-specced 490 parts, albeit with increased power draw. Still, it seems Nvidia did some impressive engineering to get the clock speeds, performance, and power draw to the levels they did; AMD maybe not quite so good.
You can't compare TDP between NV and AMD. NV usually greatly understates TDP, as was the case with the 970 and 980. In real games, the whole-system power-use difference between a 970 and a 290 was much less than the difference in TDP.
TDP is not a measure of average power consumption. The TDP rating indicates the maximum sustained heat the chip is allowed to dissipate in real world usage, or put another way, the minimum heat the cooling system must be able to dissipate. If NVIDIA were to "greatly understate TDP" then their cards would be burning up left and right because their cooling systems wouldn't be able to keep up.
If a card were to be consistently closely approaching its TDP across a wide range of uses then I think that card would probably be TDP bound. If one card is far from its TDP in a particular sustained real world use case, then there must be some other real world use case where it uses a lot more power, because otherwise the TDP claimed for the card could be lowered.
As far as what you said about the GTX 970 in comparison to the R9 290, it seems to me that you probably saw figures in just a small number of use cases. But if the 970 really is able to more consistently approach its TDP over an average of a large range of uses it seems to me that it implies the underlying architecture of the GTX 970 is more balanced, because there must be more extreme outliers for the R9 290 to require its TDP to be upped for those cases.
Regardless, the R9 290 consistently used significantly more power than the GTX 970.
An interesting thing I've noticed is how much NVIDIA's transistors per core count has gone up from Kepler to Maxwell and also from Maxwell to Pascal while AMD's transistors per core count has been relatively flat from 1st through 3rd generation GCN. I wonder what the transistors per core situation is for 4th gen. GCN.
Well, from Maxwell to Pascal you have actually changed process nodes, so you expect the number of transistors that can be placed on a die of the same size to increase. Now, from Kepler (GTX 680) we went from 3.5 billion transistors to 8 billion in Maxwell 2 - a 128% increase, all on the 28nm process. Meanwhile, AMD went from 4.3 billion transistors in the HD 7970 (GCN 1) to 8.9 billion in the R9 Fury X (GCN 3), an increase of 107%, all on the 28nm process. Doesn't seem that different to me.
I'm not talking about transistors per die area, but transistors per compute core (CUDA core as NVIDIA calls it). What that means is that NVIDIA has been using a greater percentage of the total transistors for cache, register files, ROPs, schedulers, and the like, and a lesser percentage on ALUs. But they have been getting greater efficiency out of their architectures by doing so.
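A rough check of that trend using public die figures (transistor and shader/CUDA core counts as commonly reported; treat them as approximations):

```python
# (transistors in billions, shader/CUDA cores) per chip.
chips = {
    "GK104 (GTX 680)":  (3.54, 1536),
    "GM200 (Titan X)":  (8.00, 3072),
    "GP104 (GTX 1080)": (7.20, 2560),
    "Tahiti (HD 7970)": (4.31, 2048),
    "Fiji (Fury X)":    (8.90, 4096),
}
for name, (billions, cores) in chips.items():
    print(f"{name}: {1000 * billions / cores:.2f}M transistors per core")
```

On these figures NVIDIA climbs from roughly 2.3M to 2.8M transistors per core across generations, while AMD stays near 2.1M, which matches the trend the comment describes.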
Finally a card that I can dream of buying - something that won't make poor me feel too bad about investing in it. If it can really offer GTX 970-level performance, I'm in. Looking forward to release and benchmarks.
It will make a nice combo with eGPU docks when cheaper ones come out.
Where I live, the lowest price I found for the GTX 970 is about $390, and for the R9 380 it's about $320. The average price is about $500 for a GTX 970 and about $400 for an R9 380. So the base price is a very important factor here. GPUs are really expensive: for $199 you don't even get a GTX 950 2GB ($220), and barely an R7 370 2GB ($195) - and those are the lowest prices. You can't even compare. Other PC components like CPUs, PSUs, and HDDs have the same cost margins.
The 390 matches the 970 performance-wise. So, according to reports, this 480 will likely beat the 970 across the board, but not by a lot (unless maybe you factor in the RAM equation). If I had to guess, I'd say 5-10% faster.
Can someone explain to me why there is a need for a video card to be efficient in power usage? If you are going to do any heavy gaming, your rig is probably one that uses a lot of power anyway. How is using less power a reason for a gamer to get this card? Is it just to save a few bucks on electricity, or is there a more compelling reason than that?
While power efficiency definitely helps save a bit on electricity, I think the better reason is that people won't need such a high-wattage power supply to run the video card along with the rest of the machine. This has always been a problem for AMD, since their video cards are almost always more power-hungry than cards from Nvidia - meaning if you wanted an AMD card, you'd likely need a beefier power supply than if you went Nvidia. This might not be important for someone with a big budget, but people who want the best performance for the lowest price should generally be happy about this.
Actually, you only needed a good/premium-quality, relatively low-wattage PSU for any single-GPU AMD card. I mean, I'm running my 290X on a 400W Seasonic X, and even with the power limit raised to +50%, there is no problem with it.
I would say it's got more to do with branding and image than anything else. IMHO, AMD is specifically trying to address some really bad press they've been getting over the last few releases which has hurt their brand badly.
As I understand it AMD were really banking on 20nm GPUs, and when that process basically failed, the only way they could stay competitive was to essentially release cards which were clocked to their very limit. Have a read of any review of the R9 290 or 290X. While okay from a price/performance perspective, the cards were slammed for running extremely hot, unbearably loud and having basically no overclocking headroom whatsoever. They were already maxed out. It was seen (I think rightly so) as being a pretty inelegant and brute-force attempt at remaining competitive. The Fury line did little to shed that unwanted image.
So from a pure performance perspective, I agree with you that power draw isn't that big of a deal, but for sure heat, noise, power supply requirements and overclocking headroom are all factors which play into the value proposition of a card. What's really going on here though, is AMD trying to shed the unwanted image of being the brute-force, hot, loud and power hungry option on the market.
It has to do with removing that heat from the video card. It has nothing to do (at least in the discrete market) with overall power usage. Very, very few people make GPU purchase decisions based on the power target. That was one of the reasons the Titan X and 980 Ti were so impressive: Nvidia was able to build a chip close to the theoretical maximum chip size (which is around 600mm²). Basically, heat output goes up steeply with the physical size of the chip.
I'm afraid you're right. Too bad that AMD can't use TSMC anymore, instead relying on GloFo... GloFo almost killed their CPU business and (indirectly) hurt their GPUs too. AMD wasn't prepared to stay at 28nm so long. Judging by the 480's specs, I think 14nm GloFo is not so bright either.
Yes, but I am under the impression that Samsung's 14nm is not as efficient as TSMC's 16nm. Look at the iPhone controversy (Apple dual-sources the SoC from both).
And what about the GTX 1060? Of course it will have a lower TDP, maybe around 110-125W, and if Nvidia prices it at 200 dollars then AMD loses the mainstream market. I like AMD and I want them to succeed, but not like this.
Will AMD lose the Mainstream market just on the basis of NVIDIA releasing a "more efficient" card at the same price? What if that efficiency comes at a significant performance hit? That's why my last $200 card was an AMD - the 960 could not compete on performance with AMD's offering at the same price.
But where is it? The 480 has been launched. Nvidia could do lots of things. What matters is what they are doing. The 480 will have this market to itself for a quarter+.
AMD can certainly use TSMC if they want, just like they can certainly buy Samsung HBM2 if they want. Lol, it's so funny that people think they are somehow "locked out" of these options.
You do realise that TDP is not power consumption, right? The 1070 is going to use an 8-pin power connector, which gives it room to use up to 225W, so it definitely uses more than 150W; the RX 480 has a 6-pin connector, so its max power is 150W, which means it'll use less than that. Just guesstimating, but I'd expect actual power usage under load of the 1070 to be between 150W and 180W, while the RX 480 will probably sit between 120W and 140W. Anyway, we'll have to wait for actual hands-on reviews to get more exact figures, but don't treat TDP as anything more than a rough guide when it comes to power use.
OK, having looked more closely at GTX 1080 power draws, they do seem to average about 180W; peaks are still over 200W (as an aside, EVGA looks to be offering an OC'd one with a claimed max power of 215W, so it looks like they will be trying to push the limits). So maybe the 1070 does average around the 150W mark and just needs the extra room of the 8-pin for peaks. That said, the RX 480 should be averaging somewhat less than 150W due to the limitations of the 6-pin connector.
Yep, all of the energy used by the GPU turns to heat. It doesn't turn into motion, or light, etc., so it MUST turn to heat. Some of the power is used by the fans, of course, but other than that, it's heat!
Exactly, nVidia is likely better but not worth the extra cost if those numbers hold up. Were I still single with disposable income I'd be buying nVidia because better. Now that I'm Al Bundy the RX480 is much more tempting because value.
Where on earth are you getting 15% less performance than the 1070 from? To quote PCPer's review of the GTX 1070: "The GTX 1070 based on Pascal is never less than 53% faster than the GTX 970 when running at 2560x1440 and is nearly twice the performance in Rise of the Tomb Raider!" They also show it performing around 50% above the 390X. The R9 390 performs roughly on par with the 970, but beats it in a few situations. This seems to be expected to slightly outperform the 390. In other words, I'd expect this to perform at 60-70% of the GTX 1070, not 85%. Even with that considered, it's good value, but nowhere near what you're saying.
And no, I'm not an Nvidia fanboy. I've never owned an Nvidia card, and my current PC has a Fury X in it. I'm simply opposed to making bogus claims, especially ones that build up unrealistic expectations of underdogs you really want to succeed.
I'm not sure how Anandtech measures it, but based on the info so far, just looking at teraflops - about 6.4 for the 1070 and over 5 for the RX 480 - you can get anywhere from 15% to 20% less performance. I used 5.2 teraflops for AMD (we don't know exact values yet), so a simple calculation of 5.2/6.4 gets you 81%.
And yes, that is an oversimplification, so I will still wait for actual benchmarks of retail units; but just at a quick glance of the info available, I agree with at80eighty.
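That calculation, spelled out (the 5.2 TFLOPS figure is this commenter's assumption, as noted):

```python
gtx_1070_tflops = 6.4
rx_480_tflops = 5.2   # assumed; exact spec unknown at the time

print(f"{rx_480_tflops / gtx_1070_tflops:.0%} of the 1070's peak")  # 81%
```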
My Fury X measures in at 8.6 TFLOPS. The 980 Ti lands at 5.6 - yet outperforms my card in a number of games. The same is systematically true across the AMD and Nvidia lineups. AMD cards are far superior to Nvidia cards in compute, but not in gaming. Whether this is architecture- or driver-related is something I'm not qualified to answer, but that doesn't make it less true. A 15-20% deficiency in FLOPS to an Nvidia card does not translate to 15-20% less gaming performance, unless AMD has made some gargantuan architectural strides with GCN 4.
"A 15-20% deficiency in FLOPS to an Nvidia card does not translate to 15-20% less gaming performance, unless AMD has made some gargantuan architectural strides witl GCN 4."
Yeah, I know - that's why I said it was an oversimplification and was only based on the information we have NOW. There is no other information to go off of. Sure, I could throw in GameWorks and say how that screws AMD every time, so their true performance is never known because they get second-hand dev support; but that still doesn't tell us how either card's architecture will be received, or how they have improved, until we see actual benchmarks.
There is more information to go on: knowledge of how FLOPS translate to graphics performance on the different GPU architectures over the last few generations. We have no reason to think GCN 4 is a paradigm shift in terms of architecture - in that case, AMD would make more of a marketing point of it - thus it's reasonable to assume the relation between FLOPS and gaming performance should stay roughly the same. 10% better? Sure, that might happen. But not 50%.
Stating that "we don't know, so I'll assume massive improvements" is ridiculous. You're only setting yourself up for disappointment. I'd rather be realistic and end up getting a positive surprise, if that's the case.
What's an RX 480? It was what, less than 3 years ago that AMD said it was switching to a naming scheme with numbers representing something - completely confusing the consumer, but it makes sense, right, because it's better than one number. And then it introduces Fury, and now RX.
At least they're consistent in trying to confuse consumers, got to give them that.
Personally, I wish they still used R7 and R9, but that it meant something specific - either a particular architecture, memory type, memory bus width, or something. AFAIK, all those numbers meant before was high-end or mainstream, which doesn't give much info. If you're going to have an R-anything in the part number, it should mean something. Then again, I am a bit of a purist; I'd just as soon they start labeling them Radeon 1, 2, 3, 4, etc., where the number is the relative speed compared to the others. As they make improvements between cards, they can list them as 1.1, 1.2, etc.
As a casual gamer, this hits home for me: a price range I can afford, and features I want, such as HDMI 2.0 and DP 1.3/1.4. I am hoping HEVC decode is decent with the Main10 profile (haven't seen anything about that) and that it makes a 1440p gaming experience possible (let alone VR, which, well, let's just say I want my own spaceship).
There is a lot to like. But I can't justify blowing $500 USD on a card, so the 1070/1080 are interesting and COOL!, but ultimately not going to make the sale.
As an NVidia fan for the last 3 generations... Well played, AMD. Take my money.
Wait for the 1060 to come out before you make that purchase. Personally, I am an AMD fan all the way, but if the 1060 is going to be only slightly more expensive and better than a 970, it might be worth going for that; there's a good chance you would have better efficiency and more overclocking headroom. That said, I usually roll red team myself, so ugh, I'm torn...
AMD is smart. They're making a card the average person can afford. If this 480 is as good as a 980 for 200 bucks, it will get a lot more sales than the new Nvidia cards.
With a single 6-pin, a 150W TDP, and the 14nm FinFET process, this card screams non-reference designs. It seems to me AMD did not waste too much money on an expensive PCB, but will leave that to the partners' imagination :)
I don't get it - is there currently a game the R9 380 can't handle at 1080p with max details?! I'm not interested in VR stuff yet; is there a reason to upgrade to this card for 1080p gaming?
I thought that too, and my initial excitement has been dampened by the specs. Granted, it's new tech, so we might get a gain there, and the power draw is lower, so possibly less heat; but overall my current R9 390 compares well with the specs I've seen so far for the RX 480. I'll wait to see the reviews and the benchmarks before committing, I think.
I still have an R9 280X, and at 1080p there isn't anything I play that I can't max out with more than playable framerates - Battlefield 4 runs great, Far Cry no problem, etc. I have to think and hope this is more for 1440p. I am personally hoping for playable 4K60 rates with at least medium settings, but I am skeptical.
It was always optimistic to expect a midrange card to offer a substantial upgrade over your high(ish)-end card released less than 12 months ago. I don't think that's ever been the case.
Depends how long you want to endure with your card, and whether you want max quality or are fine with something lower...
As for max settings... If you upgrade every generation, both will be fine. If you upgrade every two generations, the 1070 should be fine, but it's hard to guess about the 480 now. If you upgrade on a longer cycle, you might want the top-end card.
I still run a 2011 GTX 580 and only recently started hitting the wall (at 1440p as well). So unless something big happens, a flagship card should be pretty decent for 4 years. One step lower would hit the limits much sooner - my guess is ~2.5 years. Lower cards than that, if you want to play at max quality, you'd be swapping every 12-18 months; some fresh & crazy games won't run at full detail. As for the RX 480... nobody has a clue how strong it will stand, but I'd guess lower than the 1070, so 12-18 months.
Just now started hitting the wall on a 580 at 1440p?! I had a 670 (roughly equivalent to your 580), and it could barely play any games on medium to high settings at 1440p. You must've sacrificed quite a bit on your graphics settings. Only when I upgraded to a 970 was there relief at that resolution.
Nope... I play on max and had no issues so far... until The Witcher 3 and Black Desert Online (this year - I waited on W3 for the patches to get done), which both turned it into a slideshow renderer (below 20fps). But until then, no visible issues. Not that I would start panicking if FPS dropped to 30-ish. Granted, I have an OCed, watercooled model from EVGA, but that's why I subtracted a year from the useful lifespan.
You're right, the GTX 580 can't run games from 2012 onwards at 1440p on high settings. It will do games from 2010 like Fallout: New Vegas at the highest settings at 1440p, but Dragon Age: Inquisition (2014)? Forget it. On the highest settings the GTX 580 will squeeze out 20-ish fps at 1440p.
I've tested a lot of 580s, including numerous 3GB models (I have six of the completely crazy MSI LEs), also tested 970, 980s and 980 Ti. A single good 980 beats two 580s SLI, sometimes coming close to three 580s. A 970 is about on par with two 580s.
I used 2x 580 3GB SLI for a fair while, but I'm glad I switched to a 980 as even two 580s couldn't handle what I was trying to do with custom settings for Crysis and maxed out settings for E.D. (VRAM capacity being the issue as often as performance). All depends on the game, resolution, and one's predilection for maxed/custom settings, mods, etc.
If one uses typical middling game settings though, then 580 SLI is still pretty decent in many cases, though the 1.5GB versions would likely be not so good for 1440p. Thankfully, 3GB models are cheap.
I ran an R9 280X at 1440p with high (not ultra) settings and AA and AF off with no problems. I don't check the fps because I don't really give a shit what the number is so long as it's playable, and it was more than enough. That said, some more eye candy would have been greatly appreciated. I have to think this will be fine for current games, and honestly, depending on how developers take on DX12, it might be OK for some future titles as well - but maybe not?
The 390 does better at 1440p than it does at 1080p, so given that the 480 is broadly similar to the 390, I'd say yes - especially in a smaller case with a smaller PSU.
You have seen the reviews where the 1070 _crushes_ the 390X at 2560x1440, right? As in ~50% better? I'm an AMD fan, but this card isn't going to touch the 1070. No way. I doubt it would get anywhere near 15% below the 1070, even in best-case scenarios. Nor is it meant to, at that price. But if it performs at 60-70% of the 1070 for 55% of the price, that's great value for money, and a clear win for AMD.
Given that Valve, Facebook & Google are behind it I'd say that yes, VR is here to stay, though you may want to wait for the next headsets, or see what Google has in store for Android "Daydream ready" phones.
This looks promising, and unlike most people out there (at least that's how it seems), I think the "mass market first" strategy is a good idea. Sure, flagship products draw customers, but these will hit the market before Nvidia's GTX 1060 (unless Nvidia rushes the launch because of this) and, if the 1070 and 1080 are anything to go by, at a lower price.
That reference cooler looks nice; I hope it also performs decently and actually goes on sale. They showed a similar cooler for the 3X0 series, yet that never made it past R&D, it seems.
I find it strange that people are divided over Nvidia and AMD. Would it be fair to say that, traditionally, those who don't care about power consumption/heat/noise like AMD?
I run a midrange rig and play at 1080p or 1440p. AMD and Nvidia both have reasonable cards for that segment at reasonable prices, and I'm just a typical consumer. Heat - I don't care. Power consumption - I don't care. Noise on my card isn't bad (a Sapphire with Dual-X fans), so that's not a problem. Though I will say I wish I could overclock more (any!), it isn't a dealbreaker for me. Personally, I like AMD's software over Nvidia's, simple as that; I like the way their config is done.
Actually not quite - the 290X at launch was worse for power/heat (I just checked the Anandtech reviews of both to be sure), but the GTX 480 was certainly louder.
Mind you, have you ever heard two 7970GEs in CF? It's the loudest setup I've ever heard so far, worse than my oc'd 900MHz 580s on max fan.
I don't care about VR at all, not until prices of all the gear come way down. I want to see how this compares to the 1070 and the supposed 1060 - and yes, I realize the 1070 is more expensive, and the only report I saw on the 1060 speculated that it too would be slightly more. But if I can game at medium settings at 4K60 on a 1060 and not on an RX 480, then I'll spend that little extra; and if all I need is that 480, then I will certainly be staying on team red.
No way this is even close to the 1070, but I am curious to see if this can at least equal the 980, and what the price and performance of the 1060 are going to be. It could be an interesting card for TB3 enclosures, as TB3 would be less of a bottleneck (if at all) compared to a faster 1080, for instance...
Maybe I am wrong here, but my understanding is TB3 runs on a PCIe 3.0 x4 bus, and from what I have seen in testing there is only a 2 or 3 percent difference between x4 and x8, and no difference between x8 and x16, in performance from any graphics card up until now. I think it was a Titan X they had done the testing with, but I would have to find the article or video. My guess is that a 1080/70/60 will run slightly below where it would on a full x16 interface, and it won't be noticeable in gameplay except in those cases where the game is right at 30fps and losing 3 percent actually matters. I'll be closely watching this and the 1060 to see if either can do 4K at medium settings reasonably - fingers crossed.
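For reference, approximate PCIe 3.0 per-direction bandwidth by lane count (TB3 carries roughly an x4 link's worth of PCIe data; this sketch ignores protocol overhead beyond line encoding):

```python
PCIE3_LANE_GBPS = 8.0 * 128 / 130   # 8 GT/s with 128b/130b encoding

for lanes in (4, 8, 16):
    print(f"x{lanes}: ~{lanes * PCIE3_LANE_GBPS / 8:.1f} GB/s per direction")
```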
Now that the tests are here, we know the 480 is on par with the 970, so clearly below the 980 and, as I said above, not even close to the 1070. Waiting to see how Nvidia responds with the 1060's performance and price and, above all, when (as AMD has no competition in this segment for now...).
Not sure I understand the fuzzy logic across some of these comments, people trying to approximate this card to the GTX 1070 or even judge one as the better value...
AMD themselves have positioned the RX 480 right around the R9 390X. Last time I checked a GTX 1070 is right around a GTX 980 Ti and that was at least as good as 2x R9 290/390 in CF.
Seems like apples and oranges to me. All this posturing about whether it's smarter to go for the high end or the midrange first is just nonsense, anyone not desperate for an upgrade will simply wait out until both lineups are fully fleshed out.
I'm very tempted by 2x GTX 1070 to replace my R9 290, but I'm still gonna wait to see what Vega brings, because I'm simply not desperate. Similarly, if I were looking at the RX 480, I'd wait for the GTX 1060 to launch.
Nothing worse than kicking yourself because you bought a next gen card too early and the price dropped a couple months later in response to the competition.
980ti "at least as good as dual 390s" when only one 390 is working lol great logic! You're aiming too low, a single 980ti is at least as good as over 9000 Crossfire 390s when only one of them is working.
Looking through old reviews, the 980 Ti certainly seems to trade blows with CF R9 290s: in some games the average fps is lower but the minimum is equal, in some it's better, and in others it's worse.
Dunno, looks pretty close to me, so "at least as good as" may not be entirely accurate (that could be interpreted as slightly better than), but the point stands... We're looking at vastly different levels of performance.
I've got zero interest in fanboy debates, the RX 480 looks great if you're price sensitive and only interested in 1080p gaming... But as an upgrade path from my CF R9 290 it's decidedly uninteresting.
I'm not someone that's gonna go the opposite extreme and blow over a grand on GPUs either, and like I said, I probably WILL wait for AMD's real next gen flagship before upgrading.
GTX 1070 SLI looks mighty tempting in the meantime tho...
I really don't get the huge excitement from people wanting to look like a total dork wearing a VR headset. I just think of that episode of Red Dwarf showing them galloping on a horse... I think watching people use VR will be far more entertaining than using it.
What, because people look totally cool sitting in front of a screen with KB + Mouse in hand shouting things like "grab an inhib and baron before they respawn" into a ludicrously oversized gaming headset? That all appears utterly ridiculous to non-gamers, but we still do it because the experience is fun enough to pull us in. VR will go the same way. If (and that's still very much an "if" at this stage) the experience is good enough, who cares what you look and sound like?
Speak for yourself. VR is the future. Headsets will come down in price, size and weight. Imagine playing Battlefield 1 in VR. Or jump into a spaceship and tour the Milky Way. Way to go !
Spaceship? Explain to me how I'm supposed to see all my controls for ED if I have an HMD stuck on my face? One can get by to some extent with voice recognition, but there are limits.
Curiously missing was any answer to SMP, Nvidia's technology for reducing the overhead of rendering the scene twice to almost negligible. If this $200 card has a VR focus, a feature like that is absolutely killer.
I don't think that AMD has revealed all the interesting features and changes with Polaris and GCN 4.
They just wanted to get their foot in the door and make sure the public knew about a low price option before they started breaking their piggy banks to get the 1070 in a week or so.
For even more performance, there's SLI. Those hoping that Nvidia's rapidly ageing SLI technology for linking two or more graphics card together would die out with Pascal will be disappointed—but, on the flip side, at least there are a few improvements.
For starters there's a new jaunty, high-bandwidth SLI bridge—or rather, there are three of them depending on your motherboard spacing. They are designed to link just two GTX 1080s together at a higher 650MHz speed (versus 400MHz) by using the second SLI connector traditionally reserved for three- or four-way SLI configurations. Nvidia claims that the new bridge results in less stuttering, although without a bridge or second 1080 to test, we'll just have to take Nvidia's word for it for now.
An odd side effect of the move is that, for the first time, Nvidia is officially recommending users go with a two-way SLI configuration. Those running 4K or monitor surround should use an HB bridge. SLI has never scaled all that well past two cards - and four-card solutions are pretty much just for show in games - so this isn't all that surprising.
What if you want to buy 3 GTX 1070s? You couldn't work with that. But you can run 4 RX 480s and have all the power of CrossFire and asynchronous compute (i.e. GCN 4.0). Which GPU will be worth something in 5 years?
How many people do you suppose are going to run 4x RX 480? I understand what the article is saying, but I have never heard of anyone going triple or quad multi-GPU with midrange cards.
This is really an exciting time. If AMD's promises hold, and they really can deliver R9 390X-range performance at 150W for $199, that is a great accomplishment. It gives me hope that the APU era will finally deliver on some of its promises: a Zen APU that can game acceptably at 1080p in a small ITX box like a NUC. With NVMe drives available, you don't need to waste space on drive bays; you can stick those in your NAS. But the APUs to date have always fallen a little short. If Polaris is what they claim from a performance and efficiency standpoint, and Zen is as well, I think we'll finally get APU offerings that make the low/midrange obsolete, and consoles as well. Discrete-card gaming will then be relegated to VR and 4K+ setups for those who really want to spend the money and push technology to the bleeding edge. However, even in that space the future looks bright, as Vega is coming by year's end and NVidia is moving to HBM2 by early next year. Imagine designing a VR box with an Nvidia-based HBM2 board. Can they get Pascal onto a board the size of the Fury Nano? Or smaller? It's so energy efficient already, I'm betting they can.
I noticed in the pictures that connectivity is all HDMI and DisplayPort. As DVI-D is the only way to get 144Hz at 1080p, I hope the production cards have a port. Otherwise the hardcore FPS players will suffer.
Oh... The absolutely wrong and misinformed on the internet...
Oh, by the way, you're wrong. HDMI 2.0 and DisplayPort 1.4 carry plenty more bandwidth than DVI-D and are more than capable of 144Hz at 1080p, and then some.
Maybe the latest and greatest HDMI supports it, but most video cards, GPUs, and monitors from more than a year ago don't. Unless you want to game at 720p.
As I said - 144Hz @ 1080p + FreeSync isn't going to happen over HDMI. DisplayPort maybe; my GPU doesn't have one though =/
Both HDMI 2.0 and DP 1.3 (not to mention 1.4) have higher bandwidth than DL-DVI. Also, there are adapters aplenty for DP-whatever you want. Have an old 144Hz DVI monitor, want a new GPU? Get an active DP-DVI adapter.
I for one applaud AMD for pushing out obsolete standards. The sooner, the better. We're still not rid of VGA, after all.
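A back-of-envelope check of the bandwidth argument in this sub-thread (the link data rates are approximate post-encoding figures, and blanking intervals are ignored, so real links need some headroom):

```python
# Uncompressed pixel data rate for 1080p at 144Hz, 24-bit color.
needed_gbps = 1920 * 1080 * 144 * 24 / 1e9
print(f"needed: ~{needed_gbps:.1f} Gbit/s")   # ~7.2 Gbit/s

# Approximate usable data rates of the links discussed.
for link, gbps in {"DL-DVI": 7.9, "HDMI 2.0": 14.4, "DP 1.3": 25.9}.items():
    print(f"{link}: ~{gbps} Gbit/s")
```

On these numbers, dual-link DVI is only just enough for 1080p144, while HDMI 2.0 and DP 1.3/1.4 have room to spare.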
As someone who usually buys GPUs at the ~$150 price point and holds them for ~3 years, this has my attention. The fact that it's finally a new architecture offering a 100% or so improvement over the last product to hold that price point makes it OK to spring for the $200 part.
I was getting sick of the last 5 or so years of reviews saying "This isn't a new architecture, AMD just tweaked the old one, added 10% performance and x/y features. It still suffers from high TDP and isn't really competitive with Nvidia except on the low end."
Hopefully Nvidia responds in kind, and the new generation of GPUs is here. Since I have a FreeSync monitor, I'll be getting one ASAP.
I keep hearing people say that the GTX 1070/1080 are overkill for 1080p gaming. What about in a few years? Are we at a point where graphics improvements at 1080p have reached a ceiling? If not, then I want a GTX 1070 or AMD's next offering above the RX 480, because I want to play at maxed-out settings at 1080p without having to upgrade every couple of years.
Sort of. We're at a point where the newest crop of GPUs has entirely outstripped, and will continue to outstrip, the 1080p performance floor established with this generation (i.e. anything at or above the power of a GTX 950 will get your foot in the door with games at 1080p), to almost comical degrees. The 1080 can push 45 frames per second in a variety of games at 4K resolution. In any given game where it pushes 45fps at 4K, that roughly translates into 180 frames per second at 1080p. That's so far beyond overkill that it is simply ridiculous, and there is so much headroom that games in a few years still won't come close to noticeably eating into that number. If you plan to stick with a 1080p display for the next few years, the GTX 1080 is a waste of money, and you would be better off saving cash and going with something in the new midrange, such as the RX 480 or the inevitable RX 470 and GTX 1050/1060.
The only reason to buy a truly high-end, $300+ GPU from this new generation is if you have a 4K or 1440p/144hz display. And if you don't have one of those things, your bank account will thank you. :)
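The rule of thumb used above, spelled out (fps scaling inversely with pixel count is a rough heuristic, not an exact law):

```python
fps_4k = 45
pixel_ratio = (3840 * 2160) / (1920 * 1080)   # 4K has exactly 4x the pixels

print(f"~{fps_4k * pixel_ratio:.0f} fps at 1080p")   # ~180 fps
```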
At $199 this is an automatic buy. I have been hoping to hear about green team's GTX 1060 plans, but if those are months away I will be moving to team red. I hope AMD has the supply to cover the market.
Tinfoil-hat-wearing internet commenter uses incorrect spelling in a statement questioning the intelligence level of the author. Internet commenter then uses all caps to make sure everyone skims past the comment. Internet commenter then proceeds to make several posts in a row lamenting the same thing. Internet commenter is brilliant, even if only in his own mind.
Yep, the maximum number of draw calls has very little relation to actual performance in real games. This might change over time, but it's not very likely.
I think taking different approaches is an overall win for PC gaming. If you look at the GTX 1070, which offers the same performance as the GTX 980 Ti ($649) for only $379 retail, gamers are getting the same performance at 58% of the cost only one year later.
Similarly, if Polaris does indeed perform near a R9 390X ($429) for only $199, that gives us the same level of performance for 46% of the cost just one year later.
And for those with deeper pockets, they can pick up a GTX 1080, get performance that simply didn't exist a year ago for a price below last year's high end, and rest comfortably knowing the card will last a long time. Even if Vega and HBM2 Pascal wind up significantly faster, neither will be out for at least 6 months, and they will likely carry the crazy price premiums of top-end cards.
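The generational price-for-performance arithmetic from these comments, spelled out:

```python
pairs = {
    "GTX 1070 vs GTX 980 Ti": (379, 649),
    "RX 480 vs R9 390X":      (199, 429),
}
for name, (new_price, old_price) in pairs.items():
    print(f"{name}: {new_price / old_price:.0%} of last year's price")  # 58%, 46%
```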
OK, so I absolutely cannot wait to hear about the 490. It's coming. And it seems AMD must have limited frame rates via Crimson, and maybe even graphics settings (if I am one to judge by what I saw), to showcase 50% utilisation on that dual setup running Ashes.
I for one could sit here and listen to AMD, Intel, nVidia, Sony, Dell, Discovery Channel and God himself introduce a brand spanking new generation of products designed to blow me away with their innovation and performance for much less money than ever. None of this stuff means a damn thing until I read multiple reviews online, watch videos of the actual products performing live and all that. Even Jesus could come on the stage with a GTX9080 128GB HBM50 that can make Mars go round. If there's no hard data and actual reviews of that specific piece of hardware, the money comfortably stays in my pocket, AMD.
I'll buy the 480, or maybe the 490 if it's available on the 29th. The GTX 1080 costs 120,000 in my country's local currency, which roughly converts to 1000 dollars because of various taxes. The 480's price seems bearable atm.
Given that AMD hasn't so much as mentioned an RX 490, that's definitely not showing up on the 29th. Vega is rumored to have been moved up to this fall (from an early-winter/early-next-year timeframe), but you're not getting a 490 before then. From all we know, the RX 480 is a full Polaris 10 chip - which means we need Vega for more performance.
Judging by the volume of comments, GPUs are the most interesting pieces of tech around. But Anandtech seems incapable of reviewing them promptly, much like smartphones.
Like many others, I'm sure, I've waited so long and read so many other reviews that Anandtech's is sure to be an anticlimax. What I didn't fully realise until recently is that this is a broader problem affecting more than just the GPU section. I feel this is something Anandtech needs to take seriously, as they seem to be alienating a significant number of their readers. Cue the Anandtech faithful's defence...
Any thoughts on double precision performance? I haven't seen a comprehensive Pascal review to see if it improves NV's relatively anemic double precision performance compared to single precision, but think I have seen that it's still 1:32 (Double to single). Later generations of AMD cards also decreased DP FP performance relative to SP. Is this still going to be in the 1:24 - 1:32 range, or may we see something better?
What tasks are you thinking of which need FP64 but don't need reliable ECC RAM? NV has clearly decided that it's not worth offering partial FP64 support (like the Titan) in consumer grade cards. It's ironic that a Quadro 4000 actually has a higher FP64 rating than the latest M6000.
I spend some time running Milkyway@home (BOINC). With that I have seen the migration from very strong DP performance (at the time) on AMD cards a few years ago toward weaker "ratios", whereas NV has remained relatively consistent at 1:32. Just curious if that was still the case. MW@H requires a double precision capable GPU.
And with Polaris, AMD has narrowed the gap with NVidia from two years to one year, and with Zen the gap with Intel from 4 years to 3 years. That is progress and you cannot knock it, but exactly how NVidia is going to react to this is already obvious. The 1060 is going to be out around the middle of July with a TDP of 130 watts and a little better performance than the 480, priced at $210. By the time Vega is out the door, the 1080 Ti will already be comfortably the fastest and highest-performing GPU for the next year and a half. I was hoping against hope that AMD was going to actually announce Vega, or at least give the specs for it, but then again AMD lived up to its lately acquired reputation and just stayed behind both NVidia and Intel. Still, that is progress. Congratulations, AMD. And congratulations to Intel and NVidia too. They won't have to worry about a visit from the antitrust commissioners for now.
Sorry, but you're a bit off here. Sure, AMD has been a bit behind Nvidia for the last generation - they lost big on the failed 20nm process, while Nvidia (with their far larger resources) were able to pivot better. Then again, they're already close to closing the efficiency gap - within one generation! - while still competing on both price and performance. Also, I believe Nvidia got very, very lucky with the way Pascal clocks on 16nm, giving them better performance than expected; otherwise, they wouldn't be making a second-tier (soon to be third, when the 1080 Ti rolls out) GPU that beats their previous flagship. We don't know anything about the clocks of GCN 4 yet, but they're probably not that high. Still, AMD can gain a lot of performance just from driver improvements, which they have been delivering steadily for the last year. And value like the RX 480 is exactly where they need to deliver to stay relevant. Sure, flagships are image-building, but they don't keep your company afloat.
Also, there is no way the GTX 1060 is at 130W - the 1070 is 150W with 1920 cores at >1.7GHz. If clocks are equal, that would give the GTX 1060 ~1650 cores at 130W, which would be neither smart (it would cannibalize 1070 sales like crazy) nor possible (the GTX 1080 has 4 GPCs (2560 CUDA cores) and the 1070 has 3 (1920 cores), which would then put the 1060 at... 2.5?). That makes no sense when the 1060 is supposed to be based on a separate, smaller chip. Thus, 1280 cores (2 GPCs) is far more likely, as Nvidia designing a whole new GPC for "small Pascal" would be ludicrously expensive and economically inefficient (not to mention the nightmare of yet another level of driver optimization). And 1280 cores wouldn't need 130W unless they ran at 2+GHz - which would be dumb in terms of cooling, stability, and warranty/RMA costs.
The GTX 1060 will in all likelihood be around ~100W, and will probably perform similar to, if not slightly below, the RX 480. It will sell like hotcakes (it's going to be a midrange Nvidia GPU, after all), but AMD might have a winner here in terms of raw performance and price. We'll see.
I highly doubt NVidia will sell the 1060 for less than $279 - which of course will be non-existent until the $350 "Founders" edition is sold out... My point: don't expect anything "gaming" usable from NV for less than $300 this round.
What happens when one company brings a product that undercuts another company's profit margins ? The usual rant. And it usually comes from folks pretending to voice their own independent and unbiased opinion. In reality, and leaving the fanboys aside, these folks have a financial stake of some sort. And they don't like what they see.
You people are idiots. NVIDIOT would launch a so-called state of the art Geforce GTX 2590 probably 3 or 4 months from now and price it at $35,575 USD and you losers would run and buy it anyway
I've dumped tens of thousands of dollars into hi-fi over the years, and am used to paying for the best.
But... I am pretty much brand agnostic. If Krell make the best amp, and Stereophile is able to explain & SHOW me why, then that's it - I'll take two servings of Krell, thanks. If Halcro happen to leap-frog them in any tangible manner, and I'm feeling flush at that time, then Halcro it is.
The point is - fanboyism is UGLY, childish, and quite disturbing, because it comes mostly from adults displaying highly introverted personalities, who I think have no real life.
I mean, who gives a sh1t about $100 when spending HUNDREDS of them on a GPU? Not me, if its for good reason.
Anyway, I'm done, flame away. I've got six Sapporo to sit back and read it with...
Why do I not see any comments on the GTX 1070 also having a 150W TDP, and whether it means anything for AMD on the high-end side of things? Will they be able to make something to compete against a 1080 Ti?
Eden-K121D - Tuesday, May 31, 2016 - link
LOL, we already have articles from Anandtech before the event has even started.
ImSpartacus - Tuesday, May 31, 2016 - link
There were rumors of this anyway. It's not entirely unexpected.
CaptainDoug - Wednesday, June 1, 2016 - link
So much peasantry.
scotto330 - Sunday, June 5, 2016 - link
So much douchery.
fanofanand - Wednesday, June 1, 2016 - link
"I kind of lost interest in Nvidia""I have to stay Nvidia"
You are definitely a conflicted individual.
looncraz - Thursday, June 2, 2016 - link
Yep, Totally! (see what I did there?)
monstercameron - Wednesday, June 1, 2016 - link
cognitive dissonance.cocochanel - Monday, June 6, 2016 - link
cognitive ignorance.Narg - Wednesday, June 1, 2016 - link
G-Sync is a poor reason. Just lock your frame rates. Moving frame rates are just as bad a sync issues. Smooth game play comes from a steady frame rate. And since the human eyes can detect more than 30 FPS, I'm still at odds as to why people bother. More is not always better, or useful at all.blppt - Wednesday, June 1, 2016 - link
Because tearing can still happen at lower than fixed framerates too. If it were as simple as locking my games to 60fps without vsync, I wouldnt even consider buying a gsync/freesync monitor. And tearing bugs the hell out of me---what's the point of getting a card able to play the latest high-end graphics when those wonderful graphics are tearing?HideOut - Wednesday, June 1, 2016 - link
again, why do we have to log in now for every post...But human eyes can see 50-60fps if you have normal vision. Some people with extremely sensitive yes can get closer to 70. 30 is laggy as hell in action sequences.
Morawka - Wednesday, June 1, 2016 - link
nonsense. human eyes can recognize the fluidity difference all the way up to 175 FPS. Just pull out your iphone and put it in video mode and select 120 or 240 FPS capture and notice how much fluid the movements are.willis936 - Sunday, June 5, 2016 - link
Well phone displays are only running at 60 fps so I'm not quite sure what looking at it would tell you.bcronce - Thursday, June 2, 2016 - link
Humans don't see in "FPS", but we tend to normalize to FPS which has some issues since we perceive different different aspects of FPS.Human vision has a minimum of around 24 FPS to see "motion" before we start to see a slide show of rapidly succeeding images. Humans process images at a rate of about 30 updates per second. It can take up to 100ms to integrate an image into consciousness. While 10ms is considered instant for delay purposes, the brain can detect timing anomalies close to 1ms. We can visually recognize unexpected differences in images as fast as 300fps.
The best way to describe human vision in a nutshell is we process about 30fps, but we continuously integrate and can detect visual anomalies at a rate of around 300fps.
I have myself been able to reliably tell the difference between 70fps and 85fps in my hayday of Counter-Strike, and I was able to eventually notice after several seconds if someone was using a 120hz screen because of the higher real FPS. Fast paced motions were noticeably smoother and easier to predict instead of the strobe-effect slide-show of 85hz. In both situations the rendering FPS was about 150-200, but the refresh was either 85hz or 120hz, and it was noticeable.
HollyDOL - Monday, June 6, 2016 - link
I suggest reading some scientific facts about this human-eye fps issue. For one I recommend Michael Duggan's "The Official Guide to 3D Game Studio", the most interesting part here: https://books.google.cz/books?id=weMLAAAAQBAJ&...
G0053 - Thursday, June 2, 2016 - link
I am by no means a competitive gamer, but I can perceive the drop from 120 to 100 fps on my 144hz monitor in games as simple as Heroes of the Storm. Even my wife notices a reduction in smoothness of gameplay in Borderlands when dropping from 72 to sub-60fps.
scewb - Thursday, June 2, 2016 - link
The human eye can physiologically detect up to 1000 frames per second. 60 is acceptable in slow games but I want 150+ in competitive FPS.
vango - Thursday, June 2, 2016 - link
You're stupid.. your eyes can see past 30 frames easy.. I can sure tell by any weak game on a console in comparison. I also have issues with some games dipping below 60 down to 30 when I play, and I notice the slowdown, so yeah, it is a big deal having plenty of frames when the action picks up in a game, because they drop.. So you really don't know what you're talking about. Your eyes don't work like that, MR Doc Narg.. By the way, it's easy to see the difference in 4k video, but guess what, the average is 72 frames during video. Just stick to your console where you belong..
vango - Thursday, June 2, 2016 - link
And locking your frames is the worst thing you can do.
piiman - Saturday, June 4, 2016 - link
Really? Why? Use some facts this time.
Hxx - Thursday, June 2, 2016 - link
How do you lock your framerates if, let's say, your video card can't sustain the framerate you're supposedly locking? That's the whole point of gsync: to give you smooth gameplay not only at high fps but also at LOW fps. So unless you plan on running a very high-end video setup and thus locking that 60, 75, or 100 refresh rate in absolutely all games based on your display, g-sync will sound like an attractive option. Besides, vsync introduces significant lag, which may matter to you if you play twitch shooters, so oftentimes even with a high-end setup you will have to choose between tearing or increased lag. TBH I think nowadays if you purchase a nice gaming monitor that comes with a high refresh rate, it will most likely have either freesync or gsync.
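To make the "just lock your frame rates" option concrete: a frame cap without vsync is essentially a sleep-until-deadline loop. A minimal sketch (render_frame() is a stand-in for real work), which also shows why a cap can't help once the GPU misses its budget:

    #include <chrono>
    #include <thread>

    // Stand-in for the game's real per-frame work.
    static void render_frame() {
        std::this_thread::sleep_for(std::chrono::milliseconds(5));
    }

    int main() {
        using clk = std::chrono::steady_clock;
        const auto budget = std::chrono::microseconds(16667); // ~60 fps cap
        auto deadline = clk::now() + budget;
        for (int frame = 0; frame < 600; ++frame) {
            render_frame();
            // Fast frame: sleep off the remainder so pacing stays even.
            // Slow frame: the deadline has already passed, sleep_until returns
            // immediately -- the cap cannot rescue a GPU that missed its
            // budget, which is the case adaptive sync (G-Sync/FreeSync) handles.
            std::this_thread::sleep_until(deadline);
            deadline += budget;
        }
        return 0;
    }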
SeanJ76 - Sunday, June 5, 2016 - link
Human eye can see more than 150+ fps........ IDIOT!
MobiusPizza - Sunday, June 5, 2016 - link
It's the variance (standard deviation) of the frame time that is the problem. A consistent 24fps with zero deviation, as in a movie, is not actually a problem for the eyes; our brain is very good at interpolation. Except there will be more input latency in a computer game at low fps. That's why going to the cinema is fine.
barleyguy - Monday, June 6, 2016 - link
The other reason going to the cinema is fine is because movie directors plan their shots around 24 fps. Certain types of shots, such as horizontal pans, look like crap at 24 fps. But directors know that, and avoid it.
Sports, on the other hand, are generally shot and broadcast at 60 fps, because they wouldn't look good at 24 fps. The shots aren't planned ahead of time, or edited before being seen.
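MobiusPizza's variance point can be made concrete: frame pacing is usually judged by the spread of frame times, not the average. A minimal sketch, with made-up sample values (same ~60fps average, very different feel):

    #include <cmath>
    #include <cstdio>
    #include <vector>

    // Standard deviation of a set of frame times (ms).
    static double stddev_ms(const std::vector<double>& t) {
        double mean = 0.0, var = 0.0;
        for (double x : t) mean += x;
        mean /= t.size();
        for (double x : t) var += (x - mean) * (x - mean);
        return std::sqrt(var / t.size());
    }

    int main() {
        // Hypothetical traces: both average ~16.7 ms (~60 fps),
        // but the second one stutters badly.
        std::vector<double> smooth{16.7, 16.6, 16.8, 16.7, 16.7};
        std::vector<double> jittery{10.0, 25.0, 12.0, 24.0, 12.5};
        std::printf("smooth: %.2f ms sigma, jittery: %.2f ms sigma\n",
                    stddev_ms(smooth), stddev_ms(jittery));
        return 0;
    }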
Narg - Wednesday, June 1, 2016 - link
...besides, AMD has freesync, which does the same thing as G-Sync.
medi03 - Wednesday, June 1, 2016 - link
It doesn't come at a hefty premium.
So it should be worse, right?
Right?
D. Lister - Saturday, June 4, 2016 - link
Pffft nah, it is used by AMD, so it can only be better, or at least as good as the alternatives, right? Right?
piiman - Saturday, June 4, 2016 - link
If it's only as good but free, compared to costing money, then it's better in my book.
Totally - Tuesday, June 7, 2016 - link
Oh look, that makes it a non-issue about not being able to use a feature by using an nvidia card, since you didn't have to pay extra for it. :le gasp:
piiman - Saturday, June 4, 2016 - link
Yeah, that's the ticket, it's free so it must suck, yeahhhh :) Yeah, sucks for Nvidia.
crankstewie - Wednesday, June 1, 2016 - link
Not like everyone who plays on console is inferior, and it's not hard at all to go over and change settings. (Hate Hate Hate for your comment)
theduckofdeath - Friday, June 3, 2016 - link
The mid is not an end. Literally. :D
(sorry, couldn't resist)
sonicmerlin - Friday, June 3, 2016 - link
The GTX 970 was the best selling individual discrete GPU of the last generation. AMD made yet another huge blunder by starting with the 480.
fanofanand - Monday, June 6, 2016 - link
Citation please. You are full of BS, the 970 was not the best selling individual discrete GPU of the last generation. Where in the world did you pull that nugget from? (I have an idea, and it's a dark smelly place)
at80eighty - Wednesday, June 1, 2016 - link
Sucks for AnandTech. People bitch if they take their time to give thorough info on products; then there's the opposite end of the spectrum, bitching like yours, where it's a problem if they talk about stuff early on like everyone else.
xype - Thursday, June 2, 2016 - link
AnandTech: going downhill since the second article they published.
Cellar Door - Thursday, June 2, 2016 - link
You are an idiot.
piiman - Saturday, June 4, 2016 - link
And yet you have been reading their site for all that time? If they suck, why? WHY WHY WHY?
Should have been called the "RX 460" to leave room for faster products; 460 is a bigger number than 380 anyway. AMD with 14nm up against Nvidia's 16nm should have given AMD an advantage. The 14nm process being used is obviously not fully optimised and not as mature as the 16nm used elsewhere. AMD need to increase the clocks by 30% to 60% and release the enthusiast model with at least 80% more GPU resources (geometry, texture, shader, ROP, memory) and a 30% higher clock rate. More than double the previous performance in the same form factor. The old pattern of improvement applied to the 28nm-to-14nm shift would give an ideal of 4x improvement (wishful thinking of the good old days).
tygrus - Wednesday, June 1, 2016 - link
I know they have more to come and this isn't the fastest model they will release in the family. Whatever they are planning for the next 6 to 12 months needs to be brought forward as quickly as possible. Having a better lead could be worth an extra $200M revenue a quarter (and 40% of that as profit). If you think leading the market costs a lot, being the follower (delaying and limiting R&D) leads to less revenue and considerably more losses.
What are you talking about? Just from the raw numbers this seems to be at least 40% faster than the R9 380, and both cost $200. If the architecture improvements are even semi-decent we could be seeing close to 50% faster speeds than the R9 380 at lower wattage and the same price point.
Considering Nvidia only has the 1070, which goes for at least $400, while the RX 480 would come close to R9 390X performance for half the price, that is very good.
I mean, let's take Far Cry Primal, a new game running on an engine equally optimized for both Nvidia and AMD, with historically very consistent results for both companies: the 390X comes in at about 50fps at 1440p, the 1070 at about 65fps.
So basically, for HALF the price of the 1070 you are getting roughly 77% of its performance.
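Checking that value math with the figures quoted above (which are themselves rough estimates):

    #include <cstdio>

    int main() {
        // Figures as quoted above: 390X-class RX 480 ~50 fps for $200,
        // GTX 1070 ~65 fps for $400 (Far Cry Primal, 1440p).
        const double fps_480 = 50.0,  price_480 = 200.0;
        const double fps_1070 = 65.0, price_1070 = 400.0;
        std::printf("relative performance: %.0f%%\n",
                    100.0 * fps_480 / fps_1070);          // ~77%
        std::printf("fps per dollar: %.3f vs %.3f\n",
                    fps_480 / price_480, fps_1070 / price_1070); // 0.250 vs 0.163
        return 0;
    }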
tipoo - Tuesday, May 31, 2016 - link
"(A Positive Integer)"I would hope! Hah
CaedenV - Tuesday, May 31, 2016 - link
I laughed so hard when I saw that!
Demiurge - Tuesday, May 31, 2016 - link
Seconding the laugh on that one, and adding a laugh for the "(Many)" as well!
mr_tawan - Wednesday, June 1, 2016 - link
Hopefully it's not overflowed!
edzieba - Wednesday, June 1, 2016 - link
"We have advanced our binning ability to selectively disable HALF a ROP. Suck it, Anandtech!"
Ryan Smith - Wednesday, June 1, 2016 - link
Haha. If they could actually do that, I'd probably publish that letter too...
fanofanand - Wednesday, June 1, 2016 - link
That did give me a good laugh, well played Ryan!
bananaforscale - Sunday, June 5, 2016 - link
Anything else would be irrational! (Not really, but just play along. :D)
Samus - Tuesday, May 31, 2016 - link
Yes!!
Ikefu - Tuesday, May 31, 2016 - link
Really loving that performance-per-dollar ratio. As an occasional gamer, the price speaks to me.
Kutark - Wednesday, June 1, 2016 - link
More importantly, it's doing it without being able to power a Los Angeles-class nuclear submarine.
prtskg - Wednesday, June 1, 2016 - link
Lol!
This perf per dollar is great for an occasional gamer like me.
Eden-K121D - Tuesday, May 31, 2016 - link
Hopefully I'll buy an RX 480 for 1080p gaming.
aakash_sin - Wednesday, June 1, 2016 - link
Same!
hans_ober - Tuesday, May 31, 2016 - link
If this can get to 70% of a GTX 1070, it's a win.
psychoticdream - Wednesday, June 1, 2016 - link
2 480s for $500 (at under 60% use) beating a $699 1080?
It's a HUGE win.
HollyDOL - Wednesday, June 1, 2016 - link
Yep, if your power bills are really low... 2x150W vs 1x180W... in some regions those $199 can return quite fast depending on electricity cost.
RandSec - Wednesday, June 1, 2016 - link
Does that really make sense? The difference is 120W worst case. But how long is the computer going to be on? And, of that time, how much will be heavy gaming? And, if the card is running at 60% capability, will it really take 150W? Step by step, the difference actually means less and less.
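RandSec's step-by-step doubt is easy to quantify; a small sketch, where the hours and tariff are assumptions for illustration (only the 120W delta comes from the thread):

    #include <cstdio>

    int main() {
        double delta_watts = 120.0;          // worst-case gap: 2x150W vs 1x180W
        double gaming_hours_per_week = 20.0; // assumption
        double eur_per_kwh = 0.30;           // assumption; varies a lot by region
        double kwh_per_year = delta_watts / 1000.0 * gaming_hours_per_week * 52;
        double cost_per_year = kwh_per_year * eur_per_kwh;
        std::printf("~%.0f kWh/year, ~EUR %.0f/year\n", kwh_per_year, cost_per_year);
        // ~125 kWh and ~EUR 37 a year under these assumptions: real money,
        // but it "returns quite fast" only at much higher tariffs or usage.
        return 0;
    }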
Kvaern1 - Wednesday, June 1, 2016 - link
15% of my power bill is the price of the power itself. The rest is taxes and shit.
So yes.
FMinus - Wednesday, June 1, 2016 - link
So much fucking this. My power bill is ~120EUR/month; when I look at the breakdown of that, the actual power is 45EUR, the rest is just fucking taxes. It's just so goddamn wrong.
fanofanand - Wednesday, June 1, 2016 - link
You can thank a progressive for that.
Murloc - Thursday, June 2, 2016 - link
How much of that power comes from coal?
Most power "generation" added in the future will come in the form of energy conservation.
It has to be incentivized in some way, otherwise people will keep wasting power.
Zoomer - Thursday, June 2, 2016 - link
Can't go too far. Someone's going to make a backyard coal generator that will make your diesel generator look clean.
fanofanand - Thursday, June 2, 2016 - link
Very reasonable comment Zoomer, I'm sure backyard power plants are the wave of the future /s
erple2 - Friday, June 3, 2016 - link
I wish my power bill was that cheap. And I live in the USA, where we get cheap power. I'd love to pay just 200€ per month for power.
Kvaern1 - Saturday, June 4, 2016 - link
If I used that much power I'd have to pay about €700 for it, according to this:
https://www.ovoenergy.com/guides/energy-guides/ave...
Murloc; Last year 41% came from coal
Meteor2 - Saturday, June 4, 2016 - link
Then act to cut your usage (€30 pm here).
D. Lister - Wednesday, June 1, 2016 - link
More importantly, it is always better to go with a single powerful GPU vs a multi-GPU setup.
atlantico - Wednesday, June 1, 2016 - link
This is true, but a multi-GPU setup looks awesome :D
valinor89 - Wednesday, June 1, 2016 - link
And it "might" actually be a good idea, for the first time ever, for VR.
Meteor2 - Saturday, June 4, 2016 - link
.. until now?
FMinus - Wednesday, June 1, 2016 - link
150W is the limit on that reference board; I'd wager the whole card runs at like 120-130W at worst under load. The GTX 1080 pushes past 180W quite often.
slickr - Thursday, June 2, 2016 - link
Really? The 980 has a 160W rating and consumes about 200W at peak gaming. I wouldn't trust Nvidia's power ratings at all.
Wreckage - Tuesday, May 31, 2016 - link
I predicted that AMD would abandon the high end and just be a budget brand.
wperry - Tuesday, May 31, 2016 - link
AMD has been very open about the roll-out of the 14nm parts, which market segments they would address, and when they would address them. That they would hit the meaty mid-range part of the market first (not a bad move, IMO) has been known for a while.
Meteor2 - Wednesday, June 1, 2016 - link
You say budget, I say mass-market. This is where the money is.
Michael Bay - Wednesday, June 1, 2016 - link
The PC mass market is commoditized to the point of a race to the bottom for basically everybody.
Good luck finding money there.
hero4hire - Wednesday, June 1, 2016 - link
Comments like this are infuriating to me. "Good luck finding money there"
So. Fking. What.
If AMD puts out a 480 (or 490) at a loss how do I, the consumer, lose? Are we all financial investors playing the market?
If some unknown 3rd party put out a card at $25 with open source drivers that delivered 5 tflops and bare minimum gaming support/features, I'd buy. When that company blows through $10B of investor money killing countless hedge funds through an unsustainable market presence; will I cry about their poor ROI?
I'm here as a consumer. Give me it all for free if you want. Lots of high margin items out there for you to buy and help those corporate bottom lines. Better hurry before there's a price drop!
CaptainDoug - Wednesday, June 1, 2016 - link
You don't have to be an investor to know AMD needs to regain some market share. Nvidia needs a competitor. Same with the AMD vs Intel battle. If AMD goes down, you'll be paying more for CPUs and GPUs.
D. Lister - Wednesday, June 1, 2016 - link
Wrong. Intel and Nvidia (or any company that sells non-perishable/non-consumable goods) can only raise the price to a point where it is still viable for someone who already has an older model. Otherwise, they wouldn't make any new sales.
If AMD was dead now, you think the GTX 1080 would have had an MSRP of $2,000? How would they convince someone who already has a 980Ti, or even a 780Ti for that matter, to part with two grand?
When AMD goes down, nothing would change in the world, except perhaps eventually normal people would not have to put up with AMD's fanbase anymore.
TheinsanegamerN - Wednesday, June 1, 2016 - link
Not release new drivers for older cards? That sounds like a very easy way to force people to upgrade.
Also, it's been proven time and time again that a monopoly leads to higher prices. Every. Single. Time. The PC market is no different.
D. Lister - Wednesday, June 1, 2016 - link
@TheinsanegamerN
A consumer GPU is an entertainment product, or rather a mere component of one, which not only has to compete with other brands in the same line of products, but also with the value provided by consoles/handheld/mobile, and even other forms of entertainment, like TV and movies.
JKay6969AT - Thursday, June 2, 2016 - link
@D. Lister You don't really get how competition works, do you? Monopoly = Bad, simple as that. In fact intel don't want AMD to fail, simply to be noncompetitive. If AMD dies then intel faces being broken into smaller companies to remove their monopoly. As long as AMD exists intel can claim it's a free market and there is no monopoly, even when their high-end product prices reveal otherwise.
I am not attacking you, just pointing out the truth; this is not intel bashing or nVidia bashing or AMD bashing. It is simply the fact that where markets are competitive, prices fall and innovation increases, but where there is no or little competition the opposite is true.
I currently have an intel Core i7 4790K and a Radeon R9 290X. My last CPU was a Core i7 920 and my last GPU was an nVidia GTX 580.
I don't care which company makes my hardware, I simply care to get the best product for the best price that is suitable for purpose.
I will consider buying AMD Zen if it has benefits over intel, even if it's slower than the competing intel CPU, as long as the price reflects this and the CPU is up to the task I require it for. The same goes for my next GPU; nVidia or AMD doesn't matter, price and performance relative to my needs is important, not brand loyalty.
D. Lister - Thursday, June 2, 2016 - link
"You don't really get how competition works do you? Monopoly = Bad simple as that. "And you obviously don't understand the word "monopoly" beyond being some sort of a scary monetary bogeyman. The state has monopoly on law enforcement and military, monopolies that are necessary and practical. Then there are natural monopolies that are actually beneficial. The point is not a direct comparison, but to make you realize the world isn't quite as much in black and white as you may like to think.
Secondly, in the grander scheme of things, a duopoly isn't really an immense improvement over a monopoly anyway.
"If AMD dies then intel faces being broken into smaller companies to remove their monopoly."
For heaven's sake, there is a big difference between having monopoly status, and actually being charged for abuse of monopoly. Intel could only be broken down at the demise of AMD, if it was proven in court that Intel was somehow directly responsible for AMD's bankruptcy.
READ, CAREFULLY... >>> http://www.economicsonline.co.uk/Market_failures/M...
Brek - Tuesday, June 7, 2016 - link
Step away from the foil hat, lister, you nvid fanboy.
atlantico - Wednesday, June 1, 2016 - link
When Nvidia goes down, we'll be rid of the Nvidiots. Good times ahead.
D. Lister - Wednesday, June 1, 2016 - link
@atlantico
:)
Nvidia's current market share and stock position suggest you may have to wait a bit for those good times. Don't lose hope though, it could very well happen in your lifetime.
tamalero - Wednesday, June 1, 2016 - link
There is a huge difference between a "nvidiot" (aka someone who defends the company even if they fist them so hard on the butt) and "educated gamers". Educated gamers have no brand and switch to whatever brand gives better perf for the money with fewer problems.
The ball is in Nvidia's court right now.
Enigmatica - Thursday, June 2, 2016 - link
Especially in the higher-end market. I have a 144Hz 1440p IPS G-Sync monitor (best thing I ever did to improve my gaming experience) and the only card I can see getting close to pushing my monitor on demanding games is the 1080, or even the 1080 Ti if I am patient enough.
HammerStrike - Wednesday, June 1, 2016 - link
Since competition has no bearing on price, I'd like to offer you my services at no charge. Simply agree to route all your online purchases through me and I guarantee I'll get you the absolute best price while dealing with all the order entry, tracking, etc. Win-win.
D. Lister - Wednesday, June 1, 2016 - link
@HammerStrike
lol, it seems your analogy got out of your hands there. Mainly because it was based on a false premise. No worries, keep trying. *thumbs up*
JKay6969AT - Thursday, June 2, 2016 - link
If there was no AMD we wouldn't be talking about the 1080 or 1070, as they wouldn't likely exist. Market stagnation and rising prices WOULD BE and ARE a result of a monopoly.
Just look at the prices of intel's high-end offerings: they have been steadily rising and the benefits seen between generations are incremental at best. They are milking the market with high prices and less innovation. Why not, when AMD doesn't currently have anything on the market to compete beyond the Core i5? This is not a knock on AMD, simply a fact of the market.
With Zen and Vega, AMD look set to return to the high end in both CPU and GPU. This is a good thing for all. No one is saying you have to buy either, but intel and nVidia will be forced to up their game by dropping prices and increasing innovation. It's a win-win for all.
D. Lister - Thursday, June 2, 2016 - link
"If there was no AMD we wouldn't be talking about the 1080 or 1070 as they wouldn't likely exist."Nvidia (or Intel, for that matter) doesn't make their products just to be in some sort of a technological pissing contest with AMD. Companies cater to the needs of the market. If there is demand for more power, then there is money in making more powerful hardware. Sure, without a race, the technology may grow slower, but that means software developers would have more time to optimize for every arch (a la consoles), which means that at the end of the day, we as consumers, would see the same growth in graphic quality, but only with lesser bugs.
"Just look at the prices of intel's high end offerings, they have been steadily rising and the benefits seen between generations are incremental at best."
Read through the following links:
http://smallbusiness.chron.com/inflation-effect-pu...
http://phys.org/news/2015-08-silicon-limits-power-...
fanofanand - Thursday, June 2, 2016 - link
You are SO right, it's not like Intel, with its dominant monopolistic position on the high end, just increased the price of its top CPU by 80% or anything. Oh wait.....
Chaser - Wednesday, June 1, 2016 - link
The GTX 1070 is $379.00 and offers Titan X/980Ti performance. So in this case please tell us all which AMD product forced Nvidia to offer a less expensive, more powerful, more efficient product this round?
pashhtk27 - Wednesday, June 1, 2016 - link
Well, last round AMD forced Nvidia to release the GTX 980 Ti. You were not going to get a GTX 980 Ti that performs close to the Titan without the Fury X coming out. I reckon it'll repeat next year with the 1080 Ti.
They had no competition this round, so they increased the prices of the GTX 1080 and GTX 1070 by 'a small amount'. Monopoly doesn't come in a day, nor does it mean that prices rise suddenly like inflation. Nvidia can offer a lot more with competition. Now they will be forced to unveil a good GTX 1060, which might not have come had there been no RX 480, in my opinion.
Talk about Intel... just check the prices of the i7-6950X.
Competition in the mass-market segment is very important. Look at the difference between the GTX 950, GTX 960 and GTX 970. It made everyone recommend the latter over the former two cards for the mid-level segment. I remember a few years ago the most common recommendations were the GTX 560 Ti and GTX 660 Ti for the same. This is how you play market games.
RavenSe - Wednesday, June 1, 2016 - link
All of the 900 series of GTX? Once new tech is out, almost nobody wants the obsolete one. There are tons of those chips on the market. Other than that, I don't expect NVIDIA to sell the 1080 and 1070 at volume; it's early yields, meaning they will still have to sell Maxwell until the switch to the new node is fully complete. In the meantime the market will start to be flooded with small-die-size AMD cards that will certainly come in higher volume.
slickr - Thursday, June 2, 2016 - link
That is the MSRP of the card, but since it will only be a paper launch and Nvidia is first selling "Founders Edition" cards at over $450 (in fact market price is close to $480), I wouldn't be surprised if market prices for the 1070 are at least $420-430 for 3-4 months before it goes down to $400, and I'd expect some competition from AMD to get the price to a more reasonable and sensible level such as $330-$350.
JKay6969AT - Thursday, June 2, 2016 - link
In the UK the price of the cheapest 1080 is £525, and that has a really horrid cheap plastic shroud over the heatsink, and the blurb makes no mention of a vapor chamber cooling system, which comes on the 'Founders Edition' for £619. The decent third-party cooler 1080 cards range in price from £580 to £690, and the hybrid and water-cooled versions go past £700.
This is an expensive card, a really good card, but an expensive one. When AMD releases their competing product these prices will fall.
Again this is not against AMD or nVidia, just stating the facts.
pashhtk27 - Friday, June 3, 2016 - link
True enough ;)
JKay6969AT - Thursday, June 2, 2016 - link
If tomorrow AMD launched the RXXX Wizzbang Wallop x2 that cost $500 and matched the performance of the GTX 1080, then nVidia would be forced to... oh that's right, COMPETE!
You would soon see the price of the 1080 drop, and maybe even the introduction of the 1080 Ti to fill the highest-end void to appease the fanboys who need their brand to be the best; even if the top-end cards are way out of their league, they can aspire to have them. This creates brand loyalty at an almost cultish level, where the hardcore fans will argue against fact, logic and truth to defend their brand.
This is not a bash on nVidia; I'm quite sure that if AMD were in nVidia's position they would act in the same fashion. It is how business works, like it or not.
Meteor2 - Saturday, June 4, 2016 - link
Vega.
piiman - Saturday, June 4, 2016 - link
"So in this case please tell us all which AMD product forced Nvidia to offer a less expensive, more powerful, more efficient product this round?"
You act like the round is over. AMD just came out of the corner; give them time to start throwing some punches before declaring them the loser. :)
poohbear - Wednesday, June 1, 2016 - link
Are u an investor? Most ppl that frequent this site are consumers, so they're very happy.
slickr - Thursday, June 2, 2016 - link
LOL! Let me guess: you live in the USA or Canada, have never been outside your state, and you are going to tell us how people are in the world?
Have you been to Russia, India, Slovenia, Morocco, Brazil, etc...? There is a major market out there for $200-and-cheaper graphics cards. If AMD can fill the price points from $100 to $200 with graphics solutions that make sense and offer great value, you'll see them selling tens of millions of cards and making big profits!
Jumangi - Wednesday, June 1, 2016 - link
$200 isn't "budget". It's the mainstream, where the vast majority of cards are sold. The 1080/1070 are to show off, not for large volume and revenue.
jabber - Wednesday, June 1, 2016 - link
Yeah, I doubt Nvidia are expecting to pay all the bills from sales of the 1080. Mercedes doesn't make its profits from the top S-Class, which has all the latest tech. The money comes from being able to push all that expensive and tested R&D out to the masses.
poohbear - Wednesday, June 1, 2016 - link
It's $199 for GTX 970 performance. How's this budget?
TheinsanegamerN - Wednesday, June 1, 2016 - link
$200 has been budget level for gaming cards for a while.
Midwayman - Wednesday, June 1, 2016 - link
Mainstream. Budget is generally when you get down to the 950-level cards.
pashhtk27 - Wednesday, June 1, 2016 - link
^This. Budget is 950 and below. I have an R7 260X, and I know it is a budget card. ;)
JKay6969AT - Thursday, June 2, 2016 - link
@pashhtk27 Exactly, I think people here seem to overestimate what budget gaming is.
£25-£100 = budget gaming
£101-£250 = Mainstream
£250-£400 = High End
£400+ = Enthusiast
This is roughly how I define the levels. I consider myself High End that aspires to the Enthusiast level :-) This means that I always buy the best card for about £300-£400 but would be willing to spend more if the performance levels justified it. When I bought my R9 290X 4GB the only other card that I could consider was the 780 ti 3GB as that was the best nVidia had at that time but that was £100+ more expensive and had 1GB less framebuffer so I went with the R9 290X 4GB.
At $200 the RX 480 should perform similarly to the R9 290X and at that price I wouldn't hesitate to recommend it to a mainstream 1080P+ gamer. This may change when nVidia inevitably responds with their 1060 but that's a good thing as there's more competition and more choice.
FMinus - Wednesday, June 1, 2016 - link
GTX 980* performance.
piiman - Saturday, June 4, 2016 - link
"I predicted that AMD would abandon the high end and just be a budget brand."
Are you claiming victory? Because someone should tell AMD to cancel Vega SOON!
It seems to me AMD is about ready to make Nvidia slash the prices of their entire line of cards, from top to bottom.
BenSkywalker - Tuesday, May 31, 2016 - link
I get that we don't have exact figures, but if these work out to be correct we are looking at the most memory-starved GPU ever made.
Just over 1/5th the memory bandwidth of the GeForce DDR that came out in 1999.
Obviously that isn't what they meant to say, but screwing up Hz and bitrate.... really?
Ryan Smith - Tuesday, May 31, 2016 - link
The standard is to measure memory by bandwidth per pin. Measuring by Hz stopped being sensible when GDDR5 came out, and GDDR5X makes this worse (there are several different frequencies you could measure).
BenSkywalker - Wednesday, June 1, 2016 - link
Actually this site stopped using actual Hz ratings in the 90s - DDR made it problematic, and it has simply been compounded since.
Throwing out 256GB/sec for the actual bandwidth, compared to say the 298GB/sec the 290 had or the 357GB/sec the 390 was offering, would likely be a bit more friendly for people looking for quick information and not an AMD sales brochure :)
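For anyone wanting the quick math behind those totals: aggregate bandwidth is just bus width times effective per-pin rate. A minimal sketch, assuming the rumored 256-bit bus at 8Gbps GDDR5 for the RX 480:

    #include <cstdio>

    // Total bandwidth (GB/s) = bus width (bits) x effective per-pin rate (Gbps) / 8
    static double bandwidth_gbps(int bus_width_bits, double gbps_per_pin) {
        return bus_width_bits * gbps_per_pin / 8.0;
    }

    int main() {
        // RX 480 as rumored here: 256-bit bus, 8Gbps GDDR5 -> 256 GB/s.
        std::printf("RX 480: %.0f GB/s\n", bandwidth_gbps(256, 8.0));
        // GeForce DDR (1999): 128-bit bus, 300MHz effective -> 4.8 GB/s,
        // which is why quoting "Hz" instead of per-pin rate misleads.
        std::printf("GeForce DDR: %.1f GB/s\n", bandwidth_gbps(128, 0.3));
        return 0;
    }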
0razor1 - Wednesday, June 1, 2016 - link
I agree, a very valid point.
jjj - Wednesday, June 1, 2016 - link
If anything, AMD is being wasteful again with a 256-bit bus here. You can bet that Nvidia will go with less and save some on area and power. Of course AMD will do better at higher res, since some might even buy this for 4k - it would struggle in 4k, but with 4k monitors even below $300, it's an option.
SetiroN - Tuesday, May 31, 2016 - link
I can appreciate the price point, but it is really disappointing to see that AMD couldn't reach higher frequencies with such a huge jump in lithography and the switch to FinFET.
This part would be positioned much higher with the smaller Polaris (which they couldn't even mention at this point) actually introducing the VR-ready lineup. Instead we pretty much get the old 390 at a TDP that is merely acceptable. Wow.
rhysiam - Tuesday, May 31, 2016 - link
If the performance claims are accurate and we get 390-390X performance with one 6-pin connector from a ~$200 card, that's a pretty decent offering from AMD IMHO; who really cares about clockspeeds? I'd be more disappointed, actually, if AMD did have to push the clocks really hard to reach their performance target. Maybe this way we'll get aftermarket cards with an 8-pin connector and a heap of overclocking headroom. Maybe not, of course. Nvidia did say they had to put a lot of work into their designs to hit the 1700+ MHz they have on the 1080.
But whatever way you shake it, this is good news for the gaming masses who are running 1080p @ 60Hz monitors. If (and that's still an "if" at this stage) performance, price and availability are as promised, you'd have a hard time recommending anything other than this card for a decent 1080p@60 gaming rig until Nvidia release their mid-range or respond with significant price cuts.
euskalzabe - Tuesday, May 31, 2016 - link
Yup. To me it's either this RX 480 or the eventual 1060. I've bought in the x70 range for the past few generations, but since my monitor is still 1080p60, x70-level cards are not well "balanced" for this anymore. Example: I'm on a 770 that, while speedy enough, gets starved in modern games due to its 2GB VRAM (AC Unity is a great example). At this point in time, buying x60-level cards and renewing every couple of years seems like the more sensible option.
jabber - Wednesday, June 1, 2016 - link
Yeah, the only reason I want to upgrade from my HD 7870 is the 2GB limit. If the 480 gives me 4GB and a 35% boost, I'm in.
rhysiam - Wednesday, June 1, 2016 - link
35%? Based on what's been said thus far, you should be expecting a card that's starting to approach double the performance of your 7870. Even a 290 (with its awful stock cooler) is approaching those kinds of gains: http://www.anandtech.com/bench/product/1034?vs=106...
Let's wait and see, of course, but I'd expect at least 80% or more from what we've seen.
slickr - Thursday, June 2, 2016 - link
What? Assuming conservative performance numbers based on all the data we have, it will perform like an R9 390, which is over 50% faster than your GPU. For only $200 and 150W, that is amazing.
jabber - Friday, June 3, 2016 - link
I always expect to be disappointed, so I aim low.
fingerbob69 - Wednesday, June 8, 2016 - link
"Maybe this way we'll get aftermarket cards with an 8 Pin connector and a heap of overclocking headroom"
That would be the RX 490 @ $299... 29th June. SURPRISE!
PocketNerd - Tuesday, May 31, 2016 - link
Aside from the usual "don't knock it till we see the benchmarks" routine...
This thing is $130 cheaper than the 390, and even if the clocks aren't as high, the architecture is supposed to be more efficient, so I wouldn't count this out just yet. I was already considering getting the GTX 1070, but if this thing is comparable I might just get the 480.
Kutark - Wednesday, June 1, 2016 - link
It won't be comparable to the 1070. I say that not as a fanboi, but from an economics argument. They've announced the price target as $200; they simply would not sell a card that trades blows with a 1070 for half the cost. That would literally be throwing away free money.
Meteor2 - Wednesday, June 1, 2016 - link
...or, AMD would get very high sales (as this is much more affordable) while cutting off Nvidia's revenue. Though it would just result in a price war.
praxis22 - Wednesday, June 1, 2016 - link
I have 15 years of economics under my belt, so I'd say "nonsense", or if you're a Brit, "Pish and likewise Tosh" :)
From what I've read and seen, mostly deeply technical and geeky, not economic: what this is, is a land grab. In 4X terms this is a Rush. At a stroke, AMD will own the laptop segment with Polaris 11, and the console segment, since Nvidia wasn't interested. Apple, and the consumer market. They also own the graphics subsystem with Mantle/Vulkan, so they get Android and iOS, etc. Since they have the consoles, and games port to the PC from consoles, it makes more sense to code to Vulkan.
At a stroke AMD now owns PC gaming into the future. MSFT continues to screw up, since Direct X12 doesn't support dual cards, which Vulkan does out of the box. I have Vulkan drivers on windows 7 right now.
I own an R9 390; before that I had a GTX 660. I bought the 660 as it was a 200+ Euro card at the time, the sweet spot of a gaming PC. I bought the R9 390 as the game I mostly play is Skyrim, and there isn't a card available that will excel at fully modded Skyrim, so I went for brute power and lots of VRAM at the expense of power consumption (who cares about performance per watt?). It cost me something like 360 Euro and plays everything else at max: modded Witcher 3, DA:I, that sort of thing. It plays Skyrim OK, gave me about 50% more fps than the 660 with over 600+ mods, and allows me to use 5GB of VRAM even with an ENB, which takes main memory into VRAM to stop the game crashing. This is on Windows 7; MSFT screwed the pooch for DX9 in Windows 10, which limits all cards to 4GB, even the GTX 980 Ti. But I digress...
What I really want is Skyrim in VR; for that I need 4k to get stereoscopic 1080p. Given that Oculus is actually running two screens, I figure the software must drive this. The problem is there are no 4K screens, so I can probably use two 2k screens with a high PPI and drive them from separate cards. I say "probably" as until I get my hands on one or two, or I see in-depth reviews, I won't know. However, given that I have two monitors at work as a single view pane (under Linux), I figure it can't be that hard.
What this does is make Dual card setups possible for the mainstream, and AMD are making a push for VR.
tl;dr: Your economics argument is specious. AMD aren't going for the performance crown, as nobody who bought a 970 will buy AMD anyway; they'll buy the 1070 and lust after a 1080. What AMD are going for is market dominance and market share. They will sell millions of these to OEMs looking to build mass consumer products, and to the console manufacturers and Apple. They set it at $200 not to make excessive profit, but to own the market. With Intel retiring to the performance arena too (with the loss of 12K jobs recently), AMD may well end up owning the CPU market with Xen too.
This is not about performance, it's about price. This is not winner takes all, this is commodity pricing, you have the wrong economic model. :)
jjj - Wednesday, June 1, 2016 - link
Do remember that in the high end (1070) the value is lower than in the midrange.
People got excited about the 1070's value because they are shortsighted; it might be good value for high end, but the best value is never there.
AMD here is likely to be offering very solid value and making it hard for Nvidia to beat them by a large margin. If Nvidia decides to beat them, they can by a little, but they can't make AMD look bad, and AMD can slightly adjust prices if needed. Assuming this card is a bit faster than the 390, the pricing is just right.
We'll see what the final power numbers are and the die size and only then we can figure out where AMD really is.
praxis22 - Wednesday, June 1, 2016 - link
I think you're missing the point. In economic terms, people who buy the high-end cards are price-inelastic: they don't care how much it costs.
Back in the day I bought an ATI 9700 Pro; it was, for about 4-6 months, the fastest graphics card in the world. I paid a lot of money for it, and I would have paid more, because I wanted the best. I wanted to own the fastest graphics card in the world. It broke my heart when it died; a heat pad shrivelled and detached and the GPU fried itself, and even when I installed an aftermarket cooler it stayed dead. :(
Nobody but geeks and gamers actually upgrades a PC, there are far more gamers that play on a console, simply because it's easier and cheaper, than having to maintain a PC. Simply plug and play. This is why they write for the console and back port to the PC these days.
Similarly very few people actually build a PC from scratch, this is why Valve's survey is so useful. this is what the many people who use Steam are gaming on, and hence what developers should aim for to gain the largest audience.
By our very nature, talking about hardware specs on a technology web site, we are not "normal"; indeed the normals are by definition the mainstream, people who buy at Best Buy or other big-box retail stores.
This is why I say that people who consciously go out and buy a new GPU as an upgrade are the minority, and likely already have a preference/prejudice as to what they will buy, and the new 480 is aimed squarely at the sweet spot of price and performance for people who are limited in what they can spend (price-elastic), either with somebody else telling them about it, or somebody else building it. This is "normal" for PC hardware these days.
The stuff you'll find in big-box retail is always exploitative. They usually give you a good CPU, typically over-specified for gaming, which is GPU-limited, and a passable GPU that will likely need to be replaced within a year. On a recent trawl through a refitted consumer electronics big box I found that most of the PCs were offering an i7 6700 with a 970 for around 1000 Euro. That is a serious chunk of change (for the price-elastic) to drop on a PC, when you can buy a console for $400 or less.
PC Master Race on Reddit has builds from $331 to $901; all but one of them socket an AMD GPU because of price. The two cheaper boxes socket AMD Athlons, the two more expensive builds socket an i5.
There is an article on the Verge today all about Intel realising that increasingly they are not going to be 'Inside' on what remains of the new PC market.
So I say again, this is a land grab, colonising the parts of the market that the previous incumbents have left behind, and doing so at a price that appeals directly to the price-elastic consumer, and to OEMs that build for normals.
Whether people here like AMD or not, doesn't matter, this is not aimed at us.
just4U - Thursday, June 2, 2016 - link
What you seem to forget is the 9700 PRO came out after the Radeon 8000 series.. What happened there, precisely? Do you remember? They were offering GeForce 3 performance at close to half the price... The next two generations saw cards from both companies coming out in and around that benchmark price (slightly higher.. but not much).
They did the same thing with the 3800 series against those magical 8800s.. and didn't quite match the performance but came close.. By the time the 4000s came out they'd surpassed NVidia again and kept prices at that sweet spot.
Nvidia started raising prices again in the 500 series..even though Amd had comparable products (although late..) at cheaper prices.. now we see those bumps again just in time for AMD to bring them back down to reasonable levels..
Or so it looks to me.
just4U - Thursday, June 2, 2016 - link
Of course they're not doing it for us (forgot to add that lol..); it's for brand recognition and market share. AMD knows where it plays and what it can do.. but it's been Nvidia pushing the pricing envelope all the way since the GeForce 2.
praxis22 - Thursday, June 2, 2016 - link
Do I remember? No :) I built my first computer from a kit in 1981, and went through many "home" computers, including a DEC Alpha UDB (I'm a UNIX admin), before I owned a PC. The first PC I ever owned was powered by a P4 Northwood and some crappy Nvidia card, a 240 or a 440, something like that. Then I spent more on the 9700 Pro than I had on the computer :)
That said, after watching some videos last night I came to the conclusion that, IMO, while AMD's products are "competitive", they have chosen not to compete for the performance crown and are instead playing their own game. Namely occupying the low-to-middle ground, and selling millions of SKUs to OEMs, the console makers and Apple. We get to see discrete graphics cards, but we're not their audience.
mapesdhs - Friday, June 3, 2016 - link
praxis22 writes:> The stuff you'll find in big box retail is always exploitative. They usually give
> you a good CPU, typically over specified for gaming which is GPU limited,
> and give you a passable GPU, that will likely need to be replaced within a year.
(good posts btw!)
This happens for pro sales as well. I'm helping a guy who bought a system which was meant mostly for pro tasks, with some gaming too. It had a decent enough CPU (i7 4770) but a pretty naff GPU (GTX 645 1GB), and cost quite a lot. I'm building him a replacement with a 4.8GHz 3930K, a Quadro 6000 6GB and a GTX 580 3GB for extra CUDA.
Seems to be a common approach by generic builders, they overspec the CPU and then fit a lame GPU. The PC the guy bought also had a bizarre RAM config (12GB with 3 DIMMs). At least he had the sense to fit an SSD himself (I'm so tired of seeing OEM systems supplied with 1TB rust spinners).
jabber - Wednesday, June 1, 2016 - link
I was with you all the way till the Zen bit. It was like when you talk to someone and have a really good conversation, then they say "and have you ever accepted the Lord Jesus into your heart?"
praxis22 - Wednesday, June 1, 2016 - link
I run an i5 like most sensible gamers; I'm in no way an AMD fanboy. I bought my GPU for one game, so I'm de facto not normal. I just happen to think that the evident long-running AMD strategy is very clever. I'm unlikely to buy a Zen/Xen (not sure how you spell it), but it will surely sell a lot if priced well.
My wife would like me to accept Jesus, but my heart belongs to Danu :)
extide - Monday, June 6, 2016 - link
It's Zen. Xen is a hypervisor.
JKay6969AT - Thursday, June 2, 2016 - link
Sadly, with a lot of 'PC Enthusiasts' these days, brand loyalty is getting close to religious worship: arguing against facts, logic and truth, as they can't imagine a situation where their beloved brand could be wrong, or worse, imperfect.
The reality is that AMD, nVidia and intel are all corporations out to make profits. If the market allows them to price gouge, then that is what they will do; if the market allows them to rebrand old products as new ones, then that is what they will do; if the market forces them to compete, then that is what they will do; if the market forces them to innovate and release new products, then that is what they will do.
None of these companies hold any true loyalty to any of you. They just want you to invest in their products and advertise them through word of mouth, which incidentally is not a regulated form of advertising and so can be far more effective than having to tell the truth about your products; your fanboys will say whatever makes them feel they have won the argument, regardless of its basis in reality.
pashhtk27 - Friday, June 3, 2016 - link
Very true.
When I built my PC 4 years ago, I was very proud not to have two components from the same brand, except the RAM and PSU from Corsair. And with a few updates, I still have no two components or peripherals the same except those two... and, well, a gamepad and mouse from Logitech.
I love my PC knowing that I invested in the best from all brands. Even though it's old and weak and... cheap in today's tech. :)
cocochanel - Wednesday, June 1, 2016 - link
I don't understand people bitching. AMD is bringing VR to the masses. What's wrong with that? Nvidia and Intel can sit this one out if they want, who cares? PlayStation Neo, Xbox One VR and Nintendo will probably have a powerful APU with Zen cores and Polaris GPUs. Me? I'm waiting for PlayStation Neo + VR headset. All for about a grand. This stuff is as good as a 35k Tesla. Yummy!!!
tamalero - Wednesday, June 1, 2016 - link
Because each brand expects to have the "bragging rights". It's always back and forth. ATI won the crown? They blasted Nvidia. Nvidia won? The same in the opposite direction.
eddman - Thursday, June 2, 2016 - link
"What this is, is a land grab. In 4x terms this is a Rush."
There is nothing stopping nvidia from reducing the 1070's price and positioning the upcoming 1060 (and 1060 Ti?) to counter the 480, if they feel their market share is threatened.
"AMD will own, the laptop segment, with Polaris 11"
How would you know that? It depends on how GP106, 107 and 108 turn out.
"the Console segment, since Nvidia wasn't interested. Apple, and the consumer market. They also own the graphic subsystem with Mantle/Vulkan, So they get Android and iOS, etc. Since they have the consoles, and games port to the PC from consoles, it makes more sense to code to Vulkan."
Don't be so sure. There are two main gaming segments that do not really overlap: consoles/computers and mobile.
In the first segment PS, xbox and PC rule. PS comes with its own optimized APIs and xbox and PC use DX. It would take a lot to convince developers to abandon these powerful APIs.
"At a stroke AMD now owns PC gaming into the future. MSFT continues to screw up, since Direct X12 doesn't support dual cards."
No dual cards? What do you mean? DX12 supports multi-GPU setups.
"... skyrim ... dx9 ... 4gb ..."
What does this have to do with anything? Off-topic.
"What AMD are going for is market dominance, and market share"
Again, there is nothing stopping nvidia from countering all this.
praxis22 - Friday, June 3, 2016 - link
Oh good, argument! :)
The last time I bought a laptop, 4-5 years ago, I bought a hybrid netbook with Nvidia's Optimus architecture. Works well as a third-party Chromebook, runs Linux well too, but the battery life sucks.
That's one thing I have noticed about big box retail, at least in Germany, that they no longer sell Chromebooks even as a paid installation like the rest, as it kills their margins. They're not the highest selling "laptop" on Amazon for nothing. That is the future of the consumer laptop, and thus the cheaper the chips the more margin the OEM makes. They do sell them in the UK however, that and very expensive laptops. There is seemingly no middle ground anymore. That is what I mean by owning the laptop market. Owning the rump of what's left of it. Globally there are only three makers of laptops with Clevo being the biggest. Everything else is just badged.
Sure, there are plenty of indie games, which rely more on Steam for their target platform. With Vulkan I was talking about the big AAA titles, which for at least the past 5 years have been console-first; they then port the console build to the PC. Skyrim, Witcher 3 - go look at the howls of the faithful about the graphics downgrade of Witcher 3 because they chose to prioritise the console. The new consoles (except Nintendo's) will be running AMD, purely because Nvidia wasn't interested and Intel is incapable. Nintendo has gone for a Tegra chip by Nvidia, so it can port to Android, again using Vulkan. There were talks about optimising for Vulkan at the recent Google I/O; I watched them.
There is a shift underway, DX12 is built on Vulkan too, go check. This is shift back to a layer much closer to the metal, for better performance and memory management, essential for VR. This is a step change. It will take years, but it's already underway. Keep your hands inside the moving car, and watch out for changes :)
Have you used dual cards with DX12? I haven't, but I have read and watched a fair bit from people who have, and who have thus taken to the interwebs to bitch about it. MSFT says they'll fix it. I haven't seen any comment about patching DX9 to use more VRAM, though I have also seen and read a lot about that too. It's why I blocked the Windows 10 upgrade. There aren't that many DX12 games as yet, so it will take years to fully phase out the older platforms.
What does it have to do with anything? It has everything to do with graphics cards, and the drivers and games that run on them. Not everyone has your use case, or habits. :)
You're right, there is nothing stopping Nvidia, except a desire. They're already looking to get out of the low end GPU business, and into AI and automotive, as anyone who's watched a recent Nvidia keynote, (like me) will have seen.
eddman - Friday, June 3, 2016 - link
"That is what I mean by owning the laptop market. Owning the rump of what's left of it."Umm, we still don't know anything about nvidia's upcoming mobile GPUs. You cannot be so sure when there is no comparison point. Nvidia will not just sit idle and lose market share.
"With Vulkan I was talking about the big AAA titles, which for at least the past 5 years have been console first. they then port the console build to the PC."
As was I. Again, there are only three main AAA game platforms, PS, xbox and windows, two of which use DX. Vulkan could become popular, but it doesn't mean that developers would abandon the currently known APIs.
Vulkan could take a hold on android and maybe iOS (if it manages to convince developers to abandon Metal) but all of this is irrelevant because AAA games are not released on mobile.
DX12 based on vulkan? You mean completely? Source.
There might be a shift to low-level APIs, but it doesn't mean DX12 will not be popular. It's a well known, powerful API.
"I have read and watched a fair bit about people who have, and have thus taken to the interwebs to bitch about it. MSFT says they'll fix it. I haven't seen any comment about patching DX9 to use more VRAM. Though I have also seen a read a lot about that too."
Did you know that with DX12, game developers can take full control of multi-GPU setups and no longer rely on GPU drivers? Maybe the current games have not been optimized well for said setups.
Just because MS haven't "patched" the old, obsolete DX9, doesn't mean they'd treat DX12 the same.
"What does it have to do with anything? It has everything to do with graphics cards, and the drivers and games that run on them. Not everyone has your use case, or habits."
Sure, but what does that have to do with AMD capturing the market?! Just because one old DX9 game with tons of mods doesn't play well on windows 10, doesn't mean DX12 will fail. The market isn't decided by how skyrim plays.
"You're right, there is nothing stopping Nvidia, except a desire."
They are focusing on other areas because they already have a huge GPU market share. As soon as this share gets threatened, they will react. They will not just give up one of their main money making arms. Desire has nothing to do with it.
praxis22 - Friday, June 3, 2016 - link
"DX12 based on vulkan? You mean completely?" Completely, no. More as a catchup and because that's the way the industry/weather was moving.Source:
http://www.extremetech.com/gaming/177407-microsoft...
http://www.pcper.com/reviews/Editorial/Microsoft-I...
https://www.reddit.com/r/pcmasterrace/comments/3mc...
I don't doubt DX12 will be popular, but I still say that AAA will be Vulkan first, eventually, (3 years from now, give or take)
http://www.kitguru.net/components/graphic-cards/an...
https://en.wikipedia.org/wiki/Vulkan_(API)
http://www.forbes.com/sites/patrickmoorhead/2013/0...
http://www.bloomberg.com/news/articles/2016-03-03/...
http://www.tomshardware.com/news/nvidia-gpu-tv-con...
https://blogs.nvidia.com/blog/2016/02/16/vulkan-gr...
My point about DX9 is that DX12 is Windows10 (W10) only, and W10 is something MSFT needs the world to adopt. When even Paul Thurrot is shouting about MSFT's "indefensible" behaviour over forced W10 migrations then I think we can say that the adoption curve is behind schedule. The less W10 installs there are, the less demand there is for DX12 only games. For games and game engine companies, that matters. Which is why they're launching engines coded to Vulkan. Valve is also a lead cheerleader for Vulkan and the 600lb gorilla in the PC gaming space.
Again, the PS4 and Xbone are AMD at the core. There was no alternative, as Nvidia wasn't interested, and AMD set up a company division to do the job (Forbes article above). AMD built Mantle and gave it to the Khronos Group, who sucked the best bits into Vulkan. In a few years Vulkan will be the standard that everyone writes to for "gaming", be that PC, mobile or console.
DX12 will not fail, not like Mantle did, but it doesn't have to; if W10 doesn't attract enough users, the money and development will go elsewhere, and the PC business is in secular decline. NVidia is just as much a victim of that as MSFT, unless you're in the commodity business of making things cheaper for the platform that remains, which is where AMD, as a business, appear to be headed. They make the chips that power the consoles, and thus have some control over the low-level drivers and dev kits. Sure, Microsoft wants DX12 to succeed, but it wants its console to do the same.
Again, I'm not an AMD fanboy, I just happen think that the evident business strategy is very clever, doubtless they were lucky too, but as a full stack vertical, they seem to be in a good place to prosper in the declining PC business environment.
VR is the erstwhile saviour of #PCMasterRace but if Google/Android provide a cheap working headset with real presence the need for the Rift/Vive goes away, and with it the need for the PC backend.
eddman - Friday, June 3, 2016 - link
"Completely, no. More as a catchup and because that's the way the industry/weather was moving."So it is NOT based on vulkan at all. It was just perhaps inspired by mantle and is simply similar to mantle and vulkan, in the sense that all three are low-level APIs.
"I still say that AAA will be Vulkan first, eventually"
Using the words "might" and "could" won't hurt in that sentence. Right now, there are more available and upcoming DX12 games than vulkan.
"DX12 is Windows10 (W10) only, and W10 is something MSFT needs the world to adopt. When even Paul Thurrot is shouting about MSFT's "indefensible" behaviour over forced W10 migrations then I think we can say that the adoption curve is behind schedule. The less W10 installs there are, the less demand there is for DX12 only games. For games and game engine companies, that matters."
Just recently, windows 10 became the no. 1 OS on steam and is still climbing. Gamers are clearly adopting it fast. The overall, worldwide share of windows 10 is still low, but a huge lot of PC users never play games to begin with, so the steam numbers are a better indication.
I don't doubt that more upcoming games will adopt vulkan, but I don't think we'll see many vulkan only games, except for games made by id Software.
"Which is why they're launching engines coded to Vulkan."
You don't know that for sure. There could be many reasons. Maybe they just don't want to rely on one API alone.
"Valve is also a lead cheerleader for Vulkan and the 600lb gorilla in the PC gaming space."
Of course they are. They are desperately trying to make SteamOS happen. They have an agenda, not that it's a bad thing.
"Again, the PS4 and Xbone are AMD at the core. There is no alternative as Nvidia were not interested, and AMD setup a company division to do the job. (Forbes article above) AMD built Mantle and gave it to the Kronos group, who sucked the best bits into Vulkan."
I don't see why it matters if consoles use AMD chips or not. Their GPUs work just as good with DX12.
"in a few years Vulkan will be the standard that everyone writes to for "gaming" be that PC, mobile or console."
How come you never write "could" and "might"? You are so sure as if you went to the future and came back.
"if W10 doesn't attract enough users the money and development will go elsewhere, and the PC business is in secular decline"
Windows 10 is gaining among gamers and is no. 1 on Steam, as I pointed out, and the Xbox One is DX12 too.
"NVidia is just as much a victim of that as MSFT. Unless you're in the commodity business of making things cheaper for the platform that remains, this is where AMD, as a business, appear to be headed,"
NVidia still offers cheaper GPUs and will keep offering them. No change there. Gamers usually buy cards based on their price/performance, be it NVidia or AMD.
"They make the Chips that power the consoles, and thus have some control over the low level drivers & dev kits. Sure Microsoft wants DX12 to succeed, but it wants it's console to do the same."
Are you implying that AMD will cripple DX12 on the Xbox? MS would never allow that. Actually, MS themselves might be involved in writing the drivers. There is no way MS won't make DX12 run as fast as possible on their own console.
"Again, I'm not an AMD fanboy, I just happen think that the evident business strategy is very clever, doubtless they were lucky too, but as a full stack vertical, they seem to be in a good place to prosper in the declining PC business environment."
Never thought you were. The PC business still mainly relies on price/performance, not APIs. As long as NVidia keeps things competitive, they will keep selling.
"VR is the erstwhile saviour of #PCMasterRace but if Google/Android provide a cheap working headset with real presence the need for the Rift/Vive goes away, and with it the need for the PC backend. "
AAA games come out on consoles and PCs. As long as they keep their lead in performance over mobile, it'll stay that way, VR or no VR.
praxis22 - Monday, June 6, 2016 - link
'How come you never write "could" and "might"? You sound so sure, as if you went to the future and came back.'
It's called a narrative argument (look it up), as opposed to rebutting statements, which in many cases is just an opening gambit for a Straw Man or Reduction ad Absurdum.
"Are you implying that AMD will cripple DX12 on xbox?" No. Far from it, I think AMD have actually a vested interest in DX12 over and above Vulkan. They want as much support for their cards as possible, preferably support that doesn't require them to write the drivers. This is part of the whole push to Vulkan, offloading the driver support to the games writers, via the low level API.
I would argue that price/performance is only relevant at the low end. Nobody who buys an expensive card cares about that; they want speed and/or specs. Objectively, by that measure my R9 390 is a lousy card due to the power requirements. My 850W PSU sounds like a hurricane when I have a game running, but I wanted the speed and VRAM, and I pay the electric bill, so.
I appreciate you disagree with what I say, but where is your argument?
praxis22 - Monday, June 6, 2016 - link
Ah, the curse of auto correct... Reductio Ad Absurdum
eddman - Friday, June 3, 2016 - link
FYI, I love AMD's push for fast but cheap cards. It'd be a win for consumers, a.k.a. me, but to think that this would suddenly tip the market share scale would be wishful thinking.
praxis22 - Monday, June 6, 2016 - link
Given the timescales involved with AAA games (typically 3-5 years), I think "suddenly" is the wrong term. Like I said, give it three years.
piiman - Saturday, June 4, 2016 - link
" since Direct X12 doesn't support dual cards"say what?
praxis22 - Monday, June 6, 2016 - link
With DX12 and Vulkan, there are two forms of Multi-Adapter support: implicit and explicit. Implicit Multi-Adapter is when the OS does it for you; this mode is the one where the OS can use two different cards (AMD + Nvidia, Nvidia + Intel, etc.), and games that write directly against DX12 can use the two cards as one device. Explicit Multi-Adapter is enabled at the application level (by games) and requires that all cards are identical.
It is my understanding that DX12 on Windows 10 does not currently support Implicit Multi-Adapter, though MSFT have promised to fix that. All the DX12 games seen so far have been Explicit Multi-Adapter when they have supported multiple cards at all.
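For what it's worth, here is a minimal sketch of the first step of the explicit path described above: the game itself enumerating every adapter so it can decide how to split work. This assumes the standard Windows 10 DXGI/D3D12 headers and is only an illustration of the concept, not code from any shipping title.

```cpp
// Sketch: a DX12 title enumerating adapters for explicit multi-adapter use.
// Assumes Windows 10 SDK headers; link against dxgi.lib. Error handling trimmed.
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <vector>

using Microsoft::WRL::ComPtr;

std::vector<ComPtr<IDXGIAdapter1>> EnumerateHardwareAdapters()
{
    ComPtr<IDXGIFactory4> factory;
    CreateDXGIFactory1(IID_PPV_ARGS(&factory));

    std::vector<ComPtr<IDXGIAdapter1>> adapters;
    ComPtr<IDXGIAdapter1> adapter;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i)
    {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        if (desc.Flags & DXGI_ADAPTER_FLAG_SOFTWARE)
            continue; // skip the software (WARP) adapter
        // In explicit multi-adapter, the game creates one D3D12 device per
        // adapter and divides the frame's work between them itself.
        adapters.push_back(adapter);
    }
    return adapters;
}
```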
extide - Monday, June 6, 2016 - link
It supports both. THIS VERY SITE has tested both configurations. You can do it yourself as well with software and hardware already on the market. Dude, while you may be an experienced economist, you don't really know much about the computer hardware/PC gaming world. I mean honestly, if the RX 480 competed with the 1070, AMD would be selling it for $299, not $199. It would STILL be an amazing deal and they would sell shitloads of them, plenty enough to gain market share even.
Rampart - Tuesday, May 31, 2016 - link
If the details of the card mean that the benchmarks are what we expect, then for $200 we get a card that easily beats the outgoing GTX 970. The GTX 970 is currently the recommended minimum for VR. Reducing the price of entry to VR by $130 AND getting a better experience? Hell yes, thank you.
Jimster480 - Wednesday, June 1, 2016 - link
Never mind that it mentions that the VR performance is similar to a "$500 card", meaning the Fury non-X.
Eden-K121D - Wednesday, June 1, 2016 - link
GTX 980, if you will.
praxis22 - Monday, June 6, 2016 - link
I think that means there is a 480X in the works, which will retail higher, but two of them will still be approx. $500.
Eden-K121D - Sunday, June 12, 2016 - link
RX 480X sounds odd. It will be RX 490, then RX FURY Z.
beginner99 - Wednesday, June 1, 2016 - link
" can appreciate the price point but it is really disappointing to see that AMD couldn't reach higher frequencies with such a huge jump in litography and the switch to FinFET"Well it's made at GloFo and it's GCN. GCN clocks lower than NV uArch. Has been the case in last couple of years. GCN also has the performance/power sweetspot at much lower clocks. 290/290x sucked because they were basically factory OCed. If you reduce clock and voltage just a bit, efficiency skyrockets.
I suspect that there will be AIB cards (or even from AMD?) with better components, cooling and more power connectors that will allow a pretty significant OC. Such a card should be able to reach the GTX 1070, given GCN 4 really is a major overhaul.
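To put a rough number on that clock/voltage point: CMOS dynamic power scales roughly with frequency times voltage squared, so a small step down the curve saves a disproportionate amount of power. A toy illustration; all figures here are invented, not AMD's actual DVFS data:

```cpp
// Toy illustration of why backing off clocks/voltage helps GPU efficiency so much.
// Dynamic power scales roughly as P ~ f * V^2 (classic CMOS approximation);
// the numbers below are made up for illustration only.
#include <cstdio>

int main()
{
    const double base_clock = 1000.0; // MHz, hypothetical stock clock
    const double base_volt  = 1.20;   // V, hypothetical stock voltage
    const double base_power = 275.0;  // W, hypothetical board power

    // Back off 10% on clock and 10% on voltage:
    const double f_scale = 0.90;
    const double v_scale = 0.90;
    const double new_power = base_power * f_scale * v_scale * v_scale;

    std::printf("%.0f MHz @ %.2f V -> ~%.0f W (%.0f%% of stock power for %.0f%% of the clocks)\n",
                base_clock * f_scale, base_volt * v_scale, new_power,
                100.0 * new_power / base_power, 100.0 * f_scale);
    // ~73% of the power for 90% of the clock: efficiency "skyrockets".
    return 0;
}
```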
Jimster480 - Wednesday, June 1, 2016 - link
Sorry to say, but AMD's clock speeds are much better than Nvidia's. AMD is just touching on the power of 14nm with these conservative speeds in their first card, while Nvidia is already close to peak in its first generation, yet they barely are beating their last generation and their prices alienate most of their customer base.
AMD has the reverse going on here: they just brought the performance of their most sought-after card down to $200, with wattage that anyone can deal with and possible VR performance (if you care) similar to that of their $500-range cards (Fury non-X).
They have sane clock speeds, fewer CUs and stream processors overall, and less memory bandwidth. If it delivers on the performance, it means that GCN 4 is a real killer, similar to how GCN 1.0 was when it came out in 2012.
Nvidia's architecture is fat, inefficient and apparently went so far backwards that they needed 60% more clockspeed to achieve 10-30% more performance.
flashbacck - Wednesday, June 1, 2016 - link
I thought the argument for clock speed as the only thing that mattered died with the Pentium 4.
Jumangi - Wednesday, June 1, 2016 - link
Inefficient? Who's the GPU maker that had to create massive, almost-300W cards that ran hotter than a certain other manufacturer who could run their cards 200-300MHz higher to try and compete? AMD has had horribly inefficient GPUs and CPUs for years now.
Valantar - Wednesday, June 1, 2016 - link
Although I'm an AMD fan, I have to agree. All we can say for now is that Nvidia has managed to create the first GPU architecture that can clock well beyond 1GHz, even reaching 2GHz. Doing this while keeping power levels as low as they are shows that this is good engineering, not pushing clock speed to make up for bad design.
On the other hand, if CPUs are at all comparable, Apple has shown for years that slower, more efficient cores can outperform higher-clocked ones while consuming less power. It all comes down to architecture.
AMD seems to be making good progress in curtailing the inefficiencies of GCN - if this performs like an R9 390 but uses only 150W, that's 100% performance at 55% power. That's pretty great, and far beyond what the move from 28nm to 14nm alone accounts for. They still have a ways to go to catch Nvidia, though, as this won't come close to the performance of the 1070, yet uses as much power.
bedscenez - Thursday, June 2, 2016 - link
TDP is different from power consumption. The RX 480 has a TDP of 150W, but that doesn't mean it consumes that amount of power. It consumes around 110-130W at peak, but they rated it as 150W as the maximum power it can reach when overclocked. Check the reviews of the GTX 1080 being rated at 180W but spiking way past 200W.
Valantar - Thursday, June 2, 2016 - link
GTX 1080 spikes at more than 300W - but for a few milliseconds at a time. It averages (even over very short time spans like 1s) at or below TDP. Tom's Hardware has an excellent part on this in their review. AMD does the same thing - the Fury X (which is what lives in my PC) spikes around 400-450W, but only for a few ms - and makes up for it by dropping to 100-150W, averaging out around 275W - again, check out Tom's review. On modern GPUs TDP is roughly equal to average power consumption. After all, the whole point of the TDP is as a guideline for cooler designers - it's the amount of heat that their designs need to dissipate effectively. Sure, overbuilt coolers are good for silence, but bad for costs, and board partners sure don't want to be forced to make overpowered coolers.
Rampart19 - Wednesday, June 1, 2016 - link
AMD got screwed when they were betting on the 20nm process to pan out. It didn't, and nVidia was able to out-engineer them on the 28nm process. Ultimately I think AMD has just been trying to get by and bide their time until they had access to 14nm.
CPU-wise? AMD finally came to their senses that the module design meant way lower IPC. While in a perfect world every piece of software is 100% designed for multiple cores, the reality is much different. Games are just now getting to the point where having just 2 physical cores isn't enough.
D. Lister - Wednesday, June 1, 2016 - link
"Nvidia's architecture is fat, inefficient and apparently went so far backwards that they needed 60% more clockspeed to achieve 10-30% more performance."Did you notice that there were also far fewer cores in Nvidia's new arch? No? Well there you go.
Kvaern1 - Wednesday, June 1, 2016 - link
"yet they barely are beating their last generation"Stopped reading there.
HammerStrike - Wednesday, June 1, 2016 - link
"Nvidia's architecture is fat, inefficient and apparently went so far backwards that they needed 60% more clockspeed to achieve 10-30% more performance."LoL! Since when is increasing the clock speed (particularly while lowering power consumption) a bad thing?
"To increase performance they increased performance!" - Jimster480
JKay6969AT - Thursday, June 2, 2016 - link
The facts are that currently nVidia have the best-performing GPUs on the market for consumer-level products. The 1080 is an incredible card that outperforms any other card in actual gaming benchmarks. This is impressive, no matter what side of the fence you sit on.
The RX 480 is also an impressive card; it redefines what is expected at this price point. However, until there are actual gaming benchmarks widely available using off-the-shelf RX 480s after launch, you can't say for sure how it will stack up in real-world terms in VR against the R9 Fury (non-X) or any other card. On paper the card looks efficient and likely to perform well, but until I see real-world gaming benchmarks I can't say for sure how good it actually is or will be.
The 1070 looks set to perform well at its price point too, but again, until it's launched and I see actual gaming benchmarks I can't say anything for sure.
The fact is that AMD has paper-launched a new card that is creating buzz at its target price point, and nVidia has physically launched one card and paper-launched another which are doing the same.
jjj - Wednesday, June 1, 2016 - link
Clocks are irrelevant; we don't know what this thing really is. Anyway, clocks seem to be 1266MHz.
The real TDP is likely lower at stock and 150W is the upper limit you can feed into it.
TheinsanegamerN - Wednesday, June 1, 2016 - link
You are assuming that Polaris doesn't have a huge OC ceiling. They remember the backlash over the Fury X; perhaps they intentionally didn't clock their chip as high as it could go this time around. We won't know until third parties can take a whack at Polaris.
Yojimbo - Tuesday, May 31, 2016 - link
Well, the power of the card for the price seems nice. But the presentation of "You don't need faster GPUs any more, you just need them to be cheaper!" was a bit silly. I guess that was their best attempt to positively spin their current lack of a high-end GPU (something that we'll magically need again, apparently, in 6-9 months when Vega comes out).
alfalfacat - Tuesday, May 31, 2016 - link
That's absolutely correct. The only thing faster gets you is an even stupidly higher resolution. The *VAST* majority of people never buy $500+ halo cards, and are perfectly fine with their gaming experiences.
Yojimbo - Wednesday, June 1, 2016 - link
No, it's not absolutely correct. Plenty of people buy cards for more than $200. The vast majority of people on this Earth don't buy any discrete GPUs at all, some of them game, and many of those are perfectly fine with their gaming experiences.
There's still a market for cards costing more than $200, just as there was last year and the year before.
Jimster480 - Wednesday, June 1, 2016 - link
It's true that it is under 10%. I help out a lot on a site where people ask for builds, and I would have to say that 85%+ of the builds are using a 380/960, only some are using a ~390, and basically none are using a 980Ti or a Fury X.
Yojimbo - Wednesday, June 1, 2016 - link
Under 10% of volume, yes, but not of revenue. And the gross margins of the lower-priced parts are lower. But it's not even really my point. My point is that the market still exists. In fact, with the previous generation of cards it grew. VR only puts more demands on performance, so the market for high-end cards will probably still grow with this generation as well.
The demand for more and more GPU computing power will be there for a long time, because there are still plenty of things for that power to be used for.
Meteor2 - Wednesday, June 1, 2016 - link
Steam says otherwise, even when you account for revenue. Halo products really are just that: they basically advertise the rest of a manufacturer's range.
tamalero - Wednesday, June 1, 2016 - link
Halo products are technically a niche for the gaming community, not the global market.
Yojimbo - Wednesday, June 1, 2016 - link
Maybe you're thinking of the price of the graphics card, not the price of the GPU. The GTX 970 was a significant factor in NVIDIA's financial success this past year. You might also be looking at the Steam survey and comparing the number of GTX 970s to the number of all the other graphics cards on the list, instead of thinking about what percentage of those cards were bought in the time since the GTX 970 has been on the market.
But in fact, if you look at the Steam survey, it says the opposite of what you claim it says. The top card on the list is the GTX 970, with over 5% share. The GTX 980 has over 1%, the 980 Ti has 1%. That's over 7% on cards that still cost over $200 to this day, and it doesn't include any mobile offerings, any AMD cards, or integrated GPUs, which are included when figuring the percentages. It also doesn't include cards that cost over $200 when they were bought, like the 780. If we just look at DX12 cards, the 970 is at 7%, the 980 is 1.4% and the 980Ti is 1.3%. This is already almost 10%, and DX12 cards go all the way back to the 500 series, as well as including Intel 4000, 5000, and 6000 integrated graphics.
It's pretty clear that, going by the Steam surveys, much more than 10% of all discrete graphics cards sold in the last year and used by Steam users cost the purchasers more than $200.
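The survey arithmetic above, spelled out. The percentages are the figures quoted in this comment (DX12-capable cards on Steam), not independently checked numbers:

```cpp
// Summing the Steam-survey shares cited above for NVIDIA 900-series cards
// that still sell for over $200.
#include <cstdio>

int main()
{
    double gtx970 = 7.0, gtx980 = 1.4, gtx980ti = 1.3; // % of DX12 cards, per the post
    double total = gtx970 + gtx980 + gtx980ti;
    std::printf("900-series cards over $200: %.1f%%\n", total); // ~9.7%
    // And that's before counting AMD cards, mobile parts, or older cards that
    // cost over $200 when bought (like the 780), hence "much more than 10%".
    return 0;
}
```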
Jumangi - Wednesday, June 1, 2016 - link
10% is way overstating it. I'd bet Nvidia sells 50+ 960 GPUs for every 980 sold. Only a fraction buy in the $400+ range.
Yojimbo - Wednesday, June 1, 2016 - link
The 970 is the big seller, and it is also more than $200. If you go by the Steam surveys, there are a little over 3 GTX 960s for every GTX 980, and there are 5 GTX 970s for every GTX 980.
Kvaern1 - Wednesday, June 1, 2016 - link
Define "even stupidly higher resolution".
Yojimbo - Wednesday, June 1, 2016 - link
And if it's really slower than the 390X, it's going to be going toe-to-toe with NVIDIA's largest GP106 card if NVIDIA clocks it at 1.6 GHz. But the TDP of the NVIDIA card is going to be less than 150W. NVIDIA would have to price that card at $220 or less, I think. It's not too far out of line with what their Pascal pricing has been so far. The GTX 780 was priced at $650, and the GTX 760 had half the cores (clocked higher) and was priced at $250. The GTX 980 was $550, and the GTX 960 had half the cores (clocked the same) and was priced at $200. The GTX 1080 is priced at $600. A GP106 card with half the cores, clocked the same and priced at $220, seems about par for the course.
Eden-K121D - Wednesday, June 1, 2016 - link
But the efficiency is still less than the 1070's:
GTX 1070 - 6.5 TFLOPS for 150W TDP with an 8-pin connector
RX 480 - ~5.5 TFLOPS for 150W TDP with a 6-pin connector (absolutely no overclocking headroom)
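For reference, here's where such TFLOPS figures come from: peak FP32 throughput is conventionally counted as 2 operations (one fused multiply-add) per shader per clock. The shader counts and clocks below are the launch-window numbers being discussed in this thread, so treat them as assumptions:

```cpp
// Peak FP32 throughput: 2 FLOPs (one FMA) per shader per clock cycle.
#include <cstdio>

double peak_tflops(int shaders, double clock_ghz)
{
    return 2.0 * shaders * clock_ghz / 1000.0;
}

int main()
{
    // Rumoured/announced figures at the time of this thread:
    std::printf("GTX 1070: %.1f TFLOPS\n", peak_tflops(1920, 1.683)); // ~6.5
    std::printf("RX 480:   %.1f TFLOPS\n", peak_tflops(2304, 1.266)); // ~5.8
    // Peak TFLOPS is theoretical; real gaming performance per FLOP differs
    // between architectures, as the replies below point out.
    return 0;
}
```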
Yojimbo - Wednesday, June 1, 2016 - link
Yeah. I do think the RX 480 is far less efficient, but TDP is a poor substitute for performance-per-watt efficiency, and I wouldn't directly compare real-world Pascal performance to real-world Polaris performance by theoretical peak performance. NVIDIA cards have been more efficient in terms of real-world performance compared with peak theoretical performance. That pushes the efficiency argument further in favor of NVIDIA, of course. GCN 4 may make more efficient use of its peak theoretical performance than previous generations of GCN, but I doubt it's caught up to Pascal.
SaberKOG91 - Wednesday, June 1, 2016 - link
If anything, 150W is an overestimate. Think of it this way: 75W max for the PCIe x16 slot, another 75W for a 6-pin, or another 150W for an 8-pin.
For the RX 480, the 6-pin just means that they expect it to draw more than 75W and that overclockers might want a little headroom for the reference cards.
For the 1070, it is probably running close to the 150W TDP at 100%, so NVIDIA opted to give another 75W to offset peak power throttling and to give headroom for factory overclocked cards (a la 980 Ti).
But of course, that's just peak power draw. I expect that Polaris averages closer to 100W and peaks near 125W.
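The connector arithmetic behind that reasoning, in code. The 75W/75W/150W limits come from the PCIe specification; the two card configurations are the ones under discussion here:

```cpp
// In-spec board power ceiling = PCIe slot + auxiliary connectors.
#include <cstdio>

int board_power_ceiling(int six_pin_count, int eight_pin_count)
{
    const int slot = 75;  // PCIe x16 slot limit per the spec
    return slot + six_pin_count * 75 + eight_pin_count * 150;
}

int main()
{
    std::printf("RX 480 (one 6-pin):   %dW ceiling, 150W TDP\n",
                board_power_ceiling(1, 0)); // 150W: ceiling equals TDP
    std::printf("GTX 1070 (one 8-pin): %dW ceiling, 150W TDP\n",
                board_power_ceiling(0, 1)); // 225W: ~75W of in-spec headroom
    return 0;
}
```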
Valantar - Wednesday, June 1, 2016 - link
Tom's Hardware's reviews have great power measurements (detailed graphs of card-only power draw over time), but unfortunately they don't have that for the 1070 yet. However, if the 1080 is anything to go by, the 1070 averages very close to TDP (the 1080 in their measurements averages 173W, while peaking at around 300W). The 8-pin connector is most likely there for several reasons: a) most importantly, Pascal, like Maxwell, spikes at far higher power than its TDP, but only for a few milliseconds at a time, and an 8-pin power connector makes up for any out-of-spec power spikes; b) to add a modicum of OC potential (although that drives the power spikes far higher, if the 1080 is anything to go by); and c) to underscore that it's a high-end card - using a single 6-pin connector for their 2nd-from-the-top card would make many enthusiasts wary and skeptical of its performance potential.
Considering that the Fury X shows pretty much the same behaviour as Nvidia's architectures in terms of average draws and power spikes, I'd be surprised if the 480 was far below 150W.
rderubeis - Wednesday, June 1, 2016 - link
But look at the price difference. If this 480 is even close to the 1070, it's going to be amazing.
Jimster480 - Wednesday, June 1, 2016 - link
The truth is that you don't. If it weren't for scam modes like "Ultra", with fake "optimizations" like "HairWorks", there would be no need for these pricey cards.
Try checking around YT for people who do these sorts of independent investigations into Ultra settings in most games, and you will notice that there are stupid things such as 8 million tessellated polygons on rocks for no reason. Something that just kills GPU performance.
I have found that I can still play most games at 60fps in 1080p at mid/high settings with my 2013 7870 GHz Edition, which I bought for $125 on sale back then.
Meteor2 - Wednesday, June 1, 2016 - link
But it looks better.
Michael Bay - Wednesday, June 1, 2016 - link
>hairworks
They should have done it just for the AMD user butthurt factor alone.
Alexey291 - Wednesday, June 1, 2016 - link
Shame it looks so awful though.
Ranger1065 - Wednesday, June 1, 2016 - link
Oh dear. With no sign of Chizoo, I hoped Nvidia trolls were on the decrease @ Anandtech.
Zingam - Wednesday, June 1, 2016 - link
I agree... I prefer cheaper! :)
gilmour - Wednesday, June 1, 2016 - link
Pretty sure some variation of this card will be my next purchase due to price and my performance needs, but I'm still disappointed in the specs that were shown.
Considering the clock speeds the GTX 1070 can hit at 150W, it's very surprising how low the RX 480 clocks at 150W.
Either the specs shown are still concealing real clock speeds and power draw and are better than shown, or the RX 480 is a low-end Polaris 10 (?) part and there is a lot of headroom to release faster 480 and higher-specced 490 parts, albeit with increased power draw.
Still seems Nvidia did some impressive engineering to get the clock speeds, performance and power draw at the levels they did, AMD maybe not quite so good.
beginner99 - Wednesday, June 1, 2016 - link
You can't compare TDP between NV and AMD. NV usually greatly understates TDP, as was the case with the 970 and 980. In real games, the whole-system power use difference between a 970 and a 290 was much less than the difference in TDP.
Yojimbo - Wednesday, June 1, 2016 - link
TDP is not a measure of average power consumption. The TDP rating indicates the maximum sustained heat the chip is allowed to dissipate in real-world usage, or put another way, the minimum heat the cooling system must be able to dissipate. If NVIDIA were to "greatly understate TDP" then their cards would be burning up left and right because their cooling systems wouldn't be able to keep up.
If a card were consistently closely approaching its TDP across a wide range of uses, then I think that card would probably be TDP-bound. If one card is far from its TDP in a particular sustained real-world use case, then there must be some other real-world use case where it uses a lot more power, because otherwise the TDP claimed for the card could be lowered.
As far as what you said about the GTX 970 in comparison to the R9 290, it seems to me that you probably saw figures in just a small number of use cases. But if the 970 really is able to more consistently approach its TDP over an average of a large range of uses it seems to me that it implies the underlying architecture of the GTX 970 is more balanced, because there must be more extreme outliers for the R9 290 to require its TDP to be upped for those cases.
Regardless, the R9 290 consistently used significantly more power than the GTX 970.
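A toy calculation to make the distinction in this sub-thread concrete: a cooler has to handle the sustained average, so millisecond spikes well above TDP are fine as long as the average stays at or under it. The trace below is invented for illustration, not measured data:

```cpp
// Sustained average vs. momentary peak: what a cooler rated for "TDP" must handle.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

int main()
{
    // Hypothetical card power samples in watts, 1ms apart (invented numbers):
    std::vector<double> trace = {120, 145, 310, 95, 150, 280, 110, 130, 160, 100};

    double peak = *std::max_element(trace.begin(), trace.end());
    double avg  = std::accumulate(trace.begin(), trace.end(), 0.0) / trace.size();
    const double tdp = 165.0; // hypothetical rated TDP

    std::printf("peak %.0fW, average %.0fW, TDP %.0fW\n", peak, avg, tdp);
    std::printf("cooler adequate: %s\n", avg <= tdp ? "yes" : "no");
    // Millisecond spikes far above TDP (here 310W) don't cook the card as long
    // as the sustained average the cooler actually sees stays under TDP.
    return 0;
}
```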
Yojimbo - Wednesday, June 1, 2016 - link
An interesting thing I've noticed is how much NVIDIA's transistors-per-core count has gone up from Kepler to Maxwell, and also from Maxwell to Pascal, while AMD's transistors-per-core count has been relatively flat from 1st through 3rd generation GCN. I wonder what the transistors-per-core situation is for 4th gen GCN.
ajlueke - Wednesday, June 1, 2016 - link
Well, from Maxwell to Pascal you have actually changed process nodes, so you expect the number of transistors that can be placed on a die of the same size to increase. Now, from Kepler (GTX 680) we went from 3.5 billion transistors to 8 billion in Maxwell 2, a 128% increase, all on the 28nm process. Meanwhile, AMD went from 4.3 billion transistors in the HD 7970 (GCN 1) to 8.9 billion in the R9 Fury X (GCN 3), an increase of 107%, all on the 28nm process. Doesn't seem that different to me.
Yojimbo - Wednesday, June 1, 2016 - link
I'm not talking about transistors per die area, but transistors per compute core (CUDA core, as NVIDIA calls it). What that means is that NVIDIA has been using a greater percentage of the total transistors for cache, register files, ROPs, schedulers, and the like, and a lesser percentage on ALUs. But they have been getting greater efficiency out of their architectures by doing so.
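A rough back-of-the-envelope version of that comparison, using commonly published transistor and core counts for each die (the chip choices and figures are my additions, so treat them as approximate):

```cpp
// Transistors per compute core across a few well-known dies.
#include <cstdio>

struct Gpu { const char* name; double transistors_b; int cores; };

int main()
{
    const Gpu gpus[] = {
        {"GTX 680 (Kepler, GK104)",    3.5, 1536},
        {"GTX 980 (Maxwell 2, GM204)", 5.2, 2048},
        {"GTX 1080 (Pascal, GP104)",   7.2, 2560},
        {"HD 7970 (GCN 1, Tahiti)",    4.3, 2048},
        {"Fury X (GCN 3, Fiji)",       8.9, 4096},
    };
    for (const Gpu& g : gpus)
        std::printf("%-28s %.2fM transistors per core\n",
                    g.name, g.transistors_b * 1000.0 / g.cores);
    // NVIDIA's ratio climbs generation over generation (more transistors spent
    // on caches, registers and schedulers per ALU); AMD's stays nearly flat.
    return 0;
}
```
pashhtk27 - Wednesday, June 1, 2016 - link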
Finally a card that I can dream of buying, something that won't make poor me feel too bad investing in. If it can really offer GTX 970-level performance, I'm in. Looking forward to the release and benchmarks.
Will make a nice combo with eGPU docks when cheaper ones come out.
webdoctors - Wednesday, June 1, 2016 - link
I have the same thought/question. How does this card compare to the 970, which is now going for about the same price? Do we have a new $199 king?
pashhtk27 - Wednesday, June 1, 2016 - link
Where I live, the lowest price I found for the GTX 970 is about $390, and that for the R9 380 is about $320. The average price is about $500 for the GTX 970 and about $400 for the R9 380. So the base price is a very important factor here. GPUs are really expensive; for $199 you don't even get a GTX 950 2GB ($220), and barely an R7 370 2GB ($195), at the lowest prices. Can't even compare... Other PC components like CPUs, PSUs and HDDs have the same cost margins.
Gaming has become a luxury.
just4U - Thursday, June 2, 2016 - link
The 390 matches the 970 performance-wise. So, according to reports, this 480 will likely beat the 970 across the board, but not by a lot (unless maybe you factor in the RAM equation). If I had to guess, I'd say 5-10% faster.
Cliff34 - Wednesday, June 1, 2016 - link
Can someone explain to me why there is a need for a video card to be efficient in power usage? If you are going to do any heavy gaming, your rig is probably one that uses a lot of power. How is using less power a reason for a gamer to get this card? Is it just to save a few bucks on electricity, or is there a more compelling reason than that?
numz - Wednesday, June 1, 2016 - link
While power efficiency definitely helps save a bit on electricity, I think the better reason is that people won't need such a high-wattage power supply to run the video card along with the rest of the machine. This has always been a problem with AMD, since their video cards are almost always more power-hungry than cards from Nvidia. Meaning if you wanted an AMD card, you'd likely need a beefier power supply than you would've needed if you went Nvidia.
This might not be important for someone with a big budget, but people who want the best performance for the lowest price should generally be happy about this.
nagi603 - Wednesday, June 1, 2016 - link
Actually, you only needed a good/premium-quality, relatively low-wattage PSU for any one-chip AMD card. I mean, I'm running my 290X on a 400W Seasonic X, and even with the power limit raised to +50%, there is no problem with it.
extide - Tuesday, June 7, 2016 - link
FWIW, a 450W PSU will run any single CPU + GPU on the market right now.rhysiam - Wednesday, June 1, 2016 - link
I would say it's got more to do with branding and image than anything else. IMHO, AMD is specifically trying to address some really bad press they've been getting over the last few releases, which has hurt their brand badly.
As I understand it, AMD were really banking on 20nm GPUs, and when that process basically failed, the only way they could stay competitive was to essentially release cards which were clocked to their very limit. Have a read of any review of the R9 290 or 290X. While okay from a price/performance perspective, the cards were slammed for running extremely hot, being unbearably loud and having basically no overclocking headroom whatsoever. They were already maxed out. It was seen (I think rightly so) as a pretty inelegant and brute-force attempt at remaining competitive. The Fury line did little to shed that unwanted image.
So from a pure performance perspective, I agree with you that power draw isn't that big of a deal, but for sure heat, noise, power supply requirements and overclocking headroom are all factors which play into the value proposition of a card. What's really going on here though, is AMD trying to shed the unwanted image of being the brute-force, hot, loud and power hungry option on the market.
Kutark - Wednesday, June 1, 2016 - link
It has to do with removing that heat from the video card. It has nothing to do (at least in the discrete market) with overall power usage. Very, very few people make GPU purchase decisions based on the power target. That was one of the reasons the Titan X and 980Ti were so impressive: they were able to build a chip that was close to the theoretical maximum chip size (which is around 600mm²). Basically, heat goes up exponentially with the physical size of the chip.
Duckeenie - Wednesday, June 1, 2016 - link
From my perspective: wife-friendly PC cases don't dissipate enough heat for video cards with high power usage.
Listen guys, irrespective of what people say, power and size matter.
bedscenez - Wednesday, June 1, 2016 - link
That 150W TDP is disappointing to me. The 1070 has the same TDP but performs better. Efficiency is AMD's weakness.
Eden-K121D - Wednesday, June 1, 2016 - link
And it will likely remain so. TSMC's 16nm FF+ is better than Samsung/GloFo's 14nm tech.
Mugur - Wednesday, June 1, 2016 - link
I'm afraid you're right. Too bad that AMD can't use TSMC anymore, instead relying on GloFo... GloFo almost killed their CPU business and (indirectly) hurt their GPUs as well. AMD wasn't prepared to stay at 28nm for so long. Judging by the 480 specs, I think that 14nm GloFo is not so bright either.
Eden-K121D - Wednesday, June 1, 2016 - link
I think they should have gone with Samsung's 14nm LPP.
atlantico - Wednesday, June 1, 2016 - link
They did; the 14nm GloFo process is Samsung's 14nm, licensed. The 14nm process was not developed by GloFo, but by Samsung.
Mugur - Thursday, June 2, 2016 - link
Yes, but I am under the impression that Samsung's 14nm is not as efficient as TSMC's 16nm. Look at the iPhone controversy (they are dual-sourcing the SoC from both).
Meteor2 - Saturday, June 4, 2016 - link
Which in fact suggests there's nothing between them.
bedscenez - Wednesday, June 1, 2016 - link
And what about the GTX 1060? Of course it will have a lower TDP, maybe around 110-125W, and if Nvidia wants to price it at 200 dollars then AMD loses the mainstream market. I like AMD and I want them to succeed, but not like this.
jardows2 - Wednesday, June 1, 2016 - link
Will AMD lose the mainstream market just on the basis of NVIDIA releasing a "more efficient" card at the same price? What if that efficiency comes at a significant performance hit? That's why my last $200 card was an AMD - the 960 could not compete on performance with AMD's offering at the same price.
Meteor2 - Saturday, June 4, 2016 - link
But where is it? The 480 has been launched. Nvidia could do lots of things. What matters is what they are doing. The 480 will have this market to itself for a quarter+.
extide - Tuesday, June 7, 2016 - link
AMD can certainly use TSMC if they want, just like they can certainly buy Samsung HBM2 if they want. Lol, it's so funny that people think that for some reason they are "locked out" of these options.
bucketface - Wednesday, June 1, 2016 - link
You do realise that TDP is not power consumption, right? The 1070 is going to use an 8-pin power connector, which should give it room to use up to 225W, so it definitely uses more than 150W. The RX 480 has a 6-pin connector, so its max power is 150W, which means it'll use less than 150W. Just guesstimating, but I'd expect actual power usage under load of the 1070 to be between 150W and 180W, while the RX 480 will probably sit between 120W and 140W. Anyway, we will have to wait till actual hands-on reviews come out to get more exact figures, but don't look at TDP as anything more than a rough guide when it comes to power use.
HollyDOL - Wednesday, June 1, 2016 - link
Well, at least EVGA on their site claims "Max Power Draw: 150W" ... http://www.evga.com/Products/Product.aspx?pn=08G-P...
bucketface - Wednesday, June 1, 2016 - link
Ok, having looked more closely at GTX 1080 power draws, they do seem to average about 180W, and peaks are still over 200W (as an aside, EVGA look to be offering an OC'd one with what they claim as a max power of 215W, so it looks like they will be trying to push the limits), so maybe the 1070 does average around the 150W mark and just needs the extra room of the 8-pin for peaks.
That said, the RX 480 should be averaging somewhat less than 150W due to the limitations of the 6-pin connector.
Meteor2 - Saturday, June 4, 2016 - link
TDP is average maximum draw. There's no place the energy goes other than heat.
extide - Tuesday, June 7, 2016 - link
Yep, all of the energy used by the GPU turns to heat. It doesn't turn into motion, or light, etc., so it MUST turn to heat. Some of the power is used by the fans, of course, but other than that, it's heat!
at80eighty - Wednesday, June 1, 2016 - link
~15% less performance than a 1070 @ ~48% less price. It's a no-brainer for those who are balancing all available factors.
lakedude - Wednesday, June 1, 2016 - link
Exactly, nVidia is likely better but not worth the extra cost if those numbers hold up. Were I still single with disposable income I'd be buying nVidia because better. Now that I'm Al Bundy the RX480 is much more tempting because value.
Valantar - Wednesday, June 1, 2016 - link
Where on earth are you getting 15% less performance than the 1070 from? To quote PCPer's review of the GTX 1070: "The GTX 1070 based on Pascal is never less than 53% faster than the GTX 970 when running at 2560x1440 and is nearly twice the performance in Rise of the Tomb Raider!" They also show it performing around 50% above the 390X. The R9 390 performs roughly on par with the 970, but beats it in a few situations. This seems to be expected to slightly outperform the 390. In other words, I'd expect this to perform at 60-70% of the GTX 1070, not 85%. Even with that considered, it's good value, but nowhere near what you're saying.
And no, I'm not an Nvidia fanboy. I've never owned an Nvidia card, and my current PC has a Fury X in it. I'm simply opposed to making bogus claims, especially ones that build up unrealistic expectations of underdogs you really want to succeed.
Tewt - Wednesday, June 1, 2016 - link
I'm not sure how Anandtech measures, but based on the info so far, just looking at teraflops - about 6.4 for the 1070 and over 5 for the RX 480 - you can get anywhere from 15% to 20% less performance. I used 5.2 teraflops (we don't know exact values yet) for AMD, so a simple calculation of 5.2/6.4 gets you 81%.
Tewt - Wednesday, June 1, 2016 - link
And yes, that is an oversimplification, so I will still wait for actual benchmarks of retail units, but just at a quick glance at the info available, I agree with at80eighty.
Valantar - Thursday, June 2, 2016 - link
My Fury X measures in at 8.6 TFLOPS. The 980Ti lands at 5.6 - yet outperforms my card in a number of games. The same is systematically true across the AMD and Nvidia lineups. AMD cards are far superior to Nvidia cards in compute, but not in gaming. Whether this is architecture or driver related is something I'm not qualified to answer, but that doesn't make it less true. A 15-20% deficiency in FLOPS to an Nvidia card does not translate to 15-20% less gaming performance, unless AMD has made some gargantuan architectural strides with GCN 4.
Tewt - Thursday, June 2, 2016 - link
"A 15-20% deficiency in FLOPS to an Nvidia card does not translate to 15-20% less gaming performance, unless AMD has made some gargantuan architectural strides witl GCN 4."Yeah, I know thus why I said it was an oversimplification and was only base on information we had NOW. There is no other information to go off of. Sure, I could throw in Gameworks and say how that screws AMD everytime so their true performance is never known because they get second-hand dev support but that still doesn't say about how either card's architecture will be received or how they have improved until we see actual benchmarks.
Valantar - Saturday, June 4, 2016 - link
There is more information to go on - knowledge of how FLOPS translate to graphics performance on the different GPU architectures in the last few generations. We have no reason to think GCN 4 is a paradigm shift in terms of architecture - in that case, AMD would make more of a marketing point of it - thus it's reasonable to assume the relation between FLOPS and gaming performance should be roughly equal. 10% better? Sure, that might happen. But not 50%.
Stating that "we don't know, so I'll assume massive improvements" is ridiculous. You're only setting yourself up for disappointment. I'd rather be realistic and end up getting a positive surprise, if that's the case.
ET - Wednesday, June 1, 2016 - link
All I have to say is: AMD marketing, really...
What's RX 480? It was what, less than 3 years ago, that AMD said it's switching to a naming scheme that's blah blah blah, with numbers representing something and completely confusing the consumer, but it makes sense, right, because it's better than one number. And then it introduces Fury, and now RX.
At least they're consistent in trying to confuse consumers, got to give them that.
FourEyedGeek - Wednesday, June 1, 2016 - link
Previous numbering system:
Radeon HD 2xxx, Radeon HD 3xxx, Radeon HD 4xxx, Radeon HD 5xxx, Radeon HD 6xxx, Radeon HD 7xxx, Radeon HD 8xxx
Current numbering system:
Radeon Rx 2xx, Radeon Rx 3xx (includes Fury), Radeon Rx 4xx
For the performance it offers, it really should be R9 480, as it is similar to R9 390.
BOOSTERK - Wednesday, June 1, 2016 - link
I think you're missing the point. RX probably means R10. So basically it's an R10 480.
Eden-K121D - Wednesday, June 1, 2016 - link
RX sounds like GTX and sounds cool, compared to R9, which sounds more like an old-school measurement.
adamod - Wednesday, June 1, 2016 - link
Personally I wish that they still used R7 and R9, but that it meant something specific: either one particular architecture, or memory type, or memory bus width, or something. AFAIK all those numbers meant before was high-end or mainstream, which doesn't give much info. If you're going to have an R-anything in the part number, it should mean something. Then again, I am a bit of a purist; I'd just have them start labeling them Radeon 1, 2, 3, 4, etc., where the number is the relative speed compared to the others. As they make improvements in between cards they can list them as 1.1, 1.2, etc.
bedscenez - Wednesday, June 1, 2016 - link
So this RX 480 is not the R9 480X, the full Polaris 10 chip, but the cut-down one?
Michael Bay - Wednesday, June 1, 2016 - link
Marketing is the best performing part of AMD as a whole. Just look at the comment section on AT, they still have users! Great success.
acccddccc - Wednesday, June 1, 2016 - link
Really tempting from a performance-per-dollar ratio. Looking forward to seeing the numbers.
R3MF - Wednesday, June 1, 2016 - link
Is it just me thinking that in the R(X)480 they've only announced the 2016 version of the 7850?
With 36x64 shaders at 1.1GHz and 256GB/s GDDR5.
May we expect an announcement for a fully enabled R(Z)480 to fill the spot of the old 7870?
With 40x64 shaders at 1.3GHz and 320GB/s GDDR5X.
doggface - Wednesday, June 1, 2016 - link
As a casual gamer, this hits home: a price range I can afford, and features that I want, such as HDMI 2.0 and DP 1.3/1.4. I am hoping HEVC decode is decent with the Main10 profile (haven't seen anything about that) and that it makes a 1440p gaming experience possible (let alone VR, which, well, let's just say I want my own spaceship).
There is a lot to like. But then, I can't justify blowing $500 USD on a card, so the 1070/1080 are interesting and COOL! but ultimately not going to make the sale.
As an NVidia fan for the last 3 generations... well played, AMD. Take my money.
adamod - Wednesday, June 1, 2016 - link
Wait for the 1060 to come out before you make that purchase. Personally I am an AMD fan all the way, but if that 1060 is going to be only slightly more expensive and better than a 970, it might be worth it to go for that. Good chance you would have better efficiency and more overclocking headroom. That said, I usually roll red team myself, so ugh, I'm torn...
LarryBarry - Wednesday, June 1, 2016 - link
A mid-range card competing with 3-year-old Nvidia cards? FAIL!!! AMD has become a mid-range company and cannot compete with the big boys... sad.
rderubeis - Wednesday, June 1, 2016 - link
AMD is smart. They're making a card the average person can afford. If this 480 is as good as a 980 for 200 bucks, this card will get a lot more sales than the new Nvidia cards.
fanofanand - Wednesday, June 1, 2016 - link
Best to ignore Larry the troll.
define - Wednesday, June 1, 2016 - link
With a single 6-pin, a 150W TDP and the 14nm FinFET process, this card screams non-reference designs. It seems to me AMD did not waste too much time on an expensive PCB but will leave that to the partners' imagination :)
Eden-K121D - Wednesday, June 1, 2016 - link
A backplate is a must these days. But I like AMD's reference design language, although isn't the card too thick?
USGroup1 - Wednesday, June 1, 2016 - link
I don't get it, is there currently a game that the R9 380 can't handle in 1080p max details!? I'm not interested in VR stuff yet, so is there a reason to upgrade to this card for 1080p gaming?
Alsoran - Wednesday, June 1, 2016 - link
I thought that too, and my initial excitement has been dampened by the specs. Granted, it's new tech so we might get a gain there, and the power draw is less so possibly less heat, but overall my current R9 390 compares well with the specs I've seen so far for the RX 480. I'll wait to see the reviews and the benchmarks before committing, I think.
praxis22 - Wednesday, June 1, 2016 - link
I don't think it's meant for us. I've got a 390 too, but it looks promising for the future; I may be able to slot one in to go dual cards with Vulkan.
adamod - Wednesday, June 1, 2016 - link
I still have an R9 280X, and at 1080p there isn't anything that I play that I can't max out with more than playable framerates. Battlefield 4 runs great, Far Cry no problem, etc. I have to think and hope that this is more for 1440p. I am personally hoping for playable 4K60 rates with at least medium settings, but I am skeptical.
rhysiam - Wednesday, June 1, 2016 - link
It was always optimistic to expect a mid-range card to offer a substantial upgrade over your high(ish)-end card which was released less than 12 months ago. I don't think that's ever been the case.
praxis22 - Wednesday, June 1, 2016 - link
At stock, probably not; modded, different matter entirely.
Ariknowsbest - Wednesday, June 1, 2016 - link
It seems to offer great value for money, and will probably help them retake market share.
milkod2001 - Wednesday, June 1, 2016 - link
480 or 1070 for 1440p gaming? Would the 480 be enough? Thoughts?
rderubeis - Wednesday, June 1, 2016 - link
Most likely yes.
HollyDOL - Wednesday, June 1, 2016 - link
Depends how long you want to last with your card, and whether you want max quality or are fine with some lower settings...
As for max settings: if you upgrade every generation, both will be fine. If you upgrade every 2 generations, the 1070 should be fine, but it's hard to guess about the 480 now. If you upgrade on a longer basis you might want the top star.
I still run a 2011 GTX 580 and only recently started hitting the wall (at 1440p as well). So unless something big happens, a flagship card should be pretty decent for 4 years. One step lower would hit the limits much sooner; my guess is ~2.5yrs. Lower cards, then: if you want to play at max quality you'd be swapping every 12-18 months, and some fresh & crazy games won't be able to run on full detail. As for the RX 480... nobody has a clue how strong it will stand, but I'd guess it to be lower than the 1070, so guessing 12-18 months.
maximumGPU - Wednesday, June 1, 2016 - link
Just now started hitting the wall on a 580 at 1440p?! I had a 670 (roughly equivalent to your 580), and that could barely play any games on medium to high settings at 1440p. You must've sacrificed quite a bit on your graphics.
Only when I upgraded to a 970 was there relief at that resolution.
HollyDOL - Wednesday, June 1, 2016 - link
Nope... I play on max and had no issues so far... until Witcher 3 and Black Desert Online (this year; I waited on W3 for the patches to get done), which both turned it into a slideshow renderer (below 20FPS). But until then, no visible issues. Not that I would start drilling my knees if FPS dropped to 30ish. That said, I have an OCed, watercooled model from EVGA, but that's why I subtracted a year from the useful life span.
atlantico - Thursday, June 2, 2016 - link
You're right, the GTX 580 can't run games from 2012 and onwards at 1440p in high settings. It will do games from 2010 like Fallout New Vegas in the highest settings at 1440p, but Dragon Age Inquisition (2012), forget it. On highest settings the GTX 580 will squeeze out 20ish fps at 1440p.
mapesdhs - Friday, June 3, 2016 - link
I've tested a lot of 580s, including numerous 3GB models (I have six of the completely crazy MSI LEs), and also tested 970s, 980s and a 980 Ti. A single good 980 beats two 580s in SLI, sometimes coming close to three 580s. A 970 is about on par with two 580s.
I used 2x 580 3GB SLI for a fair while, but I'm glad I switched to a 980, as even two 580s couldn't handle what I was trying to do with custom settings for Crysis and maxed-out settings for E.D. (VRAM capacity being the issue as often as performance). All depends on the game, resolution, and one's predilection for maxed/custom settings, mods, etc.
If one uses typical middling game settings though, then 580 SLI is still pretty decent in many cases, though the 1.5GB versions would likely be not so good for 1440p. Thankfully, 3GB models are cheap.
adamod - Wednesday, June 1, 2016 - link
I ran an R9 280X at 1440p with high (not ultra) settings and AA and AF off with no problems. I don't check the FPS because I don't really give a shit what the number is so long as it is playable, and it was more than enough. That said, some more eye candy would have been greatly appreciated. I have to think that for current games this will be fine, and honestly, depending on how developers take on DX12, it might be OK for some future titles as well, but maybe not...?
praxis22 - Wednesday, June 1, 2016 - link
The 390 does better at 1440 than it does at 1080, so given that the 480 is broadly similar to the 390, I'd say yes. Especially in a smaller case with a smaller PSU.
medi03 - Wednesday, June 1, 2016 - link
The 480X, I'd say, for 1440p; it should be within 10-15% of the 1070 for about half the price.
Valantar - Wednesday, June 1, 2016 - link
You have seen the reviews where the 1070 _crushes_ the 390X at 2560x1440, right? As in ~50% better? I'm an AMD fan, but this card isn't going to touch the 1070. No way. I doubt it would get anywhere near 15% below the 1070, even in best-case scenarios. Nor is it meant to, at that price. But if it performs at 60-70% of the 1070 at 55% of the price, that's great value for money, and a clear win for AMD.
Zingam - Wednesday, June 1, 2016 - link
Is VR going to take off successfully this time around, or will it plunge into the ground soon afterwards once more?
praxis22 - Wednesday, June 1, 2016 - link
Given that Valve, Facebook & Google are behind it, I'd say that yes, VR is here to stay, though you may want to wait for the next headsets, or see what Google has in store for Android "Daydream ready" phones.
Valantar - Wednesday, June 1, 2016 - link
This looks promising, and unlike most people out there (at least that's how it seems), I think the "mass market first" strategy is a good idea. Sure, flagship products draw customers, but these will hit the market before Nvidia's GTX 1060 (unless Nvidia rushes the launch due to this), and, if the 1070 and 1080 are anything to go by, at a lower price.
That reference cooler looks nice; I hope it also performs decently and is actually going on sale. They showed a similar cooler for the 3X0 series, yet that never made it past R&D, it seems.
pav1 - Wednesday, June 1, 2016 - link
I find it strange that people are divided over Nvidia and AMD. Would it be fair to say that traditionally those who don't care about power consumption/heat/noise like AMD?
adamod - Wednesday, June 1, 2016 - link
I run a midrange rig and play at 1080 or 1440. AMD and Nvidia both have reasonable cards for that segment at reasonable prices; I'm just a typical consumer. Heat, I don't care; power consumption, I don't care; noise on my card isn't bad (Sapphire with Dual-X fans), so that's not a problem. Though I will say I wish I could overclock more (any!), it isn't a deal-breaker for me. Personally I like AMD's software over Nvidia's, simple as that; I like the way their config is done.
TheinsanegamerN - Wednesday, June 1, 2016 - link
That wasn't the case until Maxwell came out; now power usage seems to be more important than performance.
idris - Wednesday, June 1, 2016 - link
I don't think you've kept up to date with AMD Polaris developments...
atlantico - Thursday, June 2, 2016 - link
Not really; the GTX 480 is perhaps the loudest, most energy-sucking, heat-emitting GPU ever made. And it's Nvidia.
mapesdhs - Friday, June 3, 2016 - link
Actually not quite; the 290X at launch was worse for power/heat (just checked the Anandtech reviews of both to be sure), but the 480 was certainly louder.
Mind you, have you ever heard two 7970GEs in CF? It's the loudest setup I've ever heard so far, worse than my OC'd 900MHz 580s on max fan.
adamod - Wednesday, June 1, 2016 - link
I don't care about VR at all, not until the prices of all the gear come way down. I want to see how this compares to the 1070 and the supposed 1060. And yes, I realize the 1070 is more expensive, and the only report I saw of the 1060 speculated that it too would be slightly more. But if I can game at medium settings at 4K60 on a 1060 but not on an RX 480, then I'll spend that little extra; and if all I need is that 480, then I will certainly be staying on team red.
geekman1024 - Wednesday, June 1, 2016 - link
Hmm... I think I'll wait for RX 782.
digiguy - Wednesday, June 1, 2016 - link
No way this is even close to the 1070, but I am curious to see if this can at least equal the 980, and what the price and performance of the 1060 are going to be. Could be an interesting card for TB3 enclosures, as TB3 would be less of a bottleneck (if at all) compared to a faster 1080, for instance...
adamod - Wednesday, June 1, 2016 - link
Maybe I am wrong here, but my understanding is TB3 runs on a PCIe x4 3.0 bus, and from what I have seen in testing there is only a 2 or 3 percent difference between x4 and x8, and no difference between x8 and x16, in terms of performance from any graphics card up until now. I think it was a Titan X they had done the testing with, but I would have to find the article or video or whatever. My guess is that a 1080/70/60 will run slightly below where it would on a full x16 interface, and it won't be noticeable in gameplay except for those cases where the game is right at 30fps and losing 3 percent actually will matter. I'll be closely watching this and the 1060 to see if either can do 4K at medium settings reasonably; fingers crossed.
medi03 - Wednesday, June 1, 2016 - link
According to leaks (the C7), it scores better than the 980 in 3DMark (between the 980 and Fury non-X, in fact).
digiguy - Wednesday, June 29, 2016 - link
Now that the tests are here, we know that the 480 is on par with the 970, so clearly below the 980 and, as I said above, not even close to the 1070. Waiting to see how Nvidia responds with the 1060's performance and price, and, above all, when (as AMD has no competition in this segment for now...).
Impulses - Wednesday, June 1, 2016 - link
Not sure I understand the fuzzy logic across some of these comments, people trying to approximate this card to the GTX 1070 or even judge one as the better value...
AMD themselves have positioned the RX 480 right around the R9 390X. Last time I checked, a GTX 1070 is right around a GTX 980 Ti, and that was at least as good as 2x R9 290/390 in CF.
Seems like apples and oranges to me. All this posturing about whether it's smarter to go for the high end or the midrange first is just nonsense; anyone not desperate for an upgrade will simply wait until both lineups are fully fleshed out.
I'm very tempted by 2x GTX 1070 to replace my R9 290, but I'm still gonna wait to see what Vega brings because I'm simply not desperate. Similarly, if I were looking at the RX 480, I'd wait for the GTX 1060 to launch.
Nothing worse than kicking yourself because you bought a next gen card too early and the price dropped a couple months later in response to the competition.
medi03 - Wednesday, June 1, 2016 - link
The 980 Ti was NOT "at least as good" as dual 390s, what are you talking about?
TheinsanegamerN - Wednesday, June 1, 2016 - link
It certainly was when CrossFire didn't work, which was pretty common for new games until the profiles got updated.
atlantico - Wednesday, June 1, 2016 - link
980ti "at least as good as dual 390s" when only one 390 is working, lol, great logic! You're aiming too low; a single 980ti is at least as good as over 9000 CrossFire 390s when only one of them is working.
lulz
Impulses - Friday, June 3, 2016 - link
Looking through old reviews, the 980 Ti certainly seems to trade blows with CF R9 290s; in some games avg fps is lower but the minimum is equal, in some it's better, and in others it's worse.
Dunno, looks pretty close to me, so "at least as good as" may not be entirely accurate (that could be interpreted as slightly better than), but the point stands... We're looking at vastly different levels of performance.
I've got zero interest in fanboy debates, the RX 480 looks great if you're price sensitive and only interested in 1080p gaming... But as an upgrade path from my CF R9 290 it's decidedly uninteresting.
I'm not someone that's gonna go the opposite extreme and blow over a grand on GPUs either, and like I said, I probably WILL wait for AMD's real next gen flagship before upgrading.
GTX 1070 SLI looks mighty tempting in the meantime tho...
jabber - Wednesday, June 1, 2016 - link
I really don't get the huge excitement among people wanting to look like total dorks wearing a VR headset. I just think of that episode of Red Dwarf showing them galloping on a horse... I think watching people use VR will be far more entertaining than using it.
rhysiam - Wednesday, June 1, 2016 - link
What, because people look totally cool sitting in front of a screen with KB + mouse in hand, shouting things like "grab an inhib and baron before they respawn" into a ludicrously oversized gaming headset? That all appears utterly ridiculous to non-gamers, but we still do it because the experience is fun enough to pull us in. VR will go the same way. If (and that's still very much an "if" at this stage) the experience is good enough, who cares what you look and sound like?
jabber - Wednesday, June 1, 2016 - link
No, that can look pretty silly too, but VR headsets take it to a whole higher level of nerdism and dorkiness.
mapesdhs - Friday, June 3, 2016 - link
I don't think it'll be as appealing in the same way either: a bunch of pals getting together for a gaming night, that sort of thing.
cocochanel - Wednesday, June 1, 2016 - link
Speak for yourself. VR is the future. Headsets will come down in price, size and weight. Imagine playing Battlefield 1 in VR. Or jumping into a spaceship and touring the Milky Way. Way to go!
mapesdhs - Friday, June 3, 2016 - link
Spaceship? Explain to me how I'm supposed to see all my controls for ED if I have an HMD stuck on my face? One can get by to some extent with voice recognition, but there are limits.
Meteor2 - Saturday, June 4, 2016 - link
Pass through cameras.fanofanand - Thursday, June 2, 2016 - link
You think that's bad, check out the backpack PC's they are making for VR.jabber - Friday, June 3, 2016 - link
Yeah its 3D TVs all over again. By 2020 it will all be gone again for another 20 years.tipoo - Wednesday, June 1, 2016 - link
Curiously missing was an answer to SMP, Nvidia's technology for cutting the overhead of rendering a scene twice (once per eye) down to almost nothing. If this $200 card has a VR focus, a feature like that is absolutely killer.
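For anyone wondering where that saving comes from, here is a rough conceptual sketch in Python. The names and relative costs are illustrative assumptions, not Nvidia's actual API; the point is that geometry work (vertex shading, tessellation) runs once and is projected to multiple viewports, so only rasterization repeats per eye.

    # Illustrative sketch only: names and cost figures are made up.
    GEOMETRY_COST = 1.0   # relative cost of vertex/tessellation work for one pass
    RASTER_COST = 0.6     # relative cost of rasterizing/shading one eye's view

    def two_pass_stereo():
        # Naive VR rendering: run the full pipeline once per eye.
        return 2 * (GEOMETRY_COST + RASTER_COST)

    def single_pass_multi_projection(views=2):
        # SMP-style: process geometry once, replicate it to multiple
        # projections, and pay only the rasterization cost per view.
        return GEOMETRY_COST + views * RASTER_COST

    print(two_pass_stereo())               # 3.2
    print(single_pass_multi_projection())  # 2.2 -- the geometry cost is paid once

The bigger the share of frame time spent on geometry, the bigger the win; the exact ratio depends entirely on the workload.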
tarqsharq - Wednesday, June 1, 2016 - link
I don't think that AMD has revealed all the interesting features and changes with Polaris and GCN 4. They just wanted to get their foot in the door and make sure the public knew about a low-price option before buyers started breaking their piggy banks to get the 1070 in a week or so.
Joseph_Crox - Wednesday, June 1, 2016 - link
Here is a paragraph from an Ars Technica article: http://arstechnica.com/gadgets/2016/05/nvidia-gtx-...
"Three-way SLI? You'll need a code for that
For even more performance, there's SLI. Those hoping that Nvidia's rapidly ageing SLI technology for linking two or more graphics cards together would die out with Pascal will be disappointed—but, on the flip side, at least there are a few improvements.
For starters there's a new jaunty, high-bandwidth SLI bridge—or rather, there are three of them depending on your motherboard spacing. They are designed to link just two GTX 1080s together at a higher 650MHz speed (versus 400MHz) by using the second SLI connector traditionally reserved for three- or four-way SLI configurations. Nvidia claims that the new bridge results in less stuttering, although without a bridge or second 1080 to test, we'll just have to take Nvidia's word for it for now.
An odd side effect of the move is that, for the first time, Nvidia is officially recommending users go with a two-way SLI configuration. Those running 4K or monitor surround should use a HB Bridge. SLI has never scaled all that well past two cards—and four-card solutions are pretty much just for show in games—so this isn't all that surprising."
What if you want to buy 3 GTX 1070s? You can't do that. But you can run 4 RX 480s and have all the power of CrossFire and asynchronous compute (i.e. GCN 4.0). Which GPU will be worth something in 5 years?
fanofanand - Wednesday, June 1, 2016 - link
How many people do you suppose are going to run 4 x RX 480? I understand what the article is saying, but I have never heard of anyone going tri or quad SLI with midrange cards.
LuxZg - Wednesday, June 1, 2016 - link
So it will have DisplayPort 1.4 after all
ajlueke - Wednesday, June 1, 2016 - link
This is really an exciting time. If AMD's promises hold, and they really can deliver R9 390X-range performance at 150W for $199, that is a great accomplishment. It gives me hope that the APU era will finally deliver on some of its promises. A Zen APU that can game acceptably at 1080p in a small ITX box like a NUC. With NVMe drives available, you don't need to waste space on drive bays; you can stick those in your NAS. But the APUs to date have always fallen a little short. If Polaris is what they claim it is from a performance and efficiency standpoint, and Zen is as well, I think we'll finally get the APU offerings that make low/mid-range discrete cards obsolete, and consoles as well. Discrete video card gaming will then be relegated to VR and 4K+ setups for those who really want to spend the money and push technology to the bleeding edge.
However, even in that space the future looks bright, as Vega is coming by year's end and NVidia is moving to HBM2 by early next year. Imagine designing a VR box with an Nvidia-based HBM2 board. Can they get Pascal onto a board the size of the Fury Nano? Or smaller? It's so energy efficient already, I'm betting they can.
spaceholder - Wednesday, June 1, 2016 - link
I noticed in the pictures connectivity is all HDMI and DisplayPort. As DVI-D is the only way to get 144Hz 1080p, I hope the production cards have a port. Otherwise the hardcore FPS players will suffer.
JoeyJoJo123 - Wednesday, June 1, 2016 - link
>As DVI-D is the only way to get 144Hz 1080p
Oh, the absolutely wrong and misinformed things people post on the internet...
Oh, by the way, you're wrong. HDMI 2.0 and DisplayPort 1.4 can carry plenty more bandwidth than DVI-D and are more than capable of 144Hz at 1080p, and then some.
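The back-of-the-envelope numbers bear this out. A quick sketch, with nominal effective link rates and an assumed blanking overhead (both approximations):

    # Effective data rates in Gbit/s after encoding overhead (nominal figures).
    links = {
        "Dual-link DVI-D": 2 * 165e6 * 24 / 1e9,  # 330 MHz pixel clock cap -> ~7.92
        "HDMI 2.0":        18.0 * 8 / 10,         # 18 Gbps TMDS, 8b/10b -> 14.4
        "DP 1.3/1.4 HBR3": 32.4 * 8 / 10,         # 32.4 Gbps raw, 8b/10b -> 25.92
    }

    def needed_gbps(w, h, hz, bpp=24, blanking=1.1):
        # blanking=1.1 approximates reduced-blanking timing overhead
        return w * h * hz * bpp * blanking / 1e9

    req = needed_gbps(1920, 1080, 144)  # ~7.9 Gbps
    for name, cap in links.items():
        verdict = "OK" if cap >= req else "too slow"
        print(f"{name}: {cap:.1f} Gbps vs ~{req:.1f} needed -> {verdict}")

Dual-link DVI only just squeezes 1080p144 in; HDMI 2.0 and DP 1.3+ clear it with room to spare for higher refresh rates or resolutions.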
mdriftmeyer - Wednesday, June 1, 2016 - link
Not to mention they are HDR standards compliant: http://www.vesa.org/featured-articles/vesa-publish...
spaceholder - Thursday, June 2, 2016 - link
Most Google searches (and redditors) disagree: https://www.reddit.com/r/pcmasterrace/comments/2hc...
Maybe the latest and greatest HDMI supports it, but most video cards, GPUs and monitors from more than a year ago don't. Unless you want to game at 720p.
As I said, 144Hz @ 1080p + FreeSync isn't going to happen over HDMI. DisplayPort maybe; my GPU doesn't have one though =/
JoeyJoJo123 - Thursday, June 2, 2016 - link
Did you read? HDMI 2.0, not 1.4. TWO POINT ZERO.
Valantar - Saturday, June 4, 2016 - link
Both HDMI 2.0 and DP 1.3 (not to mention 1.4) have higher bandwidth than DL-DVI. Also, there are adapters aplenty for DP to whatever you want. Have an old 144Hz DVI monitor and want a new GPU? Get an active DP-DVI adapter. I for one applaud AMD for pushing out obsolete standards. The sooner, the better. We're still not rid of VGA, after all.
alawadhi3000 - Wednesday, June 1, 2016 - link
The R9 390 and 390X have 6Gbps VRAM.
spaceholder - Wednesday, June 1, 2016 - link
As someone who usually buys GPUs at the ~$150 price point and holds them for ~3 years, this has my attention. The fact that it's finally a new architecture offering a 100% or so improvement over the last product to hold that price point makes it OK to spring for the $200 part. I was getting sick of the last 5 or so years of reviews saying "This isn't a new architecture, AMD just tweaked the old one, added 10% performance and x/y features. It still suffers from high TDP and isn't really competitive with Nvidia except on the low end".
Hopefully Nvidia responds in kind and the new generation of GPUs is here. Since I have a FreeSync monitor I'll be getting one ASAP.
Teknobug - Wednesday, June 1, 2016 - link
A single 6-pin, that's a good thing. If the price remains at that I'm getting 2 of them for CrossFire.
roy2115 - Wednesday, June 1, 2016 - link
I keep hearing people say that the GTX 1070/1080 are overkill for 1080p gaming. What about in a few years? Are we at a point where graphics improvements at 1080p resolutions have reached a ceiling? If not, then I want a GTX 1070 or AMD's next offering above the RX 480, because I want to play maxed-out settings at 1080p without having to upgrade every couple of years.
Jerion - Thursday, June 2, 2016 - link
Sort of. We're at a point where the newest crop of GPUs has and will entirely outstrip the 1080p performance floor established with this generation (i.e. anything at or above the power of a GTX 950 will get your foot in the door with games at 1080p), to almost comical degrees. The 1080 can push 45 frames per second in a variety of games at 4K resolution. In any given game where it pushes 45 fps at 4K, that can be roughly translated into 180 frames per second at 1080p. That's so far beyond overkill that it is simply ridiculous, and there is so much headroom that games in a few years still won't come close to noticeably munching on that number. If you plan to stick with a 1080p display for the next few years, the GTX 1080 is a waste of money and you would be better off saving cash and going with something in the new midrange, such as the RX 480, or the inevitable RX 470 and GTX 1050/1060. The only reason to buy a truly high-end, $300+ GPU from this new generation is if you have a 4K or 1440p/144Hz display. And if you don't have one of those things, your bank account will thank you. :)
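The 45-to-180 figure is just pixel arithmetic. A minimal sketch, assuming a purely GPU-bound workload (real scaling is rarely this clean, since CPU and per-frame fixed costs don't shrink with resolution):

    # Rule of thumb: GPU-bound fps scales inversely with pixel count.
    px_4k = 3840 * 2160       # 8,294,400 pixels
    px_1080p = 1920 * 1080    # 2,073,600 pixels

    scale = px_4k / px_1080p  # exactly 4.0
    fps_4k = 45
    print(fps_4k * scale)     # ~180 fps at 1080p in the ideal case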
Rocket321 - Wednesday, June 1, 2016 - link
At $199 this is an automatic buy. I have been hoping to hear about green team's GTX 1060 plans, but if those are months away I will be moving to team red. I hope AMD has the supply to cover the market.
rav55 - Wednesday, June 1, 2016 - link
You folks at Anand are SUPPOSSED to be pretty smart. So why do media sites benchmark NVidia's latest-tech 16nm, 1.6GHz GTX 1080 using benchmarks from the OBSOLETE DX11 API?
Well the ONLY DX12 benchmark that ran on the GTX 1080 showed that the GPU was BROKEN running DX12.
The best that the 16nm, 1.6GHz GTX 1080 could do against the 28nm, 1.05GHz AMD Fury X was an 11% improvement!!!
That's it? ALL that new tech for ONLY 11% over last year's 28nm AMD Fury X?
GTX 1080 is BROKEN in DX12.
fanofanand - Wednesday, June 1, 2016 - link
Tinfoil-hat-wearing internet commenter uses incorrect spelling in a statement questioning the intelligence level of the author. Internet commenter then uses all caps to make sure everyone skims past the comment. Internet commenter then proceeds to make several posts in a row lamenting the same thing. Internet commenter is brilliant, even if only in his own mind.
rav55 - Wednesday, June 1, 2016 - link
Hey ANAND TECH
Why aren't you running 3DMark's DX12 draw call overhead feature test?
Is it because that test shows just how BAD Intel and NVidia graphics silicon really is?
You have the benchmark and you ran these tests last year.
Let's see how AMD crushes Intel and NVidia in draw calls again.
jwcalla - Wednesday, June 1, 2016 - link
Nobody cares about draw calls.
Valantar - Saturday, June 4, 2016 - link
Yep, the maximum number of draw calls has very little relation to actual performance in real games. This might change over time, but it's not very likely.
rav55 - Wednesday, June 1, 2016 - link
The AMD Radeon 480 is quite clearly a binned GPU, just like the GTX 1070 is a binned GPU. But it's also a better DX12 performer than the 1070. And at $199, one wonders just how 2 of them stack up.
The Radeon 480 is meant to eat NVidia's mid-price-point lunch while reserving the real meal for an October release and the Christmas market.
Knowing that AMD's high-end HBM2 Vega is coming out well before Christmas is going to slow sales of the GTZzzzzz 1080.
First use the Radeon 480 to kill the mid-price-point, high-margin AIB market, then crush Christmas sales with a new high-end 8 GB HBM2 Vega 10.
And AMD GPUs aren't broken running DX12 like NVidia silicon is.
Valantar - Saturday, June 4, 2016 - link
Sure, it's binned. As in "chips without defective CUs that perform to spec." All chips are binned. What are you getting at?
Totally - Wednesday, June 1, 2016 - link
Since AMD usually goes with 16 SPs per texture unit, wouldn't the total count be 144 texture units for this card?
ajlueke - Wednesday, June 1, 2016 - link
I think taking different approaches is an overall win for PC gaming. If you look at the GTX 1070, which offers the same performance as the GTX 980 Ti ($649) for only $379 retail, gamers are getting the same performance at 58% of the cost only one year later. Similarly, if Polaris does indeed perform near an R9 390X ($429) for only $199, that gives us the same level of performance for 46% of the cost just one year later.
And for those with deeper pockets, they can pick up a GTX 1080 and get performance that simply didn't exist a year ago for a price below the high end of last year, and rest comfortably knowing that the card will last a long time. Even if Vega and the HBM2 Pascal wind up significantly faster, neither will be out for at least 6 months, and they will likely carry the crazy price premiums of top-end cards.
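Those percentages check out; a trivial sketch using the MSRPs quoted in the comment above (and assuming roughly equal performance within each pair):

    # Price ratios at (assumed) roughly equal performance within each pair.
    pairs = {
        "GTX 1070 vs GTX 980 Ti": (379, 649),
        "RX 480 vs R9 390X":      (199, 429),
    }
    for name, (new, old) in pairs.items():
        print(f"{name}: {new / old:.0%} of last year's price")  # 58%, 46%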
Meteor2 - Saturday, June 4, 2016 - link
Great comment. That's a good way to think about it. Part of me wishes the same was happening on the CPU side, but beyond video encoding, not many consumer use cases appear CPU-limited.
jhayr - Wednesday, June 1, 2016 - link
IMO AMD's RX 480 sounds good for the price. I might purchase this GPU since I'm not really gaming in 4K.
Murloc - Thursday, June 2, 2016 - link
I'd really wait for the reviews...
Meteor2 - Saturday, June 4, 2016 - link
Which take about a week.
Archetype - Wednesday, June 1, 2016 - link
OK, so I absolutely cannot wait to hear about the 490. It's coming. And AMD must have limited frame rates via Crimson, and maybe even graphics settings (if I'm to judge by what I saw), to showcase 50% utilisation on that dual setup running Ashes. My prediction for 29 July...
350: $100
360: $150
480: $200 (4GB)
480: $230 (8GB)
490: $270 (4GB) *
490: $300 (8GB) *
* I cannot be remotely sure - they said the range they are aiming for is $100 - $300 and I am basing my estimates entirely on that.
Archetype - Wednesday, June 1, 2016 - link
*450, 460... doh
Mugur - Thursday, June 2, 2016 - link
There won't be any 490 with 4 GB, just like there won't be any 1080 with 4 GB.
3ogdy - Wednesday, June 1, 2016 - link
I for one could sit here and listen to AMD, Intel, nVidia, Sony, Dell, Discovery Channel and God himself introduce a brand spanking new generation of products designed to blow me away with their innovation and performance for much less money than ever. None of this stuff means a damn thing until I read multiple reviews online, watch videos of the actual products performing live, and all that. Even Jesus could come on stage with a GTX 9080 128GB HBM50 that can make Mars go round. If there's no hard data and no actual reviews of that specific piece of hardware, the money comfortably stays in my pocket, AMD.
Meteor2 - Saturday, June 4, 2016 - link
...?
ianmills - Saturday, June 4, 2016 - link
May Jesus bless your soul, young son. Pray to him and he will give you graphic power beyond your wildest desires!
Native7i - Wednesday, June 1, 2016 - link
I'll buy a 480, or maybe a 490 if available on the 29th. A GTX 1080 costs 120,000 in my country (local currency), which roughly converts to 1,000 dollars because of various taxes. The 480's price seems bearable atm.
Valantar - Saturday, June 4, 2016 - link
Given that AMD hasn't so much as mentioned an RX 490, that's definitely not showing up on the 29th. Vega is rumored to be pulled in to this fall (from an early winter/early next year timeframe), but you're not getting a 490 before then. From all that we know, the RX 480 is a full Polaris 10 chip, which means we need Vega for more performance.
mdriftmeyer - Wednesday, June 1, 2016 - link
$229 for the 8GB model. It'll be hard for it to stay on shelves. Definitely worth buying two.
poohbear - Thursday, June 2, 2016 - link
Hopefully on June 29th Anandtech will actually post an article about the 480... still waiting for the GTX 1070 review...
HighTech4US - Thursday, June 2, 2016 - link
Or the MIA GTX 1080 review
fanofanand - Thursday, June 2, 2016 - link
Or the 960
idris - Thursday, June 2, 2016 - link
Where are they, Ryan? Any indication when the 1080 review will be out? Thanks..
Meteor2 - Saturday, June 4, 2016 - link
Judging by the volume of comments, GPUs are the most interesting pieces of tech around. But Anandtech seems incapable of reviewing them, much like smartphones.
Ranger1065 - Tuesday, June 7, 2016 - link
Like many others I'm sure, I've waited too long and read so many other reviews that Anandtech's is sure to be an anticlimax. What I didn't fully realise until recently is that this is a broader problem affecting not only the GPU section. I feel this is something that Anandtech needs to take seriously, as they seem to be alienating a significant number of their readers. Cue the Anandtech faithful defence...
catavalon21 - Thursday, June 2, 2016 - link
+1
scaramoosh - Thursday, June 2, 2016 - link
People always gotta bitch about something; I think this is great.
Notmyusualid - Sunday, June 12, 2016 - link
I got the nuts & popcorn before I sat down to read it all too!
catavalon21 - Thursday, June 2, 2016 - link
Any thoughts on double precision performance? I haven't seen a comprehensive Pascal review to see if it improves NV's relatively anemic double precision performance compared to single precision, but I think I have seen that it's still 1:32 (double to single). Later generations of AMD cards also decreased DP FP performance relative to SP. Is this still going to be in the 1:24 - 1:32 range, or may we see something better?
mapesdhs - Friday, June 3, 2016 - link
What tasks are you thinking of which need FP64 but don't need reliable ECC RAM? NV has clearly decided that it's not worth offering partial FP64 support (like the Titan) in consumer-grade cards. It's ironic that a Quadro 4000 actually has a higher FP64 rating than the latest M6000.
catavalon21 - Friday, June 3, 2016 - link
I spend some time running Milkyway@home (BOINC). With that I have seen the migration from very strong DP performance (at the time) on AMD cards a few years ago toward weaker "ratios", whereas NV has remained relatively consistent at 1:32. Just curious if that was still the case. MW@H requires a double-precision-capable GPU.
Meteor2 - Saturday, June 4, 2016 - link
Compute.
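To put those ratios in absolute terms, here is a quick sketch. The peak FP32 figures are approximate public numbers and the ratios are the vendor-stated ones; treat the results as ballpark:

    # Approximate peak FP32 rates (TFLOPS) and vendor DP:SP ratio divisors.
    cards = {
        "HD 7970 (2012)":   (3.79, 4),   # AMD's strong-DP era
        "R9 Fury X (2015)": (8.60, 16),  # the ratio weakened in later GCN
        "GTX 1080 (2016)":  (8.87, 32),  # NV consumer cards stay at 1:32
    }
    for name, (fp32, divisor) in cards.items():
        print(f"{name}: ~{fp32 / divisor:.2f} TFLOPS FP64")

By this math a four-year-old 1:4 card still out-runs a brand-new 1:32 card in pure FP64, which is exactly why DP-heavy projects like MW@H care so much about the ratio.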
Chaitanya - Friday, June 3, 2016 - link
In that last photo in the gallery there seems to be a connector of some sort visible on the PCB; anyone have an idea what that is for?
CountryBumkin - Friday, June 3, 2016 - link
Is this card supporting HDCP 2.2? I know it supports HDMI 2.0a.
Thanks
JoeyJoJo123 - Friday, June 3, 2016 - link
Most likely yes. Regardless, any HDCP stripper will make the HDCP version support stuff a moot point.
versesuvius - Friday, June 3, 2016 - link
And with Polaris AMD has narrowed the gap with NVidia from two years to one year, and with Zen the gap with Intel from 4 years to 3 years. That is progress and you cannot knock it, but exactly how NVidia is going to react to this is already obvious: the 1060 is going to be out around the middle of July with a TDP of 130 watts and a little better performance than the 480X, priced at $210. By the time Vega is out the door, the 1080 Ti will already comfortably be the fastest and highest-performing GPU for the next year and a half. I was hoping against hope that AMD was going to actually announce Vega, or at least give the specs for it, but then again AMD lived up to the reputation it has acquired as of late and just stayed behind both NVidia and Intel. Still, that is progress. Congratulations AMD. And congratulations to Intel and NVidia too. They won't have to worry about a visit by the antitrust commissioners for now.
medi03 - Saturday, June 4, 2016 - link
Gap? What gap? Jesus, people.
The 980 Ti loses to the Fury X at resolutions starting from 1440p.
Comparing the situation to the CPU world, where Intel (STILL, even after bogus 14nm by Samsung) is years ahead process-wise, is ridiculous.
Valantar - Saturday, June 4, 2016 - link
Sorry, but you're a bit off here. Sure, AMD has been a bit behind Nvidia for the last generation - they lost badly on the failed 20nm process, while Nvidia (with their far larger resources) were able to pivot better. Then again, they're already close to closing the efficiency gap - within one generation! - while still competing on both price and performance. Also, I believe Nvidia got very, very lucky with the way Pascal clocks on 16nm, giving them better performance than expected - otherwise, they wouldn't be making a second-tier (soon to be third when the 1080 Ti rolls out) GPU that beats their previous flagship. We don't know anything about the clocks of GCN 4 yet, but they're probably not that high. Still, AMD can gain a lot of performance just from driver improvements, which they have been delivering steadily for the last year. And value like the RX 480 is exactly where they need to deliver to stay relevant. Sure, flagships are image building, but they don't keep your company afloat.
Also, there is no way the GTX 1060 is at 130W - the 1070 is 150W with 1920 cores at >1.7GHz. If clocks are equal, that would give the GTX 1060 ~1650 cores at 130W, which would be neither smart (it would cannibalize 1070 sales like crazy) nor possible: the GTX 1080 has 4 GPCs (2560 CUDA cores) and the 1070 has 3 (1920 cores), which would put the 1060 at... 2.5? That makes no sense when the 1060 is supposed to be based on a separate, smaller chip. Thus, 1280 cores (2 GPCs) is far more likely, as Nvidia designing a whole new GPC for "small Pascal" would be ludicrously expensive and economically inefficient (not to mention the nightmare of yet another level of driver optimization). And 1280 cores wouldn't need 130W unless they ran at 2+GHz - which would be dumb in terms of cooling, stability and warranty/RMA costs.
The GTX 1060 will in all likelihood be around ~100W, and will probably perform similarly to, if not slightly below, the RX 480. It will sell like hotcakes (it's going to be a midrange Nvidia GPU, after all), but AMD might have a winner here in terms of raw performance and price. We'll see.
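The GPC arithmetic above is easy to check; a small sketch using the GP104 figures quoted in the comment (the 1060's configuration was unconfirmed at the time, so the last two lines are inference):

    # GP104: GTX 1080 = 4 GPCs / 2560 cores, GTX 1070 = 3 GPCs / 1920 cores.
    CORES_PER_GPC = 2560 // 4    # 640 cores per GPC

    print(1920 / CORES_PER_GPC)  # 3.0   -> the 1070 is a clean 3-GPC cut
    print(1650 / CORES_PER_GPC)  # ~2.58 -> not a clean GPC count
    print(1280 / CORES_PER_GPC)  # 2.0   -> a plausible 2-GPC chip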
Ananke - Monday, June 6, 2016 - link
I highly doubt NVidia will sell the 1060 for less than $279, and of course that will be non-existent until the $350 "Founders" edition is sold out... My point: don't expect anything "gaming"-usable from NV for less than $300 this round.
SeanJ76 - Sunday, June 5, 2016 - link
More FAIL by AMD!!
cocochanel - Monday, June 6, 2016 - link
What happens when one company brings out a product that undercuts another company's profit margins? The usual rant. And it usually comes from folks pretending to voice their own independent and unbiased opinion. In reality, and leaving the fanboys aside, these folks have a financial stake of some sort. And they don't like what they see.
Just Legendary - Sunday, June 5, 2016 - link
You people are idiots. NVIDIOT would launch a so-called state-of-the-art GeForce GTX 2590 probably 3 or 4 months from now and price it at $35,575 USD and you losers would run and buy it anyway
Notmyusualid - Sunday, June 12, 2016 - link
I might. I've dumped tens of thousands of dollars into hi-fi over the years, and am used to paying for the best.
But... I am pretty much brand agnostic. If Krell make the best amp, and Stereophile is able to explain & SHOW me why, then that's it - I'll take two servings of Krell, thanks. If Halcro happen to leap-frog them in any tangible manner, and I'm feeling flush at the time, then Halcro it is.
The point is: fanboyism is UGLY, childish, and quite disturbing, because it comes mostly from adults displaying highly introverted personalities who, I think, have no real life.
I mean, who gives a sh1t about $100 when spending HUNDREDS of them on a GPU? Not me, if it's for good reason.
Anyway, I'm done. Flame away; I've got six Sapporo to sit back and read it with...
Jtsuishiro - Tuesday, June 7, 2016 - link
Anyone notice that the actual PCB of the card is like 2 inches shorter?
Silver Streak - Thursday, June 9, 2016 - link
I have hopes for the Radeon RX 490 at maybe about $279.
Frihed - Friday, June 10, 2016 - link
Why don't I see any comment on the GTX 1070 also having a 150W TDP, and whether it means anything for AMD on the high-end side of things? Will they be able to make something to compete against a 1080 Ti?
stardude82 - Friday, June 10, 2016 - link
Vega is due in October, which might actually be "Fury" at the top, cutting down to the 490/480X. Polaris 11, coming out soon, will round out the low end.
Bonez0r - Monday, June 13, 2016 - link
Just an FYI: it's FLOPS or flops, not FLOPs. Floating point operations per second. One FLOPS, two FLOPS. :)