Yeah, we also 'skipped' a generation, so it will be an even bigger bang... And with how old 28nm is, it should be a more mature process with better yields, so these prices look even more out of control.
Even with a mature process, producing a 601 mm^2 chip isn't going to be easy. The only larger chips I've heard of are ultra-high-end server processors (18-core Haswell-EX, IBM POWER8, etc.), which typically go for several grand apiece.
Heh, I guess you don't normally shop in this price range or haven't been paying very close attention. $650 is getting back to Nvidia's typical flagship pricing (8800GTX, GTX 280), they dropped it to $500 for the 480/580 due to economic circumstances and the need to regain marketshare from AMD, but raised it back to $650-700 with the 780/780Ti.
In terms of actual performance gains, the prices are certainly justified. You could just as easily be paying the same price or more for 28nm parts that aren't any faster (stay tuned for AMD's rebranded chips in the coming month).
AMD will launch the HBM card in the 400 series. The 300 series is an OEM-only series. ... just like ... wait for it ... nVidia's 300 series. WOW, talk about unprecedented!
AMD already used that excuse...for the...wait for it...8000 series. Which is now the...wait for it....R9 300 OEM series (confirmed) and Rx 300 Desktop series (soon to underwhelm).
780 was $650 at launch actually, and stayed there for some 6 months until AMD launched the 290X. The only way Nvidia will drop price on the 980Ti is if Fiji is both faster than it and priced similar, and even then Nvidia may not touch a thing.
I think Nvidia knows what AMD has and they've already set the price points for AMD so that they won't have to change their pricing no matter what.
"The only way Nvidia will drop price on the 980Ti is if Fiji is both faster than it and priced similar" ...and given themselves a month's head start ...and AMD a whole month too look at this Ti and adjust accordingly (if even necessary).
I think it's nVidia who's looking weak here. In the UK, Scan/Overclockers are really low on AMD stock from the 290 through to the 290X... big launch coming?
Yes, I'm sure it was AMD taking a position of strength to allow Nvidia to completely dominate the dGPU landscape for the last 9 months, unopposed, unchallenged since the GTX 970/980 launch, followed by the GTX 960, GTX Titan X, and GTX 980Ti. Makes perfect sense.
Everyone expects a big launch from AMD sure, but I guess they are just waiting for Nvidia to tire themselves out first. I mean a quick scan in the US shows you can't find the GTX 980Ti anywhere, sold out instantly in a single day. AMD is just biding their time though for something big to pop out of a hole in the ground! :)
That would be a shame. It's been a long time since I pulled the trigger on my upgrade itches. Never thought it would be this long. (GTX 480 tri-SLI at $399 with the StarCraft release discount coupon.)
I think I assumed that the rise in pricing after gtx 580 was just going to be a short term fluke and that the world would return to sanity with resumption of the $499 pricing.
I imagine I would probably have upgraded twice otherwise. I wonder if my demographic adds up to a significant lost market at the end of the day?
On the bright side... having waited this long, if I start buying used in two years, the performance gain will be worth the loss of warranty considering the high cost.
"Just wait for AMD's release and the price will have to drop."
Exactly why all those Nvidia fanboys should shut their mouths:
If they were to succeed in maligning AMD to death, handing Nvidia a monopoly, Nvidia will have not only lost any reason to ever drop prices, they'll also lose any reason to rush new gaming cards out. They'd put consumer cards at the very back of the queue, to be released at Nvidia's convenience - because Nvidia gets better margins from HPC sales.
Their, they're, there - no call for typo nitpicking. He who is free of guilt and all. As far as the 28nm bit, it's probably more of a well-poisoning comment by someone who leans towards the other camp.
He was highlighting incorrect grammar, not a typo. Btw, that should have been "it's", not "its", for the same reason. ;D My old English teacher would have used bullets for such errors if such were allowed...
lower nm = more transistors in the same space = more performance, and possibly reductions in heat and power usage. There is only so much they can do on the 28nm node, which is why the next generation will likely utterly massacre the current one. 4K will be feasible at a reasonable price point.
Transistor count means nothing. The GTX 780 Ti has 2.8 billion transistors. The GTX 980 has around 2 billion transistors, and yet the GTX 980 can dance with the GTX 780 Ti in performance.
As the saying goes... it's not the size that matters, only how you use it.
In this very article they list the transistor count of those two cards in a giant graph. The 980 has 5.2 billion transistors and the 780 Ti 7.1 billion. Still, your point is the same: they got more performance out of fewer transistors on the same manufacturing node. All 28nm means is how small the gap is between identical components, in this case the CUDA cores. Each Maxwell CUDA core is clearly more efficient than each Kepler core. Also helping is the double VRAM size, which probably allowed them to also double the ROP count, which greatly improved transistor efficiency and performance.
The difference now is that there are actually 20 nm products on the market today, just none of them are GPUs. It seems that without FinFET, 20 nm looks to be optimal only for mobile.
@SirMaster - The reason people care about the process node is that right now - in mid-2015 - this is an extremely mature (ie: old but well-rehearsed) manufacturing process, which has gone through several iterations and can now yield much better results (literally) than the original 28nm process. This means that it's much cheaper to produce because there are fewer defective parts per wafer (ie: higher yield). Hence ComputerGuy2006 saying what he said.
Contrary to what other people say, "smaller nm" does NOT imply higher performance. Basically, when a shrink comes along you can expect manufacturers to do one of two things:
a) higher transistor count in a similar die size, with similar power characteristics compared to its ancestor - and therefore higher performance
b) same transistor count in a much smaller die size - therefore better thermals/power characteristics
Neither of these factors in architectural enhancements (which sometimes are not that transparent, due to their immaturity).
So ComputerGuy2006 is absolutely right. Nvidia will make a killing on a very mature process which costs them a below-average amount of money to manufacture.
In this case Nvidia is using "defective" Titan X chips to manufacture 980 Ti. Simple as that. Their Titan X leftovers sell for $350 less and you still get almost all the performance a Titan would give you.
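To put some rough numbers on the big-die / mature-process point, here's a back-of-envelope sketch using the simple Poisson yield model. The defect densities below are made-up illustrative values, not anything TSMC or Nvidia has published, and the die-per-wafer formula is the usual crude approximation:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Rough gross die count: wafer area / die area, minus an edge-loss term."""
    wafer_area = math.pi * (wafer_diameter_mm / 2) ** 2
    edge_loss = math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2)
    return wafer_area / die_area_mm2 - edge_loss

def poisson_yield(die_area_mm2, defects_per_cm2):
    """Poisson yield model: fraction of dies with zero random defects."""
    return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

gm200_area = 601  # mm^2, as quoted in the review
for d0 in (0.30, 0.10, 0.05):  # hypothetical defect densities, defects/cm^2
    y = poisson_yield(gm200_area, d0)
    good = dies_per_wafer(gm200_area) * y
    print(f"D0={d0:.2f}/cm^2 -> {y:.0%} yield, ~{good:.0f} fully good dies per 300mm wafer")
```

Even with generous assumptions you get well under 100 die candidates per wafer at this size, which is why a mature, low-defect process matters so much for a 601mm^2 chip - and why harvesting the not-quite-perfect dies into a cheaper SKU like the 980 Ti makes sense.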
Don't mix up traditional rebadging and what AMD is doing with their upcoming lineup; it's unprecedented. It's also worrisome, and it's clearly a big problem for AMD. What does it say about the state of the company and their R&D budget that they'll only have 1 new GPU this coming generation? One of the big problems AMD is currently facing, amongst many other things, is their profit margins. In order to compete they're selling larger, more expensive GPUs attached to more complex memory interfaces relative to the competition, and that won't change much this coming generation as a result of these top-to-bottom rebadges. The situation is really becoming quite analogous to their CPUs, which should raise alarms for any informed enthusiast. It's a less than ideal situation, to put it lightly.
If you had posted this after their launch then maybe I'd understand. As it is, their chips are almost certainly not rebadged, because they aren't on TSMC and will include hardware improvements. They may be based on Hawaii or w/e, but not even Nvidia managed to put out more than 2 new cards during their major launch last year. Considering this is likely the last year of 28nm launches, it may make sense for them to put out an entire line of modified chips and be done with the 300 series. Keeping it fresh till next year when they can do 2-3 on a smaller process.
"If you post this after their launch then maybe I'd understand." You realize I could just give the exact same response to the rest of your comment. But I won't.
"As it is their Chips are almost certainly not rebadged because they aren't on TSMC and will include hardware improvements.", That's interesting, I haven't heard anything about AMD switching fabs for the 300 series. Source?
"They maybe be based on hawaii or w/e but not even nvidia managed to put out more than 2 new cards during their major launch last year." Not quite a straight comparison there, since Nvidia also launched a new architecture with the 750Ti. Thus far we've gotten 4 new GPU's (not just cards) based on Maxwell. And we're not just talking launch here. All indications point to Fiji as the only new GPU of the coming generation for AMD. And what's more, it might not even be an updated architecture. The rest of the lineup will likely be refreshes of Hawaii, Tonga, Pitcairn, and Bonaire, ranging from GCN 1.2-1.0, with at most manufacturing revisions to improve efficiency. Again it's very important to put this in perspective, in the context of what these GPU's will be competing against. They range in feature set, and in power efficiency, none of which is anywhere close to par with Maxwell, and it's going to be very difficult for AMD to compete with this lineup for another generation. It's not a good situation for AMD. "Keeping it fresh"? How you came to spin your conclusion the way you did is beyond me.
It's not a GPU architecture. GCN is a GPU architecture, Maxwell is a GPU architecture. Fiji is a GPU, GM204 is a GPU. This isn't exactly a new paradigm we're dealing with here. Oh dear, I think I might be telling someone something (aka trolling***). I've done it again.
First, that's a terrible analogy. Puma is not a CPU, it's a CPU architecture, successor to Jaguar. Fiji is a GPU, I never once assumed or suggested that there will be a single Fiji SKU (that was all you), right now it's likely there will be 2 for the consumer market.
I'm honestly not sure what you meant by "segment", perhaps you could clarify? Are you talking about AMD's XT/PRO convention? They're still the same GPU, pro is typically just a harvested XT.
What do you mean mobile GPUs? Are you talking about the 900M series? There are no mobile-specific GPUs in that lineup. It's all binned GM204 and GM107 SKUs.
Interesting observation. I see the same behavior, but the situations are different. Both major graphics vendors are stuck on 28 nm. The 285 is a new product. AMD's graphics situation is not even close to the same as CPU. They are not even competitive in most of the markets for CPU. AMD will likely release a very competitive GPU, which is why NV is releasing the Ti now.
Yes, the 285 is a new product, and while it is an improvement and a step in the right direction, Tonga just isn't enough to address the issue of AMD's profit margins this coming generation, or make them any more competitive in mobile (M295X). It would be as though Nvidia were selling the 980 at the $200 price point. Not exactly, but from a memory interface, die size, PCB complexity, and power consumption perspective, that's basically what AMD is doing right now, with no solution forthcoming. But I guess it's better than selling Tahiti for $200.
"AMD's graphic situation is not even close to the same as CPU. They are not even competitive in most of the markets for CPU. AMD will likely release a very competitive GPU, which is why NV is releasing the Ti now." Some might argue they aren't competitive in the dGPU market with Nvidia market share approaching 80%... some might say that's like Intel levels of dominance... And I didn't say it's the same as their CPU situation, I said it's becoming more similar. While AMD will likely be competitive in raw performance, as I've tried to explain in my past 2 comments, that's kind of besides the point.
Yes, both are stuck on 28nm, but only Nvidia came out with a comprehensive ASIC line-up knowing we'd be stuck here for another 2 years (going back to 2H 2014). It is obvious now that AMD's cost-cutting in staff and R&D is starting to manifest itself as they simply can't keep up while losing ground on 2 fronts (CPU and GPU).
The culmination of this will be AMD going to market with a full series of rebrands of mostly old parts going back to 2011, with a single new ASIC at the top of their stack, while Nvidia has fully laid out its arsenal with GM107 (1 SKU), GM206 (1 SKU), GM204 (2 SKUs), and now GM200 (2 SKUs).
Is it unprecedented? I recall the Geforce 9000, GTS 100 and most of the GTX 200 series being various rebrands from 'generation' to 'generation'. In fact, the 8800GTS 512 MB, 9800GTX, GTS 150 and GTS 250 were all the same chip design (the GTS 250 had a die shrink but was functionally unchanged).
nVidia has gotten better since then, well with the exception of the GF108 that plagued the low end for far too long.
Yes, it's unprecedented to launch a full stack of rebrands with just 1 new ASIC, as AMD has done not once, not 2x, not even 3x, but 4 times with GCN (7000 to Boost/GE, 8000 OEM, R9 200, and now R9 300). Generally it is only the low-end, or a gap product to fill a niche. The G92/b isn't even close to this, as it was rebranded numerous times over a short 9-month span (Nov 2007 to July 2008), while we are bracing ourselves for AMD rebrands going back to 2011 and Pitcairn.
The first 3 rebrands were still technically within that same product cycle/generation. This rebrand certainly isn't, so rebranding an entire stack with last-gen parts is certainly unprecedented. At least, relative to Nvidia's full next-gen product stack. Hard to say though given AMD just calls everything GCN 1.x, like inbred siblings they have some similarities, but certainly aren't the same "family" of chips.
Cool maybe you can beat each other and show us the precedent where a GPU maker went to market with a full stack of rebrands against the competition's next generation line-up. :)
The G92 got its last rebrand in 2009 and was formally replaced in 2010 by the GTX 460. It had a full three-year life span on the market.
The GTS/GTX 200 series was mostly rebranded. There was the GT200 chip on the high end that was used for the GTX 260 and up. The low end silently got the GT218 for the GeForce 210 a year after the GTX 260/280 launch. At this time, AMD was busy launching the Radeon 4000 series, which brought a range of new chips to market as a new generation.
Pitcairn came out in 2012, not 2011. This would mimic the life span of the G92 as well as the number of rebrands. (It never had a vanilla edition; it started with the GHz Edition as the 7870.)
@Kevin G, nice try at revisionist history, but that's not quite how it went down. G92 was rebranded numerous times over the course of a year or so, but it did actually get a refresh from 65nm to 55nm. Indeed, G92 was even more advanced than the newer GT200 in some ways, with more advanced hardware encoding/decoding that was on-die, rather than on a complementary ASIC like G80/GT200.
Also, prices were much more compacted at the time due to the economic recession, so the high-end was really just a glorified performance mid-range due to the price wars started by the 4870 and the economics of the time.
Nvidia found it was easier to simply manipulate the cores on their big chip than to come out with a number of different ASICs, which is how we ended up with the GTX 260 core 192, core 216 and the GTX 275.
@chizow Out of that list of GTS/GTX 200 series cards, the new chips in that lineup were the GT200 in 2008 and the GT218 that was introduced over a year later in late 2009. For 9 months on the market, the three chips used in the 200 series were rebrands of the G94, rebrands of the G92, and the new GT200. The ultra low end at this time was filled in by cards still carrying the 9000 series branding.
The G92 did have a very long life as it was introduced as the 8800GTS with 512 MB in late 2007. In 2008 it was rebranded the 9800GTX roughly six months after it was first introduced. A year later in 2009 the G92 got a die shrink and rebranded as both the GTS 150 for OEMs and GTS 250 for consumers.
So yeah, AMD's R9 300 series launch really does mimic what nVidia did with the GTS/GTX 200 series.
After some research, I posted a long and detailed reply to such a statement before, I believe it was in these forums. Basically, the offending NVIDIA rebrands fell into three categories:

One category was that NVIDIA introduced a new architecture and DIDN'T change the name from the previous one, then later, 6 months if I remember, when issuing more cards on the new architecture, decided to change to a new brand (a higher numbered series). That happened once, that I found.

The second category is where NVIDIA let a previously released GPU cascade down to a lower segment of a newly updated lineup. So the high end of one generation becomes the middle of the next generation, and in the process gets a new name to be uniform with the entire lineup.

The third category is where NVIDIA is targeting low-end OEM segments where they are probably fulfilling specific requests from the OEMs. This is probably the GF108 which you say has "plagued the low end for too long now", as if you are the arbiter of OEMs' product offerings and what sort of GPU their customers need or want.

I'm sorry I don't want to go looking for specific citations of all the various rebrands, because I did it before in a previous message in another thread.
The rumors of the upcoming retail 300 series rebrand (and the already released OEM 300 series rebrand) is a completely different beast. It is an across-the-board rebrand where the newly-named cards seem to take up the exact same segment as the "old" cards they replace. Of course in the competitive landscape, that place has naturally shifted downward over the last two years, as NVIDIA has introduced a new line up of cards. But all AMD seems to be doing is introducing 1 or 2 new cards in the ultra-enthusiast segment, still based on their ~2 year old architecture, and renaming the entire line up. If they had done that 6 months after the lineup was originally released, it would look like indecision. But being that it's being done almost 2 years since the original cards came out, it looks like a desperate attempt at staying relevant.
AMD is guilty of going on a massive PR offensive, bending the weak minds of its fanboys and swearing they would never rebrand as it is an unethical business practice.
Then they launched their now completely laughable Gamer's Manifesto, which is one big fat lie.
They broke every rule they ever laid out for their corpo pig PR halo, and as we can see, their fanboys to this very day cannot face reality.
I bought a bunch of G80, G92, G92b and G94 nvidia cards because you could get any memory size, bandwidth, bus width, power connector config, essentially any speed at any price point for a gamer's rig, install the same driver, change the cards easily, and upgrade for your customers without hassles...
IT WAS A GOLD MINE OF FLEXIBILITY
What happened was, the amd fanboys got very angry over the IMMENSE SUCCESS of the initial G80 and its reworked cores and totally fluid memory, card size, bit width, and pricing configurations... so they HAD TO TRY TO BRING IT DOWN...
Thus AMD launched their PR war, and the clueless amd fan launched their endless lies.
I'll tell you this much, no one would trade me a 9800GTX for a 9800GT
I couldn't get the 192-bit width cards for the same price as the 128-bit ones
DDR2 and DDR3 also differentiated the stack massively.
What we had wasn't rebranding, but an amazingly flexible GPU core that stood roaring above at the top and could be CUT down to the middle and the low gaming end, and configured successfully with loads of different bit widths and memory configs....
I don't think you realize how much more efficient this card is, even compared to past cards, for its nm and performance. This is a feat. Just calm down and enjoy. I am very happy that the card's price is perfect. :) thanks nvidia
Maybe you aren't aware of how silicon works, but this is a 601mm^2 die which costs a boatload to produce, especially with the rising costs of crystalline silicon dies. Being on 28nm this long just means the yields are higher (which is why a 601mm^2 die is even possible).
You aren't going to see a 14nm card that outperforms this by much till 2017 at the earliest, which, following recent NVIDIA trends, should be the Titan XYZ (whatever they want to call it) - a pretty huge jump at a pretty high price.
Not sure I understand your comment, 28nm is precisely why we're paying this much for this level of performance in 2015... But it's also pretty impressive for the same reason.
14/16nm might cost more. 28nm should have better yields and lower cost. These chips do not cost much to make at all (retail price could be 2-3 times the chip cost)
I think you misinterpreted my comment. I was responding to someone who seemed shocked by the fact that price/performance ratios aren't improving dramatically despite the fact that we're on a very mature process. In response I said the fact that we're on the same process is precisely why we aren't seeing dramatic improvements in price/performance ratios.
"28nm should have better yields and lower cost. These chips do not cost much to make at all (retail price could be 2-3 times the chip cost)" Yields are just one part of the equation. Die size also plays a significant role in manufacturing costs. The fact that your trying to say with a straight face that GM200 does not cost much to make says more than your written comment itself.
Assuming perfect scaling, a 600mm2 28nm chip would shrink to 150mm2 at 14nm (quick sketch below).
GM107 is a 148mm2 chip, so basically this "monster", with just a die shrink, would find a nice place for itself at the bottom end of Nvidia's lineup after the transition to 14nm.
This does not take into account the fact that at 14nm and 150mm2 they couldn't give it enough memory bandwidth so easily, but it just tells you something about how significant the reduction in size and manufacturing cost is after the initial ramp-up of the yields.
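For anyone who wants the arithmetic behind that "perfect scaling" claim, here's a trivial sketch. It assumes ideal area scaling with the square of the node ratio and takes the marketing node names at face value, which real processes never achieve:

```python
old_node, new_node = 28.0, 14.0   # nm, marketing names taken literally
old_area = 601.0                  # mm^2, GM200 as quoted in the review

scale = (new_node / old_node) ** 2      # linear shrink squared = 0.25
new_area = old_area * scale             # ~150 mm^2
print(f"Ideal shrink: {old_area:.0f} mm^2 -> {new_area:.0f} mm^2 ({scale:.0%} of original)")
```

Real 14/16nm FinFET parts won't shrink anywhere near this well, but it gives a sense of why a GM200-class design could plausibly slot into a GM107-sized die after the jump.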
That's pointless. 28nm (with Nvidia at least) is very energy efficient, and as seen by this review, a steal for the power this card delivers. It's a Titan X at $650.00. You're just desperately trying to find anything to gripe about.
Nvidia must make a living. No matter who they have to run over in the process. You, me, AMD... Doesn't matter. I imagine that at Nvidia, the mantra is "MUST MAKE MONEY! LOTS AND LOTS OF MONEY!", and that it repeats all day long, every day.
P. T. Barnum was right... "There's a sucker born every minute."... and we are them.
Personally I hope they make oodles of the stuff, so they can reinvest and make even better tech, because, my heavens, that's how private companies function. :D
It's weird how people complain about companies making profits, yet the very existence & continued success of a company depends on profits (unless of course one is AMD and somehow gets away with year after year of losses without going under).
Depends on where you live. At a mere 3hrs per day (easy if more than one person uses it), at a 270W difference even OCing it, you end up with ~$75 a year in savings in a place like Australia (rough math below). That ends up being $300 if you keep it for 4yrs, more if longer. In 15+ states in the USA electricity is above 15c/kWh (AU is 25.5c), so you'd save ~$45+ a year (at 15.5c; again, 14 of them are quite a bit above this), so again ~$180 for 4yrs. There are many places around the world like AU.
Note it pretty much catches the 295X2 while doing it OCed. It won't put out as much heat either, running 270W less, so in a place like AZ where I live, this card is a no-brainer. Since I don't want to cool my whole house to game (no zoned air unfortunately), I have to think heat first. With electricity rising yearly here, I have to think about that over the long haul too. TCO is important. One more point: you don't deal with any of the "problem" games on NV where CrossFire does nothing for you. Single chip is always the way to go if possible.
If you have a kid, they can blow those watts up massively during summer for 3 months too! WOW users can do 21hrs on a weekend...LOL. I'd say Skyrim users too, along with many RPGs that will suck the life out of you (Pillars, Witcher 3, etc.). A kid can put in more time in the summer than an adult all year in today's world, where they don't go out and play like I used to when I was a kid. You're shortsighted. Unless AMD's next card blows this away (and I doubt that; HBM will do nothing when bandwidth isn't the problem, as shown by GPU core speeds giving far more than OCing memory), you won't see a price drop at all for a while either.
If rumors are true about $850-900 price for AMD's card these will run off the shelf for a good while if they don't win by a pretty hefty margin and drop the watts.
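Rough math behind those savings figures, for anyone who wants to plug in their own numbers. The wattage delta, hours and tariffs are just the ones from the post above, not measurements:

```python
def yearly_cost_difference(extra_watts, hours_per_day, price_per_kwh):
    """Extra electricity cost per year for a card drawing `extra_watts` more."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# 270 W difference, 3 h/day of gaming, as assumed in the post above
print(yearly_cost_difference(270, 3, 0.255))  # ~$75/yr at AU's ~25.5c/kWh
print(yearly_cost_difference(270, 3, 0.155))  # ~$46/yr at 15.5c/kWh
```

Whether that matters obviously depends on your usage and local rates, which is the point being argued either way in this thread.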
Add Elite Dangerous, Project Reality, GTA V, the upcoming Squad and various other games to your list of titles which one tends to play for long periods if at all.
I guess you're one of those people who care more about specs than actual performance. Seriously, is 28nm just too big for ya? It's 3% slower than the fastest gpu on Earth for $650, and you're whining about the transistor size... get a life.
The 960 is an underwhelming, overpriced product.. I'd be more interested in a Ti variant if I was looking to buy right now.. but no.. although that 980Ti is tempting, I'd never purchase it without seeing what AMD is doing next month.
GM206-based 960xx? Or a further cut on GM204? ;) My gut feel tells me it would be GM204 based: I am guessing ~3B transistors on GM206 on a very mature 28nm process should be relatively doable without many defects.
What more would a review of the 960 tell you that you don't already know, honestly? I'd rather read reviews about interesting products like the 980Ti. People need to let the 960 review go already, geez.
I'm trying to get a cheap small notebook for my father. He is currently on an i3-380UM and the choice is between the N3558 and the i3-4030U. Workload is strictly internet browsing/MS Office.
Not much point in changing anything if performance is going to be worse than it was...
DVI may be an obsolescent standard at this point; but 4/5k gaming is still expensive enough that a lot of the people buying into it now are ones who're upgrading from older 2560x1600 displays that don't do DP/HDMI 2. A lot of those people will probably keep using their old monitor as a secondary display after getting a new higher resolution one (I know I plan to); and good DL-DVI to display port adapters are still relatively expensive at ~$70. (There're cheaper ones; but they've all got lots of bad reviews from people who found they weren't operating reliably and were generating display artifacts: messed up scan lines.) Unless it dies first, I'd like to be able to keep using my existing NEC 3090 for a few more years without having to spend money on an expensive dongle.
Dude, the majority are still playing at 1920x1080 and just a few are now making the leap to 2560x1440. I have been gaming at 1440p for two years and am not planning to go 4K anytime soon, since hardware is still not mature enough to play at 4K comfortably with a single video card.
Thus, DVI is not going anywhere, since dual-link DVI supports 1440p and probably most 1080p gamers are using DVI - unless they have G-Sync or want to use Adaptive V-Sync, in which case they have to use DP. And don't forget that there are plenty of people who bought 27" Korean 1440p monitors that have nothing but DVI ports.
If you're playing at 1920/60Hz this card is massive overkill, and in any event it's a non-issue for you because your monitor is only using a single link in the DVI and you can use a dirt cheap passive DVI-HDMI/DP adapter now; worst case, you would only need a cheap single-link adapter in the future.
My comment was directed toward Ryan's comment on page 2 (near the bottom, above the last picture) suggesting that the DVI port wasn't really needed since any monitor it could drive wouldn't need this much horse power to run games.
Exactly. I literally just now upgraded to a 1440p monitor, and I can't even express in words how little of a sh*t I give about 4K gaming. I've been a hardware nerd for a long time, but when I got into home theater I learned just how much resolution actually matters. 4K is overkill for a 120" projected image at a 15' seating distance. 4K at normal desk viewing distances is way beyond overkill. They've done tests on fighter pilots who have ridiculous vision, like 20/7.5 and such, and even they can't see a difference at those seating distances. 4K is almost as much of a marketing BS gimmick as 3D was for TVs.
Anyway, I'm clearly getting angry. But the point still stands: every single gamer I know is still on 1080p; I was the first to splurge on a 1440p monitor. And now it's put me into a position where my SLI'd 760's aren't really doing the deed, especially being 2GB cards. So, the 980Ti fits the bill for my G-Sync 144Hz 1440p monitor just about perfectly.
I beg to differ. 4K at monitor viewing distance is not overkill, it's actually quite pleasantly sharp. Phones, tablets and laptops are already pushing for 2K+ displays which is phenomenally sharp and out of the league for normal FHD monitors. Gaming at 4K is still not coming but when it comes it will blow our minds, I am sure.
Basically at a 5' viewing distance, you would have to have a 40" monitor before 4k would start to become noticeable.
Even at 30" monitor you would have to be sitting roughly 3.5' or closer to your monitor to be able to begin to tell the difference.
We also have to keep in mind we're talking about severely diminishing returns. 1440p is about perfect for normal seating distances with a computer on a 27" monitor. At 30" some arguments can be made for 4K, but it's a minor difference. It's not like we're going from 480p to 1080p or something; 1440p is still very good at "normal" computer seating distances.
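If you want to sanity-check the seating-distance claims being thrown around here, the usual back-of-envelope uses the ~1 arcminute resolution limit of 20/20 vision. It's a simplification (and eyesight varies, as the next poster notes), but it gives ballpark figures:

```python
import math

def max_resolvable_distance_m(diagonal_in, horizontal_px, aspect=(16, 9)):
    """Distance beyond which a ~1 arcminute eye can no longer resolve single pixels."""
    w, h = aspect
    width_in = diagonal_in * w / math.hypot(w, h)
    pixel_pitch_m = (width_in / horizontal_px) * 0.0254
    return pixel_pitch_m / math.radians(1 / 60)

for desc, diag, px in [("27in 1440p", 27, 2560), ("27in 4K", 27, 3840), ("40in 4K", 40, 3840)]:
    print(f"{desc}: individual pixels resolvable within ~{max_resolvable_distance_m(diag, px):.2f} m")
```

By this metric a 27in 1440p panel's pixels blend together beyond roughly 0.8 m and a 27in 4K panel's beyond roughly 0.5 m, so at a typical 0.6-0.7 m desk distance the difference is real but modest - which is more or less where this thread splits.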
Human vision varies as to who can discern what at a particular distance. There's no fixed cutoffs for this. Personally, when wandering around a TV store back in January (without knowing what type of screen I was looking at), for visual clarity the only displays that looked properly impressive turned out to be 4Ks. However, they're still a bit too pricey atm for a good one, with the cheaper models employing too many compromises such as reduced chroma sampling to bring down the pricing, or much lower refresh rates, etc. (notice how stores use lots of static imagery to advertise their cheaper 4K TVs?)
Btw, here's a wonderful irony for you: recent research, mentioned in New Scientist, suggests that long exposure by gamers to high-refresh displays makes them more able to tell the difference between standard displays and high-refresh models, ie. simply using a 144Hz monitor can make one less tolerant of standard 60Hz displays in the long term. :D It's like a self-reinforcing quality tolerance level. Quite funny IMO. No surprise to me though, years working in VR & suchlike resulted in my being able to tell the difference in refresh rates much more than I was able to beforehand.
Anyway, I'm leaving 4K until cheaper models are better quality, etc. In the meantime I bought a decent (but not high-end) 48" Samsung which works pretty well. Certainly looks good for Elite Dangerous running off a 980, and Crysis looks awesome.
Why would most people be using DVI? DVI is big and clunky and just sucks. Everyone that gets new stuff nowadays uses DisplayPort; it has the easiest-to-use plug.
Indeed, one of the few sites to include 580 numbers, though it's a shame it's missing in some of the graphs (people forget there are lots of 3GB 580s around now, I bought ten last month).
If it's of any use, I've done a lot of 580 SLI vs. 980 (SLI) testing, PM for a link to the results. I tested with 832MHz 3GB 580s, though the reference 783MHz 3GB models I was already using I sold for a nice profit to a movie company (excellent cards for CUDA, two of them beat a Titan), reducing the initial 980 upgrade to a mere +150.
Overall, a 980 easily beats 580 SLI, and often comes very close to 3-way 580 SLI. The heavier the load, the bigger the difference, eg. for Firestrike Ultra, one 980 was between 50% and 80% faster than two 3GB 580s. I also tested 2/3-way 980 SLI, so if you'd like the numbers, just PM me or Google "SGI Ian" to find my site, contact page and Yahoo email adr.
I've been looking for a newer test. I gather GTA V has a built-in benchmark, so finally I may have found something suitable, need to look into that.
Only one complaint about the review though, why no CUDA test??? I'd really like to know how the range of NV cards stacks up now, and whether AE yet supports MW CUDA V2. I've tested 980s with Arion and Blender, it came close to two 580s, but not quite. Would be great to see how the 980 Ti compares to the 980 for this. Still plenty of people using CUDA with pro apps, especially AE.
Btw Crest, which model 580s are you using? I do have some 1.5GB 580s as well, but I've not really done much yet to expose where VRAM issues kick in, though it does show up in Unigine pretty well at 1440p.
For reference, I do most testing with a 5GHz 2700K and a 4.8GHz 3930K, though I've also tested three 980s on a P55 with an i7 870 (currently the fastest P55 system on 3DMark for various tests).
Ryan, did you guys fully test the amount of full-speed VRAM on this 980Ti? Is all 6GB running at full speed and not just 5.5GB or some such nonsense? Have you tested actual in-game VRAM usage and seen it reach 6GB? Thanks. :)
This time they said something beforehand, and I'm sure they are lying, so I agree with you. My tinfoil is failing; one moment, I'm receiving a transmission from Beta Reticuli.
Ah yes, it's confirmed, nVidia is lying, again, the memory is hosed on the 980ti...
This message will self-destruct in 5 seconds whether or not you've accepted the mission o-k.
Makes the 980 a very hard sell even at $499; they should have dropped it to $449 or even slightly less. The Ti is so much faster and the 970 is so much cheaper.
I think it should have been dropped to $250, but that's just me. When price premiums are not linear with performance increases, people complain the higher priced card is overpriced, and when they are, people complain the lower priced card is overpriced. Best solution: All cards $0.
I agree. The fact that AMD's new 3xx is mostly rebrands (sans 1 new GPU) scares the crap out of me. And Nvidia knows it too; that's why we're getting the bad Witcher 3 GameWorks situation on Kepler, astronomical prices, and generally very 'Apple-like' marketing from Nvidia.
Don't get me wrong, I certainly appreciate the level of refinements that Nvidia brings to the table, but without any answer from AMD, prices are very far from reasonable.
A few years ago, I would never have guessed PC gaming could end up dead due to a single-GPU-supplier situation; nowadays I am a lot more unsure...
In that case why assume that the 980 should have been dropped in price more. Maybe the 980 Ti should have been priced at $700?
The difference between $500 and $650 is palpable. And the performance one requires depends on the monitor one has. What you seem to be saying is you would be willing to pay more than 30% price premium for a 30% increase in performance, which is usual. But when prices are actually set that way, there always seem to be people complaining the premium card is priced too high, and quoting the price/performance difference as the reason.
@Yojimbo lol so true, people seem to think price:perf should be perfectly linear and comparable to some bargain bin part at $75, but if that was always the case, we'd all be using 2-3 gen old cards that can't play the games we want to play, today.
The 980 is $550, not $499. Despite that it still has a similar price/performance ratio to the 980 Ti. So technically it's no worse of a deal than the 980 Ti, but I think the 980 should still drop in price to ~$500 or $450. It should have a better price/performance ratio than the higher-end Ti.
The 980 price has been dropped to $499, and the point was that the Ti and the 970 are much better buys - the 970 being way cheaper for little perf loss, while the Ti offers a lot more perf and is far better at 4K.
Ahh, sorry I missed that. However, at $500 the 980 still has a similar price/performance ratio as the 980 Ti. So while I do think it should drop by more, I'm also a bit confused by why people are calling it a terrible buy when it really isn't any more terrible than the Ti at $650.
No, drop everything in your life AT staff, and effing TRIPLE check, and make sure to provide a notarized video of the process. Anything, ANYTHING at all, that can wash away the salt of the AMD rebadge, C'MOOOOON!
They already answered your question. Fact is, the whole 970 RAM issue is totally irrelevant. Nobody has shown any game that exhibits problematic behaviour as a result of how the card works, none of it changes how good the card is based on initial reviews, and anyone doing something that needs close to 4GB RAM is probably in need of greater baseline horsepower than a 970 anyway, so who the hell cares?
The 980 Ti runs at full speed across the board, check the ROP specs, etc.
Unless you're a prosumer and need all the VRAM you can get. Titan X is ideal for AE, various types of GPU heavy rendering, compute involving SP only, etc. Seen a guy on another forum saying he loves his Titan X for compute because of its huge RAM.
I can't wait to see what third-party coolers, with 8-phase VRMs and big heatsinks, are capable of. For someone like me with a 1200p screen, this GPU could well serve 5+ years.
Great card, performance and price! Nvidia is certainly being very aggressive with pricing this part, all without any competition from AMD!
This is going to put a lot of pressure on anything AMD does in June. As the review stated, AMD's new GPU has a pretty tough act to follow for a $650 mini-Titan X and as we have already seen, AMD won't be competitive in the $500 and lower price point if they come to market with a stack of rebrands.
lol, they get very high margins on the big cards; they could easily sell this at $500 (where perf per price would be close to the 970) and still have good margins. If AMD has a big enough card, all they need is to want to be price competitive. These are not products for people that look for reasonable value; nobody that looks for that would ever pay $500 for a GPU to begin with, so both AMD and Nvidia are just keeping the high margins since volumes can't really go up.
I don't know how you can say the value is unreasonable when there is no other way to achieve what these cards achieve. The high end of the graphics market is less sensitive to price than more mainstream segments. Both AMD and NVIDIA are trying to maximize their profits over the entire market range.
You seem to not understand the term value, and then you explain why the high end cards are poor value. Value and competitiveness are 2 different things. One is perf per price and the other is how the product relates to its competitors. Yes, the high end is less price sensitive (something my first comment agreed with) and the cards are poor value; that's why I found it amusing that someone thinks the perf per price is great, when it never is in this category.
"You do not seem to don't understand the term value, and then you explain why the high end cards are poor value."
Oh, that's rich.
Value in this case is a subjective term. Each consumer defines the value of a product to them. It isn't something you can put on a chart, or quantify, unless that's how they choose to define it for themselves.
I own two TITAN X in SLI to drive a 1440p display. You'd probably call that "poor value". You'd be correct... for yourself. You'd be woefully incorrect if we're discussing what I consider "value", because I considered the price/performance ratio to be perfectly acceptable, and moreover, a "good deal" to eliminate all possibility of VRAM related stuttering. All this while giving me the same number of shader cores as a tri-SLI 980 setup with better scaling because it's only two cards. A resounding value in my books.
lol, you lost all credibility with anyone here after you said you were using Titan X's in SLI because you are worried about running out of VRAM and stuttering...
I wouldn't call that poor value, I would just call that retarded.
This whole thread could have been fixed if the guy said "objectively" instead of just "value".
Yes, what constitutes a good value varies from person to person. Anyone with a brain can infer what he was trying to say in his argument, which is absolutely correct: at this price bracket, people aren't buying cards based on price/perf. So, whether or not the price/perf is comparable to lower cards is irrelevant. They price according to what the market will bear. If they're selling them as fast as they can make them, then they're selling them at the right price.
No. Especially since he replied back to me and ridiculed me for not knowing what "value" is. That by itself is enough of a refutation of your "just try to guess what he meant" argument. But if he really did mean what you think he did, his post is irrelevant, because value is the proper metric, and it is not "objective". That really is the entire point, and why something needed to be said.
And one more thing. The fact that they aren't buying the cards to maximize "price/perf" is blatantly obvious, and just as blatantly irrelevant. The problem I'm having is, and I could be wrong here, that you and he both seem to be convinced they SHOULD be buying the cards on "price/perf."
I'd buy you a beer for posting that if I could. 8) It's the perfect summation of what I've said so often, namely an item is only ever worth what someone is willing to pay. It's funny how people can get so offended that someone else can afford and is happy to buy a far better config than they do; really it's just hidden jealousy IMO. Either way, kudos for that rig, and please post some 3DMark bench links! 8) Actually, you should buy the beer, you can afford it, hehe...
I wonder if you have a similar MO to me, I like to max out visual settings for the games I play, modding if need be to improve visuals. I hate scenery popping, etc.
Thumbs up daroller. At 1920x1200, only the 980 Ti is capable of driving it properly without eye candy loss and fps failures.
People claim I'm crazy but then I never have to worry about my settings and I can seamlessly choose and change and view and analyze and I'm never frustrated "having to turn down the settings" to make things playable.
The rest of the world stretches livability to the limit and loves stressing everything to the max and grinding it all down to pathetic perf, all the while claiming "it's awesome!"
Value is not performance per price. Value is what benefit is achieved by the purchase of the product. I'll repeat my previous post by asking how can you assume that purchasing a card has "unreasonable value"? If I, as someone who is in the market for a video card, have a range of options to choose from for myself, how can you off-the-cuff judge how much I should be willing to spend to get a better experience (higher resolution, more detailed graphics, smoother game play, etc) from a higher-priced offering compared with a lower-priced offering? You have no idea what my value of those experiences are, so how can you judge whether the cards offer good value or not?
The people buying those high priced cards are buying them because in their minds they are getting more value from them at the higher price than they would be getting from the lower-priced alternatives. Now people don't always make the most accurate decisions. They can be fooled, or they can have misconceived notions of what they are going to be getting, but the point is that they THINK they are getting more value at the time of purchase.
Or they can simply afford it and want the *best* for other reasons such as being able to max out visuals without ever worrying about VRAM issues. It's wrong to assume such purchases are down to incorrect value judgements. You're imposing your own value perception on someone else.
Yeah, unfortunately anyone who actually buys high-end GPUs understands price and performance goes out the window the higher you go up the product chain. Nvidia made their value play to the masses with the 970 at an amazing $330 price point, memory snafu notwithstanding, and the card has sold incredibly well.
There was no reason for them to drop prices further, and I think most observers will recognize the $650 price point of the 980Ti is actually very aggressive, given there was absolutely no pressure from AMD to price it that low.
If AMD gets to launch first and nVidia must respond, it seems like they're trading blows. If nVidia makes a preemptive strike now, they make AMD's launch seem late and weak. They know AMD is betting on Win10 driving sales, so they could read their launch plan like an open book and torpedo it out of the gate. I think this will be a miserable month for AMD, it's hard to see how GCN-based cards are going to compete with Maxwell, HBM or not.
Yep, Nvidia just pre-emptively torpedoed AMD's product launch and set pricing again. All very impressive how Nvidia has gone about 28nm product launches, despite the uncertainty over whether we'd see anything new before 14/16nm after word came that 20nm was cancelled.
I don't think Nvidia is dumb enough to launch the 980TI without knowing where FIJI would land in their stack. I think this is more of a power play from them saying, 'here's your price point AMD, good luck'
Like you said, the fact is they have no competitive pressure on the Titan X, so why ruin its pricing now if you don't know where FIJI would land.
Here's my guess: Nvidia just torpedoed their Titan X, mainly because FIJI is probably around 97% of a Titan X, and AMD was about to ask 850~1000 USD for it. Now Nvidia will launch this 980TI at 650 to control the price (which I bet they have been readying for quite some time, simply waiting for the right time/price point).
I think you are right. Fiji was estimated to be close to Titan but cheaper by $200. Now that Nvidia has delivered a Fiji-like card for $650, Fiji cannot go above $650. In fact, considering Fiji is limited to 4GB and hot enough to need a watercooled version, Fiji might have to go below $600 with a bundled game, or $550 without a bundle, to make any sense. With the big chip and expensive memory Fiji is using, AMD/ATI's margins on those cards are going to be slim compared with Nvidia's.
Likely true, and it's really a sad situation for AMD... which is a bad situation for PC gaming in general. AMD desperately needed NV to price this card $150 - $200 higher.
Look at the reviews of the 980 Hybrid that just came out. A watercooled version is a very good thing even on a less than 300W card. A watercooled Fiji is going to be putting out way less heat than a 295X even if it's overclocked, and compare the 295X temp and noise to an OC 980 Ti. Using a much better cooler that should allow some great OC performance is not a guarantee of weakness.
"preemptive" seems like a strong word. AMD was supposed to release the 3XX series in February, then March. Then it was "comming soon". We're in June, it's still not out. And AMD makes a silly youtube video "It's Coming", then Nvidia releases the 980 Ti before them!
Bench results are compiled as we test cards for articles. We've had no need to test GTX 970 for any articles yet this year, so its results are not yet in Bench '15.
970 performance can be inferred from the charts anyways. Step one: look at 980 perf, step two: subtract 10~15%. Method valid up to 1440p, above that 970 chokes on the VRAM requirements. Alternatively, take 290 perf, and add 10~20%, depending on whether the game is TWIMTBP or not.
The larger-than-expected gap in pixel fillrate suggests that 980 Ti has a partially disabled ROP/MC partition and segmented memory just like 970. Could AT please investigate this?
I don't care what GPU-Z says right now. It was wrong about 970 at first and it could be wrong now.
Remember that the 980 and 970 both used the same amount of RAM, with the 970 having some ROPs cut, making it impossible to run the full 4 GB at full speed. This is a different situation, with the 980TI and Titan X both having the full 96 ROPs and the 980TI using half the RAM. Two completely different situations!
Besides the fact that Titan X has twice the DRAM chips per ROP/MC partition, how is it different? Anyway I won't belabor the point as Ryan has already confirmed.
Why do people care if it uses the whole 6GB or not (and apparently it does)? It's completely and utterly irrelevant to 99% of users. If the card has the performance you want at the price you're prepared to pay, the memory situation is irrelevant.
Idk, I think it is a fair question, and the article covers it pretty well. The current-gen consoles pushed up VRAM requirements significantly for this generation of games, and while 3-4GB was generally viewed as enough last-gen, that quickly changed when games started using 4+GB at the resolutions (1440p and higher) and settings (MSAA, max textures etc.) someone paying $650 would expect their GPU to handle.
12GB will almost certainly be overkill, 6GB is probably minimum to hold you over til 14/16nm, 8GB would be just right, imo.
No, it's not irrelevant if that means tomorrow's games aren't performing relative to what you saw in reviews today because they're hitting the VRAM limit. Knowing how close you are to that limit at the settings you play at helps you gauge and understand whether it will be enough for longer than a few months.
All true, but I think what he's getting at is that it shouldn't materially affect your purchase decision. You don't have an alternative at this price point, even if it is partially gimped. AMD doesn't currently have anything at this price point which performs this way either. It's this or TITAN X for $350.00 more. Pick your poison.
It's all moot, there are no hobbled ROPs on this card.
Eh...if I knew for sure games at 1440p were already using, say 5.5-6GB of VRAM, that would materially affect my purchase decision to buy 2 of these cards, or stick to a single Titan X and look at picking up a 2nd.
But, I know most of my games at 1440p are using <4GB and the most demanding ones are using 4-5GB max, so I feel pretty good about 6GB being enough.
How is 6GB the minimum RAM needed till FinFET GPUs? Even at 1440p with max settings, no game requires 6GB of RAM. Even if a game can use 6GB, the way some games are programmed they just use up extra RAM if it is available, but that used RAM isn't crucial to the operation of the game. So it will show high RAM usage when in reality it can use way less and be fine.
You are overly paranoid. 4GB of RAM should be just fine to hold you off for a year or 2 till FinFET GPUs come out, for 1440p res. If you are smart you will skip these and just wait for 2H 2016, when 14/16nm FinFET GPUs are going to make a large leap in performance. That generation of GPUs should be able to be kept long term with good results. This is when you would want an 8GB card to keep it running smooth for a good 3-4 years, since you should get a good lifespan out of the first FinFET GPUs.
Again, spoken from the perspective of someone who doesn't have the requisite hardware to test or know the difference. I've had both a 980 and a Titan X, and there are without a doubt, games that run sluggishly as if you are moving through molasses as soon as you turn up bandwidth intensive settings, like MSAA, texture quality and stereo 3D and hit your VRAM limits even with the FRAPs meter saying you should be getting smooth frame rates.
With Titan X, none of these problems, and of course, VRAM shoots over the 4GB ceiling I was hitting before.
And why would I bother to keep running old cards that aren't good enough now and wait for FinFET cards that MIGHT be able to run for 3-4 years after that? I'll just upgrade to 14/16nm next year if the difference is big enough, it'll be a similar 18-24 month timeframe when I usually make my upgrades anyways. What am I supposed to do in this year while I wait for good enough GPUs? Not play any games? Deal with 2-3GB slow cards at 1440p? No thanks.
Why is the most popular mid-high card, the GTX 970, not on the comparison list? It is exactly half the price of the 980 Ti, and it would be great to see if it is exactly 50% the speed and uses half the power as well.
Ryan's selection is not random. It seems he selects the likely upgrade candidates & nearest competitors. It's the same reasoning why there is no R9 290 here. Most 970 and R9 290 owners probably know how to infer their card's performance from the unharvested versions (980 and 290X).
Granted, it's odd to see the 580 here; the 970 would be more valuable technically.
Plus, most requests I've seen on forums have been for 970 SLI results rather than a 970 on its own, as 970 SLI is the more likely config to come anywhere near a 980 Ti, assuming VRAM isn't an issue. Data for 970 SLI would thus show where in the various resolution/detail space one sees performance tail off because it needs more than 4GB.
The 295X2 still crushes it. But blind Nvidia fanboys will claim it doesn't matter because it is either a) not a single GPU or b)AMD (and therefore sucks).
I own 290 CrossFire currently, previously a single 780 Ti. Witcher 3 still sucks for my 290 CF, as well as the 295X2. So... it depends on your game selection. I also have to spend more time customizing most of my games to get the optimal settings on my 290 CF than on my 780 Ti.
Witcher 3 runs just fine on my single 290. Is it just the xfire profile? Do you have the new driver and latest patches? Also, have you turned down tessellation or turned off HairWorks?
4K... was hoping my U28D590D would have FreeSync, but alas... no such luck. I am very sensitive to stutter; it gives me motion sickness, to the point I have to stop playing :(
Limiting HairWorks to 8x does help, but I really dislike the hair without it. I'd rather wait for 15.5.1 or 15.6. I have other games to keep me busy for a while.
I can get 45 avg if I drop to 21:9 ratio using 2840 x 1646, but even then I still get motion sickness from the occasional drops.
Yes, CrossFire support for TW3 is broken from day 1; it's a well-known issue. AMD hastily released a driver last week with a CF profile, but it's virtually unusable as it introduces a number of other issues with AA and flickering icons.
Are you sure? Did they release a follow-up to the 15.5 Beta? Because the notes and independent user feedback stated there was still flickering:
*The Witcher 3: Wild Hunt - To enable the best performance and experience in Crossfire, users must disable Anti-Aliasing from the games video-post processing options. Some random flickering may occur when using Crossfire. If the issue is affecting the game experience, as a work around we suggest disabling Crossfire while we continue to work with CD Projekt Red to resolve this issue
The 295X2 is indeed faster, but it also uses twice as much power. You have to take the 1000W PSU into account, as well as one or two additional 120mm fans needed to get the heat out of the case. When you add up all the extra cost for PSU, fans, electricity, noise and stutter against an overclocked 980Ti (last few pages of the review), the slight speed advantage isn't going to be worth it.
Also, Maxwell 2 supports DirectX 12, I am not so sure about any of the current AMD/ATI cards since they were designed in 2013.
You don't have to buy a new PSU every time you buy a high TDP card, but otherwise a valid point. Going multi-GPU for the same performance requires a much bigger price difference to be worth it vs. a single card.
Basically you're gonna spend an extra $5/mo on electricity with that card, or $60/yr vs a 980Ti (quick math below). That's actually pretty huge. That's at 4hrs/day of gaming, at an average of 12c/kWh. If you game 6 or 7 hours a day, it's even worse.
These high power cards are a little ridiculous. 600w just for one video card?!!
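Same kind of arithmetic as the power-cost sketch earlier in the thread, working backward from the $60/yr figure. The ~340 W gaming-load delta between a 295X2 and a 980 Ti is my assumption reverse-engineered from the post, not a number measured in this review:

```python
def yearly_cost(extra_watts, hours_per_day, price_per_kwh):
    # kWh per year times tariff
    return extra_watts / 1000.0 * hours_per_day * 365 * price_per_kwh

print(yearly_cost(340, 4, 0.12))  # ~$60/yr at 12c/kWh, 4 h/day (assumed delta)
print(yearly_cost(340, 7, 0.12))  # ~$104/yr if you game 7 h/day
```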
I had a GTX690, and I run SLI TITAN X. I've been running dual GPU setups for as long as they've been available. Dual GPU IS a hindrance. You'd have to be blind, stupid, or a rabid fanboy to claim otherwise. The 295x2 isn't exempt from that just because you dislike NV and harbor a not so secret love for AMD.
Yea cause using dual GPU's just sucks. Just adds a bunch more complexities and problems to everything. Always get 1 of the largest, fastest single gpu's you can get.
I guess if by crush you mean thermally crush, then yes, you're absolutely correct. I mean, why not have a portable nuclear reactor nearby to power your video card!
Hope you enjoy your power bill, heat, worse stuttering, etc., and the numerous CF fails for all sorts of scenarios. I checked some forums, lots of moans about 15.5 for CF support.
It JUST came out. A significant portion of this review was probably carried out before it was even released. Not to mention it's already had two patches that affect performance substantially.
@Ryan Any word on when we might see some SLI results? I know this is generally dependent on limited review samples, but vendors will probably start sending you cards soon yeah?
I don't think the overclocking numbers tell the whole story. On the stock air cooler (albeit at 80% fan speed, unless using an Accelero IV at 40%), Titan X cards easily get to 1300/1440 base/boost clocks. The same cards on water got to 1375/1500 at a cool-ish 55°C at max load. That applies to two Titan Xs in SLI with a modified BIOS that allows more power consumption and thus removes the artificial limit.
Since the chip is identical and the 980 Ti is effectively a partially defective Titan X with 50% less RAM and the defective parts switched off, I highly doubt the clock potential differs, and especially not in favor of the 980 Ti.
I would and do expect the 980 Ti to clock the same as the Titan X (losing some on chip quality, gaining some from having half the VRAM).
We won't be doing any complete Win10 benchmarking until that OS is finished and released. As for DX12, there are no games out yet that are using it; the handful of benchmarks are focused tech demos.
Nvidia must have seen some undisclosed AMD benchmarks, went into panic mode, and rushed a release for the 980TI to get customers before the AMD launch.
While it's a great card, the problem is Nvidia screwed some of their own customers.
I take this as a sign that whatever AMD is coming out with must be pretty good. :)
Maybe, but it could prove to be of little importance. You see, Win10 will be out on July 29th. Realistically speaking, DX12 games won't be a real factor before Christmas or 2016. That is more than enough time for a possible counterstrike from nVidia. Having said that, unless one really, really needs to upgrade now, I would strongly recommend waiting another month, just to check what Fiji is up to. As for me, I have a pair of GTX 980 Strix cards and have been with nVidia for a while, but I really hope AMD gets this one right. Real competition is always good.
I've been waiting for this beast to drop... now to decide whether it's a good time to bite.
Current setup is a 2500K OC to 4.4Ghz and a GTX 670, so kinda oldish... Was considering upgrading to a Skylake proc come Sept and this 980 Ti, but probably Gigabyte variant... hmmm.
Why upgrade CPU? 2500K at 4.4GHz is still very fast and shouldn't affect performance of a 980Ti much. Maybe 10% less fps vs if you have a 6-core Extreme but why spend $300-400 to get 10% improvement?
Plus if he does need some more CPU oomph, just put in a 2700K. I've built six so far, every one of them happily runs at 5GHz with just a decent air cooler & one fan for quiet operation, though for final setups I use an H80 and two quiet fans. Some games will benefit from more than 4 cores, depends on the game (eg. PvP online FPS can involve a lot of host side scripting, eg. Project Reality, and the upcoming Squad).
True though, 2500K is still very potent, just built a 4.8 setup for a friend. She lives on an island, it'll probably be the quickest system for miles around. :D
Any card that can do true RGB color output is NOT MEANT for normal users. It brings a lot of drawbacks for games and normal tasks. These types of cards are for graphics professionals only. Google it to see why.
Indeed, the way colourspaces interact with different types of monitor can result in some nasty issues for accurate colour presentation. For home users it's really not suitable, since so many normal apps & games aren't written to utilise such modes correctly. Besides, I doubt any 4K TVs could properly resolve 10 bits/channel anyway. Funny though that people are still asking about 10-bit colour when pro users were already using 12-bit more than 20 years ago. :D Also 16-bit greyscale for medical/GIS/etc.
Yikes! That overclocking ability! I always buy EVGA's Superclocked Nvidia cards as they are super stable and have great benchmarks (as well as playing games well, heh). I might buy into this even though I have a GTX 980.
As for AMD, Nvidia has 76% of the discrete GPU market (and still rising) while AMD has lost 12% market share in the last 12 months alone. Whatever AMD has coming for new products, it had better hurry and be a LOT better than Nvidia's cards. AMD has tried the "rebadge the existing GPU family, cut the price, and bundle games" approach for too long and IT IS NOT WORKING. C'mon AMD, get back into the fight.
Well I recently upgraded with a second 970 for SLI for 1440p gaming and have them overclocked to 980 performance. It's roughly 15% faster than this single card solution for $700 vs. $650 (7.5% increase in cost). But one thing is for certain: we are still a long time away from realistic 4K gaming with a G-sync 120Hz monitor when those come out. I would much prefer 1440p gaming with max quality and high AA settings and faster FPS matched to screen Hz than detuned 4K settings (even if AA is less meaningful at 2160p).
By the way: are you guys ever going to add Project Cars to your benchmarks? It has rapidly become THE racer to own. Grid Autosport is not really a good benchmark these days because it's just a rehash of the Grid 2 engine (EGO 3.0)...easy on GPUs. Many, including me, haven't touched Autosport since PCars was released and may never touch it again.
Project Cars is one game that runs badly in CF atm (driver issues), which would make the 295X2 look horrible. Might be better to wait until AMD has fixed the issue first.
A GTX Titan X for $649, DOH BART! Oh well I've enjoyed my SLI Titan X's for a few months so I guess that was worth the $700 premium. I keep falling for nVidia's Titan brand gimmick, I also bought the original Titan luckily just 1 of them and ended up selling it for about half what I paid. Lesson learned, AGAIN, don't buy the Titan brand wait for the regular GTX version instead.
The performance difference between the 980 Ti and 980 is WAY larger than the performance difference between the 980 and 970, yet the price gap is larger between the 980 and 970. The 980 was stupidly overpriced at $550 and is still overpriced at $500. It needs to be at the $420-430 mark.
I would be upset if I had just paid $550 for a GTX 980 and now for only $100 more I could basically have Titan X performance.
And what value do you place on the 9 months that 980 users have been enjoying that level of performance? Again, if you think the 970 is the better deal, it is there for you to buy at $300-330. The 980 was overpriced by maybe $50 at launch, but it still dropped the entire price and performance landscape at a time when the 780 Ti was still $650+, the 290X was $550, the 780 was $450 and the 290 was $400. In that context, it wasn't so bad, was it?
In reality, Nvidia has no reason to drop the 980 as there is no pressure at all from AMD. All these price cuts are self-induced as they are simply competing with themselves and pre-emptively firing a shot across the bow at $650 with 980Ti.
"In reality, Nvidia has no reason to drop the 980 as there is no pressure at all from a card with 3.5 GB of VRAM that, in part, runs at 28 GB/s and has XOR contention."
"In reality, Nvidia has no reason to drop pricing on the 980, as there is no point in threatening the golden calf that may have single-handedly killed AMD graphics, 3.5GB VRAM and all."
I dunno. I can't really justify an upgrade from my 980 STRIX (which would then replace beloved 680 in my HTPC) - I was hoping for at least 40% improvement. Not really worth it for 20-30%. Better off getting another 980 and SLI it.
Just a small bug in your article: the "GRID Autosport" page has one paragraph from the previous page: "Switching out to another strategy game, even given Attila's significant GPU requirements at higher settings, GTX 980 Ti still doesn't falter. It trails GTX Titan X by just 2% at all settings."
As for the theoretical pixel test with the anomalous 15% drop from Titan X, there is a ready explanation: under specific conditions there won't be enough throughput to feed the two raster engines attached to the cut-down blocks (and there are also only three paths instead of four).
Today, Alienware is offering 15" laptops with an option for an R9-390x. Their spec sheet isn't updated, nor could I find updated specs for anything other than R9-370 on AMD's own website. Are you going to review some of these R9-300 series cards anytime soon?
From most results the 980 Ti offers about 20% more at 1440p than a 980 (GM204), and given the 980 Ti costs about 18-19% more than the original MSRP of the 980 ($550), it's really not any big thing.
Given GM200's much larger die, and 38% more SUs than GM204, you get a 20% increase? It's worse when a full Titan X is considered, which has 50% more SUs, and the Titan X gets perhaps 4% more FPS than the 980 Ti. This points to the fact that Maxwell doesn't scale. Looking at power, the 980 Ti needs approx. 28% more power, which is not the worst, but it is starting to indicate there are losses as Nvidia scaled it up.
This is obviously a comment from a frustrated AMD fan. Maxwell scales almost perfectly once you account for the frequency it runs at. GM200 is 50% more than GM204 in every resource, but those GPUs run at about 0.86x of GM204's frequency (1075 vs. 1250 MHz). If you do the simple math, for any 980 result, multiply it by 1.5 and then by 0.86 (or directly by 1.3, i.e. 30% more) and you'll find almost exactly the numbers the 980 Ti benchmarks show. Now take the new $500 price of the 980, do the same and... yes, you get $650 for the 980 Ti. Oh, the die size... let's see: 398mm^2 for GM204 * 1.5 = 597mm^2, which compares almost exactly with the 601mm^2 of GM200. Pretty simple. It shows everything scales nearly perfectly in Nvidia's house (a quick arithmetic check follows after this post). Once custom cards arrive, we'll see GM200 going to 50% more than GM204 at the same frequency. These cards will consume a bit more, as expected.
You cannot say the same for AMD's architecture though: with smaller chips GCN is roughly on par with, or even better than, nvidia in perf/mm^2, but as soon as real crunching power is requested GCN becomes extremely inefficient from the point of view of both perf/Watt and perf/mm^2.
If you were trying to plant doubt about the quality of GM200 or the Maxwell architecture in general, sorry, you chose the wrong architecture/chip/method. You simply failed.
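For what it's worth, the arithmetic in the post above does work out roughly as claimed. A quick sketch, using the poster's assumed clocks (1075 and 1250 MHz are their figures, not official boost specs):

# Scaling check: GM200 (980 Ti) vs GM204 (980), using the assumed numbers above.
units_ratio = 1.5          # GM200 has ~50% more of every resource than GM204
clock_ratio = 1075 / 1250  # assumed typical clocks: 980 Ti vs. a custom 980
perf_ratio = units_ratio * clock_ratio
print(round(perf_ratio, 2))        # ~1.29, i.e. roughly 30% faster
print(round(500 * perf_ratio))     # ~645, close to the $650 MSRP
print(round(398 * units_ratio))    # ~597 mm^2, close to GM200's 601 mm^2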
Nvidia places the Titan X in play just so that logic works... But when you pull the Titan X from the equation and work from the 980 (GM204), a 20% increase in FPS for almost 20% more money, while using 28% more power, looks really humdrum.
You felt the need to post basically the same comment in 2 different places?
Regardless, you're cherry-picking data. Overall it's about a 30% increase in performance for about a 30% increase in price. It's still a "good" deal if you want a powerful single GPU.
I'm still shocked at how much the 295X2 kills it. It's so much more powerful than even a Titan X. Newegg had the 295X2 on sale for $550 two weekends ago as well. Crossfire driver issues aside, if you're in the market for the high end, I just don't see how you could pass up the much more powerful and cheaper 295X2. If I had been in the USA with that $550 sale going, I would have snapped it up so fast. Hell, I would have bought several and sold a couple when I got home. Those cards are still $1600 in Australia.
Really? CF issues aside?? It's a freakin' CF card! What the heck is the point in buying the thing if CF support just doesn't work properly for so many games? And did AMD ever fix DX9/CF issues? Still sucks the last time I tested 7970 CF. Feel free to whack your power bill with the 295x2, spew out heat, etc. Every time I see a crazy extended power usage graph just so the enormous line for the 295x2 can be included, it blows my mind that people ever bother buying it. One person from OZ here commented that heat output is of primary concern where he lives, so chucking out so much heat from a 295x2 would be a real problem. I noticed the same thing with 580 SLI, sooo glad I eventually switched to a single 980.
The fact that a supposedly more powerful card is sold at a lower price should automatically raise some questions... it is not as though all the Crossfire problems magically go away because the 295X2 is a single card. "Driver issues apart" is not an option: Crossfire and SLI performance depends heavily on driver quality. And, sorry, but AMD's dual-GPU cards have always been the worse choice since they were created. See what the support for the 7990 is like; the 690 is still supported, as you can see in these very benchmarks.
Moreover, if you want a better dual-GPU configuration, with support done as one would expect for the money spent, you can just buy 2x GTX 970 and live much more happily, while consuming much less. The dual-GPU comparison here is not even worth taking into account. The simplicity, scaling and smoothness of a single GPU like the Titan X or this GTX 980 Ti simply crush the dual-GPU competition without any doubt, even if they average a few FPS less. If you cannot understand that, then by all means keep buying crappy cards and be happy with them.
Compared to my EVGA 770 SuperClocked SLI, it only generates a few extra fps and scored just over 100 points higher in Fire Strike (770 SLI graphics score: 16,837 / 980 Ti graphics score: 16,900). It's a great single card at a cheap price point, but little improvement over what I currently use.
Ryan! We never saw a review of the GTX 960; will it be published? And there is no data in Bench for it either. Could you at least upload its performance numbers to the Bench section?
Wow, Crysis 3 and Battlefield 4 hitting 80 fps at very high settings at 2560x1440. Clearly there's much room for better graphics at lower resolutions. I would buy this card, but not if I knew the only benefit would be running games at higher resolutions; that is, graphics still has some way to go and this card could accommodate such a prospect.
I'll wait to see what this GTX 980 "METAL" is all about before I order a new GPU. With 770 SLI I'm in no real rush, as the 980 Ti just barely surpassed my Fire Strike score (by less than 100 graphics points).
I am just going through the AMD Fury X's reviews as it came out today with a ton of reviews (June 24th, 2015). It is excellent card and would have absolutely dominated the price point if this pesky GTX 980Ti just had not come along a month before AMD's much hyped launch. It cannot be such a coincidence that NVidia's card just happened to be so close in its benchmarks to the AMD's card. It is also such a coincidence that AMD set the price of their card at exactly the same price as NVidia. So my theory is that NVidia managed to get their hands on a reference Fury X and dialed in their 980Ti to match it. In the meantime, AMD was planning on charging a LOT more for their extremely well designed card (with its own built in water cooler and HBM) expecting the Ti to be launched much later this year but was forced to chop the price. If the price was cut, I don't expect the AIB manufacturers for the Fury X to be very pleased to have a lot less profit margin on the card. Three things may hold back the Fury X as well. One is that the Ti overclocks much better. Next is that the water cooler may or may not be welcomed by all considering its size and possible installation issues. Finally, AMD has been getting complaints by many folks over drivers (or lack thereof). Otherwise, a successful launch for both companies.
ComputerGuy2006 - Sunday, May 31, 2015 - link
Well at $500 this would be 'acceptable', but paying this much for 28nm in mid 2015?
SirMaster - Sunday, May 31, 2015 - link
Why do people care about the nm? If the performance is good, isn't that what really matters?
Galaxy366 - Sunday, May 31, 2015 - link
I think the reason people talk about nm is because a smaller nm means more graphical power and less power usage.
ComputerGuy2006 - Sunday, May 31, 2015 - link
Yeah, we also 'skipped' a generation, so it will be even a bigger bang... And with how old the 28nm is, it should be more mature process with better yields, so these prices look even more out of control.Kevin G - Monday, June 1, 2015 - link
Even with a mature process, producing a 601 mm^2 chip isn't going to be easy. The only larger chips I've heard of are ultra high end server processors (18 core Haswell-EX, IBM POWER8 etc.) which typically go for several grand a piece.chizow - Monday, June 1, 2015 - link
Heh, I guess you don't normally shop in this price range or haven't been paying very close attention. $650 is getting back to Nvidia's typical flagship pricing (8800GTX, GTX 280), they dropped it to $500 for the 480/580 due to economic circumstances and the need to regain marketshare from AMD, but raised it back to $650-700 with the 780/780Ti.In terms of actual performance gains, the actual performance increases are certainly justified. You could just as easily be paying the same price or more for 28nm parts that aren't any faster (stay tuned for AMD's rebranded chips in the upcoming month).
extide - Monday, June 1, 2015 - link
AMD will launch the HBM card on 400 series. 300 series is an OEM only series. ... just like ... wait for it .... nVidia's 300 series. WOW talk about unprecedented!chizow - Monday, June 1, 2015 - link
AMD already used that excuse...for the...wait for it...8000 series. Which is now the...wait for it....R9 300 OEM series (confirmed) and Rx 300 Desktop series (soon to underwhelm).NvidiaWins - Thursday, June 18, 2015 - link
RIGHT! AMD has been backpedaling for the last 3 years!
Morawka - Monday, June 1, 2015 - link
The 980 was $549 at release... so was the 780.
Nvidia is charging $650 for the first few weeks, but when AMD's card drops, you'll see the 980 Ti get discounted down to $500.
Just wait for AMD's release and the price will have to drop.
chizow - Monday, June 1, 2015 - link
780 was $650 at launch actually, and stayed there for some 6 months until AMD launched the 290X. The only way Nvidia will drop price on the 980Ti is if Fiji is both faster than it and priced similar, and even then Nvidia may not touch a thing.I think Nvidia knows what AMD has and they've already set the price points for AMD so that they won't have to change their pricing no matter what.
fingerbob69 - Tuesday, June 2, 2015 - link
"The only way Nvidia will drop price on the 980Ti is if Fiji is both faster than it and priced similar" ...and given themselves a month's head start ...and AMD a whole month too look at this Ti and adjust accordingly (if even necessary).I think it's nVidia who's looking weak here. In the UK Scan/Overclockers are really low on AMD stock 290 thru to 290x ...big launch coming?
chizow - Tuesday, June 2, 2015 - link
Yes, I'm sure it was AMD taking a position of strength to allow Nvidia to completely dominate the dGPU landscape for the last 9 months, unopposed, unchallenged since the GTX 970/980 launch, followed by the GTX 960, GTX Titan X, and GTX 980Ti. Makes perfect sense.Everyone expects a big launch from AMD sure, but I guess they are just waiting for Nvidia to tire themselves out first. I mean a quick scan in the US shows you can't find the GTX 980Ti anywhere, sold out instantly in a single day. AMD is just biding their time though for something big to pop out of a hole in the ground! :)
HeavyHemi - Saturday, September 12, 2015 - link
'Hole in the ground' if someone was buried a55 up. Ha...
theuglyman0war - Thursday, June 4, 2015 - link
that would be a shame. It's been a long time since I pulled the trigger on my upgrade itches. Never thought it would be this long. ( gtx 480 tri sli at $399 with the starcraft release discount coupon )I think I assumed that the rise in pricing after gtx 580 was just going to be a short term fluke and that the world would return to sanity with resumption of the $499 pricing.
I imagine I would have probably have upgraded twice otherwise. I wonder if my demographic adds up to a significant lost market at the end of the day?
On the bright side...
Having waited this long, If I start buying used in two years the performance gain will be worth the loss of warranty considering the high cost.
NvidiaWins - Wednesday, June 3, 2015 - link
Agreed. Nvidia has no worries when it comes to AMD's next series of GPUs. AMD will not be able to compete with the 980 Ti's price point.
n13L5 - Tuesday, August 4, 2015 - link
"Just wait for AMD's release and the price will have to drop."Exactly why all those Nvidia fanboys should shut their mouths:
If they were to succeed in maligning AMD to death, handing Nvidia a monopoly, Nvidia would not only lose any reason to ever drop prices, they'd also lose any reason to rush new gaming cards out. They'd put consumer cards at the very back of the queue, to be released at Nvidia's convenience - because Nvidia gets better margins from HPC sales.
Frenetic Pony - Monday, June 1, 2015 - link
Sir! You're trolling is commendable, the people biting, so serious. I salute you.
StevoLincolnite - Monday, June 1, 2015 - link
You're = You are.
In this instance it is "your".
Leyawiin - Monday, June 1, 2015 - link
Their, they're, there - no call for typo nitpicking. He who is free of guilt and all. As for the 28nm bit, it's probably more of a well-poisoning comment by someone who leans towards the other camp.
mapesdhs - Wednesday, June 3, 2015 - link
He was highlighting incorrect grammar, not a typo. Btw, that should have been "it's", not "its", for the same reason. ;D My old English teacher would have used bullets for such errors if such as allowed...mapesdhs - Wednesday, June 3, 2015 - link
...and of course my own typo of 'as' instead of 'was' once again shows how annoying it is that in 2015 we still can't edit our posts on AT. :\FlushedBubblyJock - Wednesday, June 10, 2015 - link
Thank you so much, this is English class after all.
blastlike - Friday, June 26, 2015 - link
LOL you are so retarded. You try being a bitch and correcting others and at the end you fall into your own pit.Pathetic, lol keep it up.Gothmoth - Monday, June 1, 2015 - link
Toll, du beherrscht deine Mutterprache.. [Great, you've mastered your native tongue... what a tremendous feat of intellect...... Americans.....]
Gothmoth - Monday, June 1, 2015 - link
2015 and on Anandtech we are still unable to edit comments.... here is the missing "s".
freedom4556 - Friday, June 12, 2015 - link
...and you made a typo in yours? Or so Google Translate suggests.
shaolin95 - Friday, October 23, 2015 - link
Yes, but the 980 Ti is still basically untouchable, so 28nm or not... ComputerGuy2006... show me something better ;)
Flunk - Sunday, May 31, 2015 - link
lower nm = more transistors in the same space = more performance, and possibly reductions in heat and power usage. There is only so much they can do on the 28nm node, which is why the next generation will likely utterly massacre the current one. 4K will be feasible at a reasonable price point.Refuge - Monday, June 1, 2015 - link
Depending on yields; let's not get too excited yet.
I'll be excited when I get the email from Amazon saying my GTX 1000 Ti is going to be here in two days and that I only spent $500 after rebate. :)
xenol - Monday, June 1, 2015 - link
Transistor count means nothing. The GTX 780 Ti has 2.8 billion transistors. The GTX 980 has around 2 billion transistors, and yet the GTX 980 can dance with the GTX 780 Ti in performance.As the saying goes... it's not the size that matters, only how you use it.
Niabureth - Monday, June 1, 2015 - link
Don't want to sound like a Messerschmitt, but that's 2.8K CUDA cores for GK110, and 2K for GM204. The GK110 has 7.1 billion transistors.
jman9295 - Tuesday, June 2, 2015 - link
In this very article they list the transistor count of those two cards in a giant graph. The 980 has 5.2 billion transistors and the 780ti 7.1 billion. Still, your point is the same, they got more performance out of less transistors on the same manufacturing node. All 28nm means is how small the gap is between identical components, in this case the CUDA cores. Each Maxwell CUDA is clearly more efficient than each Kepler. Also helping is the double VRAM size which probably allowed them to also double the ROP count which greatly improved transistor efficiency and performance.Mithan - Sunday, May 31, 2015 - link
It matters because we are close to .16/20nm GPU's, which will destroy these.dragonsqrrl - Sunday, May 31, 2015 - link
"we are close to .16/20nm GPU's"People said the same thing when the 750Ti launched. I'll give give you one thing, we are closer than we were, but we are not "close".
Kevin G - Monday, June 1, 2015 - link
The difference now is that there are actually 20 nm products on the market today, just none of them are GPUs. It seems that without FinFET, 20 nm looks to be optimal only for mobile.felicityc - Tuesday, January 11, 2022 - link
What if I told you we are on 8nm now?
LemmingOverlord - Monday, June 1, 2015 - link
@SirMaster - The reason people care about the process node is that right now - in mid-2015 - this is an extremely mature (ie: old but well-rehearsed) manufacturing process, which has gone through several iterations and can now yield much better results (literally) than the original 28nm process. This means that it's much cheaper to produce because there are fewer defective parts per wafer (ie: higher yield). Hence ComputerGuy2006 saying what he said.
Contrary to what other people say, "smaller nm" does NOT imply higher performance. Basically when a shrink comes along you can expect manufacturers to do one of two things:
a) higher transistor count in a similar die size, with similar power characteristics when compared to its ancestor - and therefore higher performance
b) same transistor count in a much smaller die size, therefore better thermals/power characteristics
Neither of these factor in architectural enhancements (which sometimes are not that transparent, due to their immaturity).
So ComputerGuy2006 is absolutely right. Nvidia will make a killing on a very mature process which costs them a below-average amount of money to manufacture.
In this case Nvidia is using "defective" Titan X chips to manufacture 980 Ti. Simple as that. Their Titan X leftovers sell for $350 less and you still get almost all the performance a Titan would give you.
royalcrown - Wednesday, June 3, 2015 - link
I take issue with point b) "same transistor count in a much smaller die size, therefore better thermals/power characteristics".
I disagree because the same die shrink can also cause a rise in power density, and therefore WORSE characteristics (especially thermals).
Gasaraki88 - Monday, June 1, 2015 - link
Smaller nm, bigger e-peen.
Oxford Guy - Monday, June 1, 2015 - link
I'm sure you posted that from your 65nm processor and 80nm GPU.
godrilla - Thursday, June 4, 2015 - link
Because the next shrink is in the 3D-chips-plus-HBM category, and that should be an amazing performance leap.
Gastec - Sunday, September 4, 2016 - link
Oh yeah, that's "1337 5p34k" for ya! :PWreckage - Sunday, May 31, 2015 - link
Wait until AMD tries to sell you a rebadge for that much.Rezurecta - Sunday, May 31, 2015 - link
I don't understand the rage about rebadging all of a sudden. GPU makers have been doing this for YEARS!
Anyway, nice card from Nvidia for a decent price.
dragonsqrrl - Sunday, May 31, 2015 - link
Don't mix up traditional rebadging and what AMD is doing with their upcoming lineup, it's unprecedented. It's also worrisome, and it's clearly a big problem for AMD. What does it say about the state of the company and their R&D budget that they'll only have 1 new GPU this coming generation? One of the big problems AMD is currently facing, amongst many other things, are their profit margins. In order to compete they're selling larger, more expensive GPU's attached to more complex memory interfaces relative to the competition, and that won't change much this coming generation as a result of these top to bottom rebadges. The situation is really becoming quite analogous to their CPU's, which should raise alarms for any informed enthusiast. It's a less than ideal situation, to put it lightly.Azix - Sunday, May 31, 2015 - link
If you post this after their launch then maybe I'd understand. As it is their Chips are almost certainly not rebadged because they aren't on TSMC and will include hardware improvements. They may be based on Hawaii or w/e, but not even Nvidia managed to put out more than 2 new cards during their major launch last year. Considering this is likely the last year of 28nm launches, it may make sense for them to put out an entire line of modified chips and be done with the 300 series, keeping it fresh till next year when they can do 2-3 on a smaller process.
dragonsqrrl - Sunday, May 31, 2015 - link
"If you post this after their launch then maybe I'd understand."You realize I could just give the exact same response to the rest of your comment. But I won't.
"As it is their Chips are almost certainly not rebadged because they aren't on TSMC and will include hardware improvements.",
That's interesting, I haven't heard anything about AMD switching fabs for the 300 series. Source?
"They maybe be based on hawaii or w/e but not even nvidia managed to put out more than 2 new cards during their major launch last year."
Not quite a straight comparison there, since Nvidia also launched a new architecture with the 750Ti. Thus far we've gotten 4 new GPU's (not just cards) based on Maxwell. And we're not just talking launch here. All indications point to Fiji as the only new GPU of the coming generation for AMD. And what's more, it might not even be an updated architecture. The rest of the lineup will likely be refreshes of Hawaii, Tonga, Pitcairn, and Bonaire, ranging from GCN 1.2-1.0, with at most manufacturing revisions to improve efficiency. Again it's very important to put this in perspective, in the context of what these GPU's will be competing against. They range in feature set, and in power efficiency, none of which is anywhere close to par with Maxwell, and it's going to be very difficult for AMD to compete with this lineup for another generation. It's not a good situation for AMD. "Keeping it fresh"? How you came to spin your conclusion the way you did is beyond me.
ImSpartacus - Monday, June 1, 2015 - link
Be nice. The guy is basically telling you at this point.
ImSpartacus - Monday, June 1, 2015 - link
trolling***
przemo_li - Monday, June 1, 2015 - link
Fiji is NOT a GPU name.
It's a GPU segment name.
Just like VI, SI, and some more.
It's a chip name, if anything.
If we switched to CPU-speak for a moment, you just claimed that Puma is a single CPU from AMD ;)
Refuge - Monday, June 1, 2015 - link
It is the name of a GPU architecture, which could be a one-run chip, or it could have multiple versions based on binning.
dragonsqrrl - Monday, June 1, 2015 - link
It's not a GPU architecture. GCN is a GPU architecture, Maxwell is a GPU architecture. Fiji is a GPU, GM204 is a GPU. This isn't exactly a new paradigm we're dealing with here. Oh dear, I think I might be telling someone something (aka trolling***). I've done it again.dragonsqrrl - Wednesday, June 3, 2015 - link
First, that's a terrible analogy. Puma is not a CPU, it's a CPU architecture, successor to Jaguar. Fiji is a GPU, I never once assumed or suggested that there will be a single Fiji SKU (that was all you), right now it's likely there will be 2 for the consumer market.I'm honestly not sure what you meant by "segment", perhaps you could clarify? Are you talking about AMD's XT/PRO convention? They're still the same GPU, pro is typically just a harvested XT.
Refuge - Monday, June 1, 2015 - link
They released more than four if you include mobile GPUs; I believe it goes up two more, to six.
dragonsqrrl - Monday, June 1, 2015 - link
What do you mean mobile GPU's? Are you talking about the 900m series? There are no mobile specific GPU's in that lineup. It's all binned GM204 and GM107 SKU's.eanazag - Sunday, May 31, 2015 - link
Interesting observation. I see the same behavior, but the situations are different. Both major graphics vendors are stuck on 28 nm. The 285 is a new product. AMD's graphic situation is not even close to the same as CPU. They are not even competitive in most of the markets for CPU. AMD will likely release a very competitive GPU, which is why NV is releasing the Ti now.dragonsqrrl - Sunday, May 31, 2015 - link
Yes, the 285 is a new product, and while it is an improvement and a step in the right direction, Tonga just isn't enough to address the issue of AMD's profit margins this coming generation, or make them anymore competitive in mobile (M295X). It would be as though Nvidia were selling the 980 at the $200 price point. Not exactly, but from a memory interface, die size, PCB complexity, power consumption perspective, that's basically what AMD is doing right now, with no solution forthcoming. But I guess it's better than selling Tahiti for $200."AMD's graphic situation is not even close to the same as CPU. They are not even competitive in most of the markets for CPU. AMD will likely release a very competitive GPU, which is why NV is releasing the Ti now."
Some might argue they aren't competitive in the dGPU market with Nvidia market share approaching 80%... some might say that's like Intel levels of dominance...
And I didn't say it's the same as their CPU situation, I said it's becoming more similar. While AMD will likely be competitive in raw performance, as I've tried to explain in my past 2 comments, that's kind of beside the point.
chizow - Sunday, May 31, 2015 - link
Yes, both are stuck on 28nm, but only Nvidia came out with a comprehensive ASIC line-up knowing we'd be stuck here for another 2 years (going back to 2H 2014). It is obvious now that AMD's cost-cutting in staff and R&D is starting to manifest itself as they simply can't keep up while losing ground on 2 fronts (CPU and GPU).The culmination of this will be AMD going to market with a full series of rebrands of mostly old parts going back to 2011 with a single new ASIC at the top of their stack, while Nvidia has fully laid out its arsenal with GM107 (1 SKU), GM206 (1 SKU), GM204 (2 SKU), and now GM200 (2 SKU).
Kevin G - Monday, June 1, 2015 - link
Is it unprecedented? I recall the Geforce 9000, GTS 100 and most of the GTX 200 series being various rebrands from 'generation' to 'generation'. In fact, the 8800GTS 512 MB, 9800GTX, GTS 150 and GTS 250 were all the same chip design (the GTS 250 had a die shrink but was functionally unchanged).nVidia has gotten better since then, well with the exception of the GF108 that plagued the low end for far too long.
chizow - Monday, June 1, 2015 - link
Yes, its unprecedented to launch a full stack of rebrands with just 1 new ASIC, as AMD has done not once, not 2x, not even 3x, but 4 times with GCN (7000 to Boost/GE, 8000 OEM, R9 200, and now R9 300) Generally it is only the low-end, or a gap product to fill a niche. The G92/b isn't even close to this as it was rebranded numerous times over a short 9 month span (Nov 2007 to July 2008), while we are bracing ourselves for AMD rebrands going back to 2011 and Pitcairn.Gigaplex - Monday, June 1, 2015 - link
If it's the 4th time as you claim, then by definition, it's most definitely not unprecedented.chizow - Monday, June 1, 2015 - link
The first 3 rebrands were still technically within that same product cycle/generation. This rebrand certainly isn't, so rebranding an entire stack with last-gen parts is certainly unprecedented. At least, relative to Nvidia's full next-gen product stack. Hard to say though, given AMD just calls everything GCN 1.x; like inbred siblings they have some similarities, but certainly aren't the same "family" of chips.
Refuge - Monday, June 1, 2015 - link
Thanks Gigaplex, you beat me to it... lolchizow - Monday, June 1, 2015 - link
Cool maybe you can beat each other and show us the precedent where a GPU maker went to market with a full stack of rebrands against the competition's next generation line-up. :)FlushedBubblyJock - Wednesday, June 10, 2015 - link
Nothing like total fanboy denial.
Kevin G - Monday, June 1, 2015 - link
The G92 got its last rebrand in 2009 and was formally replaced in 2010 by the GTX 460. It had a full three-year life span on the market.
The GTS/GTX 200 series was mostly rebranded. There was the GT200 chip on the high end that was used for the GTX 260 and up. The low end silently got the GT216 for the Geforce 210 a year after the GTX 260/280 launch. At this time, AMD was busy launching the Radeon 4000 series, which brought a range of new chips to market as a new generation.
Pitcairn came out in 2012, not 2011. This would mimic the life span of the G92 as well as the number of rebrands. (It never had a vanilla edition, it started with the Ghz edition as the 7870.)
chizow - Monday, June 1, 2015 - link
@Kevin G, nice try at revisionist history, but that's not quite how it went down. G92 was rebranded numerous times over the course of a year or so, but it did actually get a refresh from 65nm to 55nm. Indeed, G92 was even more advanced than the newer GT200 in some ways, with more advanced hardware encoding/decoding that was on-die, rather than on a complementary ASIC like G80/GT200.Also, at the time, prices were much more compacted at the time due to economic recession, so the high-end was really just a glorified performance mid-range due to the price wars started by the 4870 and the economics of the time.
Nvidia found it was easier to simply manipulate the cores on their big chip than to come out with a number of different ASICs, which is how we ended up with GTX 260 core 192, core 216 and the GTX 275:
Low End: GT205, 210, GT 220, GT 230
Mid-range: GT 240, GTS 250
High-end: GTX 260, GTX 275
Enthusiast: GTX 280, GTX 285, GTX 295
The only rebranded chip in that entire stack is the G92, so again, certainly not the precedent for AMD's entire stack of Rebrandeon chips.
Kevin G - Wednesday, June 3, 2015 - link
@chizow
Out of that GTS/GTX 200 series list, the new chips were the GT200 in 2008 and the GT218 that was introduced over a year later in late 2009. For 9 months on the market, the three chips used in the 200 series were rebrands of the G94, rebrands of the G92, and the new GT200. The ultra low end at this time was filled in by cards still carrying 9000 series branding.
The G92 did have a very long life as it was introduced as the 8800GTS with 512 MB in late 2007. In 2008 it was rebranded the 9800GTX roughly six months after it was first introduced. A year later in 2009 the G92 got a die shrink and rebranded as both the GTS 150 for OEMs and GTS 250 for consumers.
So yeah, AMD's R9 300 series launch really does mimic what nVidia did with the GTS/GTX 200 series.
FlushedBubblyJock - Wednesday, June 10, 2015 - link
G80 was not G92, nor G92b, nor G94, mr kevin g.
Yojimbo - Monday, June 1, 2015 - link
After some research, I posted a long and detailed reply to such a statement before, I believe it was in these forums. Basically, the offending NVIDIA rebrands fell into three categories: One category was that NVIDIA introduced a new architecture and DIDN'T change the name from the previous one, then later, 6 months if I remember, when issuing more cards on the new architecture, decided to change to a new brand (a higher numbered series). That happened once, that I found. The second category is where NVIDIA let a previously released GPU cascade down to a lower segment of a newly updated lineup. So the high end of one generation becomes the middle of the next generation, and in the process gets a new name to be uniform with the entire lineup. The third category is where NVIDIA is targeting low-end OEM segments where they are probably fulfilling specific requests from the OEMs. This is probably the GF108 which you say has "plagued the low end for too long now", as if you are the arbiter of OEM's product offerings and what sort of GPU their customers need or want. I'm sorry I don't want to go looking for specific citations of all the various rebrands, because I did it before in a previous message in another thread.The rumors of the upcoming retail 300 series rebrand (and the already released OEM 300 series rebrand) is a completely different beast. It is an across-the-board rebrand where the newly-named cards seem to take up the exact same segment as the "old" cards they replace. Of course in the competitive landscape, that place has naturally shifted downward over the last two years, as NVIDIA has introduced a new line up of cards. But all AMD seems to be doing is introducing 1 or 2 new cards in the ultra-enthusiast segment, still based on their ~2 year old architecture, and renaming the entire line up. If they had done that 6 months after the lineup was originally released, it would look like indecision. But being that it's being done almost 2 years since the original cards came out, it looks like a desperate attempt at staying relevant.
Oxford Guy - Monday, June 1, 2015 - link
Nice spin. The bottom line is that both companies are guilty of deceptive naming practices, and that includes OEM nonsense.Yojimbo - Monday, June 1, 2015 - link
In for a penny, in for a pound, eh? I too could say "nice spin" in turn. But I prefer to weigh facts.Oxford Guy - Monday, June 1, 2015 - link
"I too could say 'nice spin' in turn. But I prefer to weigh facts."Like the fact that both companies are guilty of deceptive naming practices or the fact that your post was a lot of spin?
FlushedBubblyJock - Wednesday, June 10, 2015 - link
AMD is guilty of going on a massive PR offensive, bending the weak minds of its fanboys and swearing they would never rebrand as it is an unethical business practice.
Then they launched their now completely laughable Gamer's Manifesto, which is one big fat lie.
They broke every rule they ever laid out for their corpo pig PR halo, and as we can see, their fanboys to this very day cannot face reality.
AMD is dirtier than black box radiation
chizow - Monday, June 1, 2015 - link
Nice spin; no one is saying either company has clean hands here, but the level to which AMD has rebranded GCN is certainly unprecedented.
Oxford Guy - Monday, June 1, 2015 - link
Hear that sound? It's Orwell applauding.Klimax - Tuesday, June 2, 2015 - link
I see only rhetoric. But facts and counter points are missing. Fail...Yojimbo - Tuesday, June 2, 2015 - link
Because I already posted them in another thread and I believe they were in reply to the same guy.Yojimbo - Tuesday, June 2, 2015 - link
Orwell said that severity doesn't matter, everything is binary?FlushedBubblyJock - Wednesday, June 10, 2015 - link
I bought a bunch of G80, G92, G92b and G94 Nvidia cards because you could purchase any memory size, bandwidth, bit width, power connector config, essentially any speed at any price point for a gamer's rig, install the same driver, change the cards easily, and upgrade for your customers without hassles... IT WAS A GOLD MINE OF FLEXIBILITY
What happened was, the AMD fanboys got very angry over the IMMENSE SUCCESS of the initial G80 and its reworked cores and totally fluid memory, card size, bit width, and pricing configurations... so they HAD TO TRY TO BRING IT DOWN...
Thus AMD launched their PR war, and the clueless AMD fans launched their endless lies.
I'll tell you this much, no one would trade me a 9800GTX for a 9800GT.
I couldn't get the 92 bit width cards for the same price as the 128 bit
DDR2 and DDR3 also differentiated the stack massively.
What we had wasn't rebranding, but an amazingly flexible GPU core that stood roaring at the top and could be CUT down to the middle and the low gaming end, and configured successfully with loads of different bit widths and memory configs....
64 bit width, 92, 128, 256, 384, 192, ETC...
That was and is an awesome core, period.
BillyONeal - Sunday, May 31, 2015 - link
And people have been bent out of shape about it. For "YEARS" :)dragonsqrrl - Sunday, May 31, 2015 - link
Their highest-end rebadge, the 390X, will likely compete with the 980, not the 980 Ti. The 980 Ti will be closer to Fiji's performance profile.austinsguitar - Sunday, May 31, 2015 - link
I don't think you realize how much more efficient this card is, even compared to past cards, for its nm and performance. This is a feat. Just calm down and enjoy. I am very happy that the card's price is perfect. :) Thanks Nvidia.
MapRef41N93W - Sunday, May 31, 2015 - link
Maybe you aren't aware of how silicon works, but this is a 601mm^2 die, which costs a boatload to produce, especially with the rising cost of crystalline silicon. Being on 28nm this long just means the yields are higher (which is why a 601mm^2 die is even possible).
You aren't going to see a 14nm card that outperforms this by much till 2017 at the earliest, which following recent NVIDIA trends should be the Titan XYZ (whatever they want to call it), and that should be a pretty huge jump at a pretty high price.
Thomas_K - Monday, June 1, 2015 - link
Actually AMD is doing 14nm starting next year:
http://www.guru3d.com/news-story/it-is-official-am...
"Although this was a rumor for a long time now we now know that AMD skips 20nm and jumps onto a 14nm fabrication node for their 2016 GPUs."
dragonsqrrl - Sunday, May 31, 2015 - link
Not sure I understand your comment, 28nm is precisely why we're paying this much for this level of performance in 2015... But it's also pretty impressive for the same reason.Azix - Sunday, May 31, 2015 - link
14/16nm might cost more. 28nm should have better yields and lower cost. These chips do not cost much to make at all (retail price could be 2-3 times the chip cost)dragonsqrrl - Sunday, May 31, 2015 - link
I think you misinterpreted my comment. I was responding to someone who seemed shocked by the fact that price/performance ratios aren't improving dramatically despite the fact that we're on a very mature process. In response I said the fact that we're on the same process is precisely why we aren't seeing dramatic improvements in price/performance ratios."28nm should have better yields and lower cost. These chips do not cost much to make at all (retail price could be 2-3 times the chip cost)"
Yields are just one part of the equation. Die size also plays a significant role in manufacturing costs. The fact that you're trying to say with a straight face that GM200 does not cost much to make says more than your written comment itself.
zepi - Monday, June 1, 2015 - link
Assuming perfect scaling, a 600mm2 28nm chip would shrink to 150mm2 at 14nm (quick check below).
GM107 is a 148mm2 chip, so basically this "monster", with just a die shrink, would find a nice place for itself at the bottom end of Nvidia's lineup after the transition to 14nm.
This does not take into account the fact that at 14nm and 150mm2 they couldn't give it enough memory bandwidth so easily, but it just tells you something about how significant the reduction in size and manufacturing cost is after the initial ramp-up of the yields.
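The area arithmetic above checks out under the stated (and, as the next reply notes, unrealistic) assumption that die area scales with the square of the feature size; a quick sketch:

# Ideal die-area scaling: area shrinks with the square of the feature size.
area_28nm = 601            # mm^2, GM200
scale = (14 / 28) ** 2     # 0.25 under perfect scaling
print(area_28nm * scale)   # ~150 mm^2, roughly GM107 territory (148 mm^2)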
Klimax - Tuesday, June 2, 2015 - link
@zepi: Perfect scaling on non-Intel fabs doesn't exist, as 16/14nm has a 20nm metal layer and thus cannot scale as effectively.
Refuge - Monday, June 1, 2015 - link
Distribution costs them more than manufacturing. lolChaser - Sunday, May 31, 2015 - link
That's pointless. 28nm (with Nvidia at least) is very energy efficient and, as seen in this review, a steal for the power this card delivers. It's a Titan X at $650.00. You're just desperately trying to find anything you can to gripe about.
Oxford Guy - Monday, June 1, 2015 - link
A GPU that expensive only qualifies as a steal if you're Leona Helmsley.mapesdhs - Tuesday, March 12, 2019 - link
2019 says hold my beer. :)Mark_gb - Monday, June 1, 2015 - link
Nvidia must make a living. No matter who they have to run over in the process. You, me, AMD... Doesn't matter. I imagine that at Nvidia, the mantra is "MUST MAKE MONEY! LOTS AND LOTS OF MONEY!", and that it repeats all day long, every day.P. T. Barnum was right... "There's a sucker born every minute."... and we are them.
Michael Bay - Tuesday, June 2, 2015 - link
What a shock that must be, a for-profit corporation going about making profit. How dare they.
mapesdhs - Wednesday, June 3, 2015 - link
Indeed, quite shocking. :)Personally I hope they make oodles of the stuff, so they can reinvest and make even better tech, because, my heavens, that's how private companies function. :D
It's weird how people complain about companies making profits, yet the very existence & continued success of a company depends on profits (unless of course one is AMD and somehow gets away with year after year of losses without going under).
(btw MB, please stop making crap Transformers movies... ;D Sorry, couldn't resist, hehe...)
FlushedBubblyJock - Wednesday, June 10, 2015 - link
The Arabs have been shoring up AMD - it's dirty oil money for the crime that AMD is.
fingerbob69 - Tuesday, June 2, 2015 - link
Yeah... but imagine how pissed you'd be if you'd bought a Titan X in the last couple of months!
I hope that extra 3% feels good!
TheJian - Tuesday, June 2, 2015 - link
Depends on where you live. At a mere 3 hrs per day (easy if more than one person uses it), with a 270W difference even when OCing it, you end up with $75 a year in savings in a place like Australia (rough math below). That ends up being $300 if you keep it for 4 years, more if longer. In 15+ US states electricity is above 15c/kWh (AU is 25.5c), so you'd save ~$45+ a year (at 15.5c; and 14 of those states are quite a bit above this), so again $180 over 4 years. There are many places around the world like AU.
Note it pretty much catches the 295X2 while doing it OCed. It won't put off as much heat either, running 270W less, so in a place like AZ where I live, this card is a no-brainer. Since I don't want to cool my whole house to game (no zoned air, unfortunately), I have to think heat first. With electricity rising yearly here, I have to think about that over the long haul too. TCO is important. One more point: you don't deal with any of the "problem" games on NV where Crossfire does nothing for you. A single chip is always the way to go if possible.
If you have a kid, they can blow those watts up massively during summer for 3 months too! WOW users can do 21hrs on a weekend...LOL. I'd say Skyrim users etc too along with many rpg's that will suck the life out of you (pillars, witcher 3, etc). A kid can put in more time in the summer than an adult all year in today's world where they don't go out and play like I used to when I was a kid. You're shortsighted. Unless AMD's next card blows this away (and I doubt that, HBM will do nothing when bandwidth isn't the problem, as shown by gpu speeds giving far more than ocing memory), you won't see a price drop at all for a while either.
If rumors of an $850-900 price for AMD's card are true, these will fly off the shelf for a good while, unless AMD wins by a pretty hefty margin and drops the watts.
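Plugging the poster's own assumptions (a 270W gap, 3 hours a day, the quoted tariffs) into the same back-of-the-envelope estimate lands on roughly the figures claimed above:

# 270 W extra draw, 3 h/day, for a year, at two assumed electricity tariffs:
kwh = 0.270 * 3 * 365                 # ~296 kWh/year
print(round(kwh * 0.255, 2))          # Australia at ~25.5c/kWh -> ~$75/yr
print(round(kwh * 0.155, 2))          # higher-priced US states -> ~$46/yr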
mapesdhs - Wednesday, June 3, 2015 - link
Add Elite Dangerous, Project Reality, GTA V, the upcoming Squad and various other games to your list of titles which one tends to play for long periods, if at all.
Ian.
Deacz - Wednesday, June 3, 2015 - link
860€ atm :(
ddferrari - Sunday, June 14, 2015 - link
I guess you're one of those people who care more about specs than actual performance. Seriously, is 28nm just too big for ya? It's 3% slower than the fastest gpu on Earth for $650, and you're whining about the transistor size... get a life.Michael Bay - Sunday, May 31, 2015 - link
I'd much rather read about the 960...
pvgg - Sunday, May 31, 2015 - link
Me too...Ryan Smith - Sunday, May 31, 2015 - link
And you will. Next week.just4U - Sunday, May 31, 2015 - link
The 960 is a underwhelming overpriced product.. I'd be more interested in a Ti variant if I was looking to buy right now.. but no.. although that 980Ti is tempting, I'd never purchase it without seeing what AMD is doing next month.Oxford Guy - Monday, June 1, 2015 - link
"The 960 is a underwhelming overpriced product."There you go, Michael, you've read about it.
PEJUman - Sunday, May 31, 2015 - link
A GM206-based 960xx? Or a further cut-down GM204? ;)
My gut feeling tells me it would be GM204-based: I am guessing ~3B transistors on GM206, on a very mature 28nm process, should be relatively doable without many defects.
RaistlinZ - Sunday, May 31, 2015 - link
What more would a review of the 960 tell you that you don't already know, honestly? I'd rather read reviews about interesting products like the 980Ti. People need to let the 960 review go already, geez.Michael Bay - Sunday, May 31, 2015 - link
I only trust AT numbers and am in no hurry to upgrade.God I wish they would compare Baytrail/Cherrytrail to i3s.
Brett Howse - Sunday, May 31, 2015 - link
I did compare Cherry Trail to the i3 SP3 in the Surface 3 review. Was there more you were looking for?Michael Bay - Monday, June 1, 2015 - link
I'm trying to get a cheap small notebook for my father. He is currently on an i3-380UM and the choice is between an N3558 and an i3-4030U. The workload is strictly internet browsing/MS Office.
Not much point in changing anything if performance is going to be worse than it was...
sandy105 - Monday, June 1, 2015 - link
Exactly, it would be interesting to see how much faster than Bay Trail they are.
DanNeely - Sunday, May 31, 2015 - link
DVI may be an obsolescent standard at this point; but 4/5k gaming is still expensive enough that a lot of the people buying into it now are ones who're upgrading from older 2560x1600 displays that don't do DP/HDMI 2. A lot of those people will probably keep using their old monitor as a secondary display after getting a new higher resolution one (I know I plan to); and good DL-DVI to display port adapters are still relatively expensive at ~$70. (There're cheaper ones; but they've all got lots of bad reviews from people who found they weren't operating reliably and were generating display artifacts: messed up scan lines.) Unless it dies first, I'd like to be able to keep using my existing NEC 3090 for a few more years without having to spend money on an expensive dongle.YazX_ - Sunday, May 31, 2015 - link
Dude, the majority are still playing at 1920x1080 and only a few are now making the leap to 2560x1440. I have been gaming at 1440p for two years and am not planning to go 4K anytime soon, since the hardware is still not mature enough to play at 4K comfortably on a single video card.
Thus, DVI is not going anywhere, since dual-link DVI supports 1440p and probably most 1080p gamers are using DVI, unless they have G-Sync or want to use Adaptive V-Sync, in which case they have to use DP. And don't forget that there are many people who bought 27" Korean 1440p monitors that have nothing except DVI ports.
DanNeely - Sunday, May 31, 2015 - link
If you're playing at 1920/60hz this card's massive overkill, and in any event it's a non-issue for you because your monitor is only using a single link in the DVI and you can use a dirt cheap passive DVI-HDMI/DP adapter now; and worst case would only need a cheap single link adapter in the future.My comment was directed toward Ryan's comment on page 2 (near the bottom, above the last picture) suggesting that the DVI port wasn't really needed since any monitor it could drive wouldn't need this much horse power to run games.
FlushedBubblyJock - Wednesday, June 10, 2015 - link
Totally disagree - I game at 1920x1200, the only rez the 980 Ti is capable of without knocking down the eye candy.
Kutark - Monday, June 1, 2015 - link
Exactly. I literally just now upgraded to a 1440p monitor, and i can't even express in words how little of a sh*t i give about 4k gaming. Ive been a hardware nerd for a long time, but when i got into home theater i learned just how much resolution actually matters. 4k is overkill for a 120" projected image at a 15' seating distance. 4k at normal desk viewing distances is way beyond overkill. They've done tests on fighter pilots who have ridiculous vision, like 20/7.5 and such, and even they can't see a difference at those seating distances. 4k is almost as much of a marketing BS gimmick than 3D was for tv's.Anyways im clearly getting angry. But point still stands, every single gamer i know is still on 1080p, i was the first to splurge on a 1440p monitor. And now its put me into a position where my SLI'd 760's aren't really doing the deed, especially being 2gb cards. So, 980ti fits the bill for my gsync 144hz 1440p monitor just about perfectly.
Kosiostin - Monday, June 1, 2015 - link
I beg to differ. 4K at monitor viewing distance is not overkill; it's actually quite pleasantly sharp. Phones, tablets and laptops are already pushing for 2K+ displays, which are phenomenally sharp and out of the league of normal FHD monitors. Gaming at 4K is still not quite here, but when it comes it will blow our minds, I am sure.
Oxford Guy - Monday, June 1, 2015 - link
People who care so much for immersion should be using 1440 with HDTV screen sizes, not sitting way up close with small monitors.
Too bad HDTVs have so much input lag, though.
Kutark - Monday, June 1, 2015 - link
Basically, at a 5' viewing distance you would have to have a 40" monitor before 4K would start to become noticeable. Even with a 30" monitor you would have to be sitting roughly 3.5' or closer to your monitor to be able to begin to tell the difference.
We also have to keep in mind we're talking about severely diminishing returns. 1440p is about perfect for normal seating distances with a computer on a 27" monitor. At 30" some arguments can be made for 4K, but it's a minor difference. It's not like we're going from 480p to 1080p or something; 1440p is still very good at "normal" computer seating distances.
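For what it's worth, here is a minimal Python sketch of the usual acuity rule of thumb behind these distance claims, assuming roughly 60 pixels per degree as the limit of normal vision; the panel sizes and distances below are purely illustrative, and real cutoffs vary per person, as a later comment points out.

```python
import math

def pixels_per_degree(diag_in, h_res, v_res, distance_ft):
    """Angular pixel density of a flat panel viewed head-on."""
    aspect = h_res / v_res
    width_in = diag_in * aspect / math.sqrt(1 + aspect ** 2)   # panel width
    pitch_in = width_in / h_res                                # one pixel's width
    deg_per_pixel = math.degrees(math.atan(pitch_in / (distance_ft * 12)))
    return 1.0 / deg_per_pixel

# ~60 px/deg is a common "can't resolve any finer" rule of thumb;
# the example sizes and seating distances are assumptions, not measurements.
for diag, (hr, vr), dist in [(27, (2560, 1440), 2.5),
                             (27, (3840, 2160), 2.5),
                             (40, (3840, 2160), 5.0)]:
    ppd = pixels_per_degree(diag, hr, vr, dist)
    print(f'{diag}" {hr}x{vr} at {dist} ft: ~{ppd:.0f} px/deg')
```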
mapesdhs - Wednesday, June 3, 2015 - link
Human vision varies as to who can discern what at a particular distance; there are no fixed cutoffs for this. Personally, when wandering around a TV store back in January (without knowing what type of screen I was looking at), the only displays whose visual clarity looked properly impressive turned out to be 4Ks. However, they're still a bit too pricey atm for a good one, with the cheaper models employing too many compromises such as reduced chroma sampling to bring down the pricing, or much lower refresh rates, etc. (notice how stores use lots of static imagery to advertise their cheaper 4K TVs?)
Btw, here's a wonderful irony for you: recent research, mentioned in New Scientist, suggests that long exposure by gamers to high-refresh displays makes them more able to tell the difference between standard displays and high-refresh models, ie. simply using a 144Hz monitor can make one less tolerant of standard 60Hz displays in the long term. :D It's like a self-reinforcing quality tolerance level. Quite funny IMO. No surprise to me though; years working in VR & suchlike resulted in my being able to tell the difference in refresh rates much more than I was able to beforehand.
Anyway, I'm leaving 4K until cheaper models are better quality, etc. In the meantime I bought a decent (but not high-end) 48" Samsung which works pretty well. Certainly looks good for Elite Dangerous running off a 980, and Crysis looks awesome.
Laststop311 - Monday, June 1, 2015 - link
Why would most people be using DVI? DVI is big and clunky and just sucks. Everyone that gets new stuff nowadays uses DisplayPort; it has the easiest-to-use plug.
Crest - Sunday, May 31, 2015 - link
Thank you for including the GTX 580. I'm still living and working on a pair of 580s and it's nice to know where they stand in these new releases.
TocaHack - Monday, June 1, 2015 - link
I upgraded from SLI'd 580s to a 980 at the start of April. Now I'm wishing I'd waited for the Ti! It wasn't meant to launch this soon! :-/
mapesdhs - Wednesday, June 3, 2015 - link
Indeed, one of the few sites to include 580 numbers, though it's a shame it's missing in some of the graphs (people forget there are lots of 3GB 580s around now, I bought ten last month).
If it's of any use, I've done a lot of 580 SLI vs. 980 (SLI) testing, PM for a link to the results. I tested with 832MHz 3GB 580s, though the reference 783MHz 3GB models I was already using I sold for a nice profit to a movie company (excellent cards for CUDA, two of them beat a Titan), reducing the initial 980 upgrade to a mere +150.
Overall, a 980 easily beats 580 SLI, and often comes very close to 3-way 580 SLI. The heavier the load, the bigger the difference, eg. for Firestrike Ultra, one 980 was between 50% and 80% faster than two 3GB 580s. I also tested 2/3-way 980 SLI, so if you'd like the numbers, just PM me or Google "SGI Ian" to find my site, contact page and Yahoo email adr.
I've been looking for a newer test. I gather GTA V has a built-in benchmark, so finally I may have found something suitable, need to look into that.
Only one complaint about the review though, why no CUDA test??? I'd really like to know how the range of NV cards stacks up now, and whether AE yet supports MW CUDA V2. I've tested 980s with Arion and Blender, it came close to two 580s, but not quite. Would be great to see how the 980 Ti compares to the 980 for this. Still plenty of people using CUDA with pro apps, especially AE.
Ian.
mapesdhs - Wednesday, June 3, 2015 - link
Btw Crest, which model 580s are you using? I do have some 1.5GB 580s as well, but I've not really done much yet to expose where VRAM issues kick in, though it does show up in Unigine pretty well at 1440p.
For reference, I do most testing with a 5GHz 2700K and a 4.8GHz 3930K, though I've also tested three 980s on a P55 with an i7 870 (currently the fastest P55 system on 3DMark for various tests).
Mikemk - Sunday, May 31, 2015 - link
Since it has 2 SMMs disabled, does it have the memory issue of the 970? (Haven't read the full article yet, sorry if it's answered there.)
madwolfa - Sunday, May 31, 2015 - link
No, it has full access.
MapRef41N93W - Sunday, May 31, 2015 - link
It has the full ROPs. The memory is tied to the ROPs, which is why the 970 had its issue.
RaistlinZ - Sunday, May 31, 2015 - link
Ryan, did you guys fully test the amount of full-speed VRAM on this 980 Ti? Is all 6GB running at full speed and not just 5.5GB or some such nonsense? Have you tested actual in-game VRAM usage and seen it reach 6GB? Thanks. :)
madwolfa - Sunday, May 31, 2015 - link
http://www.extremetech.com/extreme/206956-nvidia-g... confirmed by Nvidia that full memory access is available
o-k - Sunday, May 31, 2015 - link
that's what they said last time.
FlushedBubblyJock - Wednesday, June 10, 2015 - link
No, they didn't say anything, and 6 months later... This time they said something beforehand, but I'm sure they are lying, so I agree with you.
My tinfoil is failing; one moment, I'm receiving a transmission from Beta Reticuli.
Ah yes, it's confirmed, nVidia is lying, again, the memory is hosed on the 980 Ti...
This message will self-destruct in 5 seconds whether or not you've accepted the mission, o-k.
Ryan Smith - Sunday, May 31, 2015 - link
Yep. We've checked. "Just to be sure we checked to make sure the ROP/MC configuration of GTX 980 Ti was unchanged at 96 ROPs"
None of the ROP/MC partitions have been disabled, and all 3MB of L2 cache is available.
jjj - Sunday, May 31, 2015 - link
Makes the 980 a very hard sell even at $499; they should have dropped it to $449 or even slightly less. The Ti is so much faster and the 970 is so much cheaper.
Yojimbo - Sunday, May 31, 2015 - link
I think it should have been dropped to $250, but that's just me. When price premiums are not linear with performance increases, people complain the higher priced card is overpriced, and when they are, people complain the lower priced card is overpriced. Best solution: All cards $0.
jjj - Sunday, May 31, 2015 - link
I wasn't complaining, I was commenting on their strategy, and your childish comment is just inappropriate.
PEJUman - Monday, June 1, 2015 - link
I agree; the fact that AMD's new 3xx is mostly (bar one new GPU) rebrands scares the crap out of me. And Nvidia knows it too: that's why we're getting the bad Witcher 3 GameWorks situation on Kepler, astronomical prices, and generally very 'Apple-like' marketing from Nvidia.
Don't get me wrong, I certainly appreciate the level of refinement that Nvidia brings to the table, but without any answer from AMD, prices are very far from reasonable.
A few years ago I would never have guessed that PC gaming could die due to a single-GPU-supplier situation; nowadays I am a lot less sure...
Yojimbo - Monday, June 1, 2015 - link
In that case why assume that the 980 should have been dropped in price more? Maybe the 980 Ti should have been priced at $700?
The difference between $500 and $650 is palpable. And the performance one requires depends on the monitor one has. What you seem to be saying is you would be willing to pay more than a 30% price premium for a 30% increase in performance, which is usual. But when prices are actually set that way, there always seem to be people complaining the premium card is priced too high, and quoting the price/performance difference as the reason.
chizow - Monday, June 1, 2015 - link
@Yojimbo lol so true, people seem to think price:perf should be perfectly linear and comparable to some bargain bin part at $75, but if that was always the case, we'd all be using 2-3 gen old cards that can't play the games we want to play, today.
dragonsqrrl - Sunday, May 31, 2015 - link
The 980 is $550, not $499. Despite that it still has a similar price/performance ratio to the 980 Ti. So technically it's no worse of a deal than the 980 Ti, but I think the 980 should still drop in price to ~$500 or $450. It should have a better price/performance ratio than the higher-end Ti.
jjj - Sunday, May 31, 2015 - link
The 980 price has been dropped to $499, and the point was that the Ti and the 970 are much better buys, the 970 being way cheaper for little perf loss while the Ti offers a lot more perf and is far better at 4K.
dragonsqrrl - Sunday, May 31, 2015 - link
Ahh, sorry I missed that. However, at $500 the 980 still has a similar price/performance ratio as the 980 Ti. So while I do think it should drop by more, I'm also a bit confused by why people are calling it a terrible buy when it really isn't any more terrible than the Ti at $650.
just4U - Sunday, May 31, 2015 - link
(...sigh) $740 here in Canada.
dragonsqrrl - Sunday, May 31, 2015 - link
:(
o-k - Sunday, May 31, 2015 - link
Could you please make sure this time that the RAM is 384-bit, 6GB total @ 7GHz GDDR5. Please double check.
D. Lister - Sunday, May 31, 2015 - link
No, drop everything in your life AT staff, and effing TRIPLE check, and make sure to provide a notarized video of the process. Anything, ANYTHING at all, that can wash away the salt of the AMD rebadge, C'MOOOOON!
Michael Bay - Monday, June 1, 2015 - link
Oh come on, have some mercy. ^_^
D. Lister - Monday, June 1, 2015 - link
Peace. :)
chizow - Monday, June 1, 2015 - link
Will it change the results of 97% of a Titan X's performance for 65% of the price? If not, why do you care?
Randomoneh - Monday, June 1, 2015 - link
Some users are using applications where they need all the VRAM available @ full speed.
chizow - Monday, June 1, 2015 - link
And for those users there is Titan X and/or Tesla.
mapesdhs - Wednesday, June 3, 2015 - link
They already answered your question. Fact is, the whole 970 RAM issue is totally irrelevant. Nobody has shown any game that exhibits problematic behaviour as a result of how the card works, none of it changes how good the card is based on initial reviews, and anyone doing something that needs close to 4GB RAM is probably in need of greater baseline horsepower than a 970 anyway, so who the hell cares?
The 980 Ti runs at full speed across the board, check the ROP specs, etc.
loguerto - Sunday, May 31, 2015 - link
lesson n° 1: never buy a titan!mapesdhs - Wednesday, June 3, 2015 - link
Unless you're a prosumer and need all the VRAM you can get. Titan X is ideal for AE, various types of GPU-heavy rendering, compute involving SP only, etc. I've seen a guy on another forum saying he loves his Titan X for compute because of its huge RAM.
TheinsanegamerN - Sunday, May 31, 2015 - link
I can't wait to see what third-party cards, with 8-phase VRMs and big coolers, are capable of. As someone with a 1200p screen, this GPU could well serve 5+ years.
chizow - Sunday, May 31, 2015 - link
Great card, performance and price! Nvidia is certainly being very aggressive with pricing this part, all without any competition from AMD!
This is going to put a lot of pressure on anything AMD does in June. As the review stated, AMD's new GPU has a pretty tough act to follow for a $650 mini-Titan X and as we have already seen, AMD won't be competitive in the $500 and lower price point if they come to market with a stack of rebrands.
jjj - Sunday, May 31, 2015 - link
lol, they have very high margins on the big cards; they could easily sell this at $500 (where perf per price would be close to the 970) and still have good margins. If AMD has a big enough card, all they need is the will to be price competitive.
These are not products for people who look for reasonable value; nobody who looks for that would ever pay $500 for a GPU to begin with, so both AMD and Nvidia are just keeping the high margins since volumes can't really go up.
Yojimbo - Sunday, May 31, 2015 - link
I don't know how you can say the value is unreasonable when there is no other way to achieve what these cards achieve. The high end of the graphics market is less sensitive to price than more mainstream segments. Both AMD and NVIDIA are trying to maximize their profits over the entire market range.
jjj - Sunday, May 31, 2015 - link
You seem to not understand the term value, and then you explain why the high end cards are poor value.
Value and competitiveness are 2 different things. One is perf per price and the other is how the product relates to its competitors. Yes, the high end is less price sensitive (something my first comment agreed with) and the cards are poor value; that's why I found it amusing that someone thinks the perf per price is great, when it never is in this category.
Daroller - Monday, June 1, 2015 - link
"You do not seem to don't understand the term value, and then you explain why the high end cards are poor value."Oh, that's rich.
Value in this case is a subjective term. Each consumer defines the value of a product to them. It isn't something you can put on a chart, or quantify, unless that's how they choose to define it for themselves.
I own two TITAN X in SLI to drive a 1440p display. You'd probably call that "poor value". You'd be correct... for yourself. You'd be woefully incorrect if we're discussing what I consider "value", because I considered the price/performance ratio to be perfectly acceptable, and moreover, a "good deal" to eliminate all possibility of VRAM related stuttering. All this while giving me the same number of shader cores as a tri-SLI 980 setup with better scaling because it's only two cards. A resounding value in my books.
Refuge - Monday, June 1, 2015 - link
lol, you lost all credibility with anyone here after you said you were using Titan X's in SLI because you are worried about running out of VRAM and stuttering...I wouldn't call that poor value, I would just call that retarded.
Yojimbo - Monday, June 1, 2015 - link
And yet despite his lack of "credibility" everything he said was completely correct. What does that say about your ability to judge?
Kutark - Monday, June 1, 2015 - link
This whole thread could have been fixed if the guy said "objectively" instead of just "value".
Yes, what constitutes a good value varies from person to person. Anyone with a brain can infer what he was trying to say in his argument (which is absolutely correct): that at this price bracket, people aren't buying cards based on price/perf. So, whether or not the price/perf is comparable to lower cards is irrelevant. They price according to what the market will bear. If they're selling them as fast as they can make them, then they're selling them at the right price.
Yojimbo - Tuesday, June 2, 2015 - link
No. Especially since he replied back to me and ridiculed me for not knowing what "value" is. That by itself is enough of a refutation of your "just try to guess what he meant" argument. But if he really did mean what you think he did, his post is irrelevant, because value is the proper metric, and it is not "objective". That really is the entire point, and why something needed to be said.
Yojimbo - Tuesday, June 2, 2015 - link
And one more thing. The fact that they aren't buying the cards to maximize "price/perf" is blatantly obvious, and just as blatantly irrelevant. The problem I'm having is, and I could be wrong here, that you and he both seem to be convinced they SHOULD be buying the cards on "price/perf."
mapesdhs - Wednesday, June 3, 2015 - link
I'd buy you a beer for posting that if I could. 8) It's the perfect summation of what I've said so often, namely an item is only ever worth what someone is willing to pay. It's funny how people can get so offended that someone else can afford and is happy to buy a far better config than they do; really it's just hidden jealousy IMO. Either way, kudos for that rig, and please post some 3DMark bench links! 8) Actually, you should buy the beer, you can afford it, hehe...
I wonder if you have a similar MO to me, I like to max out visual settings for the games I play, modding if need be to improve visuals. I hate scenery popping, etc.
FlushedBubblyJock - Wednesday, June 10, 2015 - link
Thumbs up daroller; 1920x1200, and only the 980 Ti is capable of driving it properly without eye candy loss and fps failures.
People claim I'm crazy, but then I never have to worry about my settings and I can seamlessly choose and change and view and analyze, and I'm never frustrated "having to turn down the settings" to make things playable.
The rest of the world stretches livability to the limit and loves stressing everything to the max and grinding it all down to pathetic perf, all the while claiming "it's awesome!"
In other words, stupidity is absolutely rampant.
Yojimbo - Monday, June 1, 2015 - link
Value is not performance per price. Value is what benefit is achieved by the purchase of the product. I'll repeat my previous post by asking how you can assume that purchasing a card has "unreasonable value". If I, as someone who is in the market for a video card, have a range of options to choose from for myself, how can you off-the-cuff judge how much I should be willing to spend to get a better experience (higher resolution, more detailed graphics, smoother game play, etc) from a higher-priced offering compared with a lower-priced offering? You have no idea what my value of those experiences is, so how can you judge whether the cards offer good value or not?
The people buying those high priced cards are buying them because in their minds they are getting more value from them at the higher price than they would be getting from the lower-priced alternatives. Now people don't always make the most accurate decisions. They can be fooled, or they can have misconceived notions of what they are going to be getting, but the point is that they THINK they are getting more value at the time of purchase.
mapesdhs - Wednesday, June 3, 2015 - link
Or they can simply afford it and want the *best* for other reasons such as being able to max out visuals without ever worrying about VRAM issues. It's wrong to assume such purchases are down to incorrect value judgements. You're imposing your own value perception on someone else.
chizow - Sunday, May 31, 2015 - link
Yeah, unfortunately anyone who actually buys high-end GPUs understands price and performance goes out the window the higher you go up the product chain. Nvidia made their value play to the masses with the 970 at an amazing $330 price point, memory snafu notwithstanding, and the card has sold incredibly well.
There was no reason for them to drop prices further, and I think most observers will recognize the $650 price point of the 980Ti is actually very aggressive, given there was absolutely no pressure from AMD to price it that low.
Kjella - Sunday, May 31, 2015 - link
If AMD gets to launch first and nVidia must respond, it seems like they're trading blows. If nVidia makes a preemptive strike now, they make AMD's launch seem late and weak. They know AMD is betting on Win10 driving sales, so they could read their launch plan like an open book and torpedo it out of the gate. I think this will be a miserable month for AMD, it's hard to see how GCN-based cards are going to compete with Maxwell, HBM or not.
chizow - Sunday, May 31, 2015 - link
Yep, Nvidia just pre-emptively torpedoed AMD's product launch and set pricing again. All very impressive how Nvidia has gone about 28nm product launches, despite the uncertainty over whether we'd see anything new before 14/16nm once word came that 20nm was cancelled.
PEJUman - Monday, June 1, 2015 - link
I don't think Nvidia is dumb enough to launch the 980 Ti without knowing where FIJI would lie in their stack. I think this is more of a power play from them, saying 'here's your price point, AMD, good luck'.
Like you said, the fact is they have no competitive pressure on Titan X; why ruin its pricing now if you don't know where FIJI would land?
here's my guess:
Nvidia just torpedoed their Titan X, mainly because FIJI is probably around 97% of Titan X, and AMD was about to ask 850~1000 USD for it. Now Nvidia will launch this 980 Ti at 650 to control the price (which I bet they have been readying for quite some time, simply waiting for the right time/price point).
Peichen - Monday, June 1, 2015 - link
I think you are right. Fuji was estimated to be close to Titan but cheaper by $200. Now that Nvidia has delivered a Fuji-like card for $650, Fuji cannot go above $650. In fact, considering Fuji is limited to 4GB and hot enough to need a watercooled version, Fuji might have to go below $600 with a bundled game, or $550 without a bundle, to make any sense. With the big chip and expensive memory Fuji is using, AMD/ATI's margins on those cards are going to be slim compared with Nvidia's.
PEJUman - Monday, June 1, 2015 - link
yeah... time to buy AMD stock options :)
In all honesty though, I really would like to have them around, if only for the two-horse race...
bloodypulp - Monday, June 1, 2015 - link
For Christ's sake... it's Fiji. NOT Fuji.
Daroller - Monday, June 1, 2015 - link
Likely true, and it's really a sad situation for AMD... which is a bad situation for PC gaming in general. AMD desperately needed NV to price this card $150 - $200 higher.
xthetenth - Monday, June 1, 2015 - link
Look at the reviews of the 980 Hybrid that just came out. A watercooled version is a very good thing even on a less than 300W card. A watercooled Fiji is going to be putting out way less heat than a 295X even if it's overclocked, and compare the 295X temp and noise to an OC 980 Ti. Using a much better cooler that should allow some great OC performance is not a guarantee of weakness.
andychow - Monday, June 1, 2015 - link
"preemptive" seems like a strong word. AMD was supposed to release the 3XX series in February, then March. Then it was "comming soon". We're in June, it's still not out. And AMD makes a silly youtube video "It's Coming", then Nvidia releases the 980 Ti before them!AMD drops the ball, again.
FlushedBubblyJock - Wednesday, June 10, 2015 - link
Now that's funny - all those Nvidia strategist posters... their pants have fallen below the ankles and over and off their feet.
StealthGhost - Sunday, May 31, 2015 - link
Any reason why the GTX 970 is being left out of the charts and Bench GPU 2015?
Ryan Smith - Sunday, May 31, 2015 - link
Bench results are compiled as we test cards for articles. We've had no need to test GTX 970 for any articles yet this year, so its results are not yet in Bench '15.
takeship - Monday, June 1, 2015 - link
970 performance can be inferred from the charts anyways. Step one: look at 980 perf, step two: subtract 10~15%. Method valid up to 1440p, above that 970 chokes on the VRAM requirements. Alternatively, take 290 perf, and add 10~20%, depending on whether the game is TWIMTBP or not.
octiceps - Sunday, May 31, 2015 - link
The larger-than-expected gap in pixel fillrate suggests that 980 Ti has a partially disabled ROP/MC partition and segmented memory just like 970. Could AT please investigate this?
I don't care what GPU-Z says right now. It was wrong about 970 at first and it could be wrong now.
Ryan Smith - Sunday, May 31, 2015 - link
Yep. We've checked. "Just to be sure we checked to make sure the ROP/MC configuration of GTX 980 Ti was unchanged at 96 ROPs"
None of the ROP/MC partitions have been disabled, and all 3MB of L2 cache is available.
octiceps - Monday, June 1, 2015 - link
Did you check with CUDA deviceQuery and Nai's Benchmark?
Ryan Smith - Monday, June 1, 2015 - link
Yes on the former, no on the latter.
will54 - Monday, June 1, 2015 - link
Remember that the 980 and 970 both used the same amount of RAM, with the 970 having some ROPs cut, making it impossible to have the full 4 GB at full speed. This is a different situation, with the 980 Ti and Titan X both having the full 96 ROPs and the 980 Ti using half the RAM. Two completely different situations!
octiceps - Monday, June 1, 2015 - link
Besides the fact that Titan X has twice the DRAM chips per ROP/MC partition, how is it different? Anyway I won't belabor the point as Ryan has already confirmed.
BillyONeal - Sunday, May 31, 2015 - link
Typo: On the BF4 page the last sentence should probably say "It in fact" rather than "In fact".
rpg1966 - Sunday, May 31, 2015 - link
Why do people care if it uses the whole 6GB or not (and apparently it does)? It's completely and utterly irrelevant to 99% of users. If the card has the performance you want at the price you're prepared to pay, the memory situation is irrelevant.
chizow - Sunday, May 31, 2015 - link
Idk, I think it is a fair question, and the article covers it pretty well. The current-gen consoles pushed up VRAM requirements significantly for this generation of games, and while 3-4GB was generally viewed as enough last-gen, that quickly changed when games started using 4+GB at the resolutions (1440p and higher) and settings (MSAA, max textures etc) someone paying $650 would expect their GPU to handle.
12GB will almost certainly be overkill, 6GB is probably the minimum to hold you over till 14/16nm, 8GB would be just right, imo.
rpg1966 - Monday, June 1, 2015 - link
Again, all but irrelevant. It performs as you expect (as per reviews), or it doesn't?
chizow - Monday, June 1, 2015 - link
No, it's not irrelevant if that means tomorrow's games aren't performing relative to what you saw in reviews today because it's hitting its VRAM limit. Knowing how close you are to that limit at the settings you play at helps you gauge and understand whether it will be enough for longer than a few months.
Daroller - Monday, June 1, 2015 - link
All true, but I think what he's getting at is that it shouldn't materially affect your purchase decision. You don't have an alternative at this price point, even if it is partially gimped. AMD doesn't currently have anything at this price point which performs this way either. It's this or TITAN X for $350.00 more. Pick your poison.
It's all moot; there are no hobbled ROPs on this card.
chizow - Monday, June 1, 2015 - link
Eh...if I knew for sure games at 1440p were already using, say 5.5-6GB of VRAM, that would materially affect my purchase decision to buy 2 of these cards, or stick to a single Titan X and look at picking up a 2nd.
But, I know most of my games at 1440p are using <4GB and the most demanding ones are using 4-5GB max, so I feel pretty good about 6GB being enough.
Laststop311 - Monday, June 1, 2015 - link
How is 6GB the minimum RAM needed till FinFET GPUs? Even at 1440p with max settings no game requires 6GB of RAM. Even if a game can use 6GB, the way some games are programmed they just use up extra RAM if it is available, but that used RAM isn't crucial to the operation of the game. So it will show high RAM usage when in reality it can use way less and be fine.
You are overly paranoid. 4GB of RAM should be just fine to hold you off a year or 2 till FinFET GPUs come out, for 1440p res. If you are smart you will skip these and just wait for 2H 2016, when 14/16nm FinFET GPUs are going to make a large leap in performance. That generation of GPUs should be able to be kept long term with good results. This is when you would want an 8GB card to keep it running smooth for a good 3-4 years, since you should get a good lifespan with the first FinFET GPUs.
chizow - Monday, June 1, 2015 - link
Again, spoken from the perspective of someone who doesn't have the requisite hardware to test or know the difference. I've had both a 980 and a Titan X, and there are, without a doubt, games that run sluggishly, as if you are moving through molasses, as soon as you turn up bandwidth-intensive settings like MSAA, texture quality and stereo 3D and hit your VRAM limits, even with the FRAPS meter saying you should be getting smooth frame rates.
With Titan X, none of these problems, and of course VRAM shoots over the 4GB ceiling I was hitting before.
And why would I bother to keep running old cards that aren't good enough now and wait for FinFET cards that MIGHT be able to run for 3-4 years after that? I'll just upgrade to 14/16nm next year if the difference is big enough, it'll be a similar 18-24 month timeframe when I usually make my upgrades anyways. What am I supposed to do in this year while I wait for good enough GPUs? Not play any games? Deal with 2-3GB slow cards at 1440p? No thanks.
Refuge - Monday, June 1, 2015 - link
So you are saying I shouldn't be asking questions about something I'm spending my hard-earned money on? And not a small sum at that?
You, sir, should buy my car; it is a great deal, just don't ask me about it. Because that would be stupid!
Yojimbo - Monday, June 1, 2015 - link
He's not questioning your concern, he's questioning your criteria.
Peichen - Sunday, May 31, 2015 - link
Why is the most popular mid-high card, the GTX 970, not on the comparison list? It is exactly half the price of the 980 Ti and it would be great to see if it is exactly 50% of the speed and uses half the power as well.
dragonsqrrl - Sunday, May 31, 2015 - link
It's definitely more than 50% the performance and power consumption, but yes it would've been nice to include in the charts.
PEJUman - Monday, June 1, 2015 - link
Ryan's selection is not random. It seems he selects the likely upgrade candidates & nearest competitors. It's the same reasoning why there is no R9 290 here: most 970 and R9 290 owners probably know how to infer their card's performance from the un-harvested versions (980 and 290X).
Granted, it's odd to see the 580 here; the 970 would be more valuable technically.
mapesdhs - Wednesday, June 3, 2015 - link
Plus, most requests I've seen on forums have been for 970 SLI results rather than a 970 on its own, as 970 SLI is the more likely config to come anywhere near a 980 Ti, assuming VRAM isn't an issue. Data for 970 SLI would thus show where in the various resolution/detail space one sees performance tail off because it needs more than 4GB.
bloodypulp - Monday, June 1, 2015 - link
The 295X2 still crushes it. But blind Nvidia fanboys will claim it doesn't matter because it is either a) not a single GPU or b) AMD (and therefore sucks).
PEJUman - Monday, June 1, 2015 - link
I own 290 CrossFire currently, previously a single 780 Ti. Witcher 3 still sucks on my 290 CF, as well as on the 295X2. So... it depends on your game selection. I also have to spend more time customizing most of my games to get the optimal settings on my 290 CF than on my 780 Ti.
kyuu - Monday, June 1, 2015 - link
Witcher 3 runs just fine on my single 290. Is it just the CrossFire profile? Do you have the new driver and latest patches? Also, have you turned down tessellation or turned off HairWorks?
PEJUman - Monday, June 1, 2015 - link
4K... I was hoping my U28D590D would have FreeSync, but alas... no such luck. I am very sensitive to stutter; it gives me motion sickness, to the point I have to stop playing. :(
Limiting HairWorks to 8x does help, but I really dislike the hair without it. I'd rather wait for 15.5.1 or 15.6. I have other games to keep me busy for a while.
I can get 45 avg if I drop to 21:9 ratio using 2840 x 1646, but even then I still get motion sickness from the occasional drops.
chizow - Monday, June 1, 2015 - link
Yes, CrossFire support in TW3 has been broken from Day 1; it's a well-known issue. AMD hastily released a driver last week with a CF profile, but it's virtually unusable as it introduces a number of other issues with AA and flickering icons.
PEJUman - Monday, June 1, 2015 - link
15.5 no longer flickers with or without AA. Still slow though.
chizow - Monday, June 1, 2015 - link
Are you sure? Did they release a follow-up to the 15.5 Beta? Because the notes and independent user feedback stated there was still flickering:
*The Witcher 3: Wild Hunt - To enable the best performance and experience in Crossfire, users must disable Anti-Aliasing from the games video-post processing options. Some random flickering may occur when using Crossfire. If the issue is affecting the game experience, as a work around we suggest disabling Crossfire while we continue to work with CD Projekt Red to resolve this issue
Peichen - Monday, June 1, 2015 - link
The 295X2 is indeed faster, but it also uses twice as much power. You have to take the 1000W PSU into account, as well as one or two additional 120mm fans that are needed to get the heat out of the case. When you add up all the extra cost for PSU, fans, electricity, noise and stutter against an overclocked 980 Ti (last few pages of the review), the slight speed advantage isn't going to be worth it.
Also, Maxwell 2 supports DirectX 12; I am not so sure about any of the current AMD/ATI cards since they were designed in 2013.
xthetenth - Monday, June 1, 2015 - link
You don't have to buy a new PSU every time you buy a high TDP card, but otherwise a valid point. Going multi-GPU for the same performance requires a much bigger price difference to be worth it vs. a single card.
Kutark - Monday, June 1, 2015 - link
Basically you're gonna spend an extra $5/mo on electricity with that card, or $60/yr vs a 980 Ti. That's actually pretty huge. That's at 4 hrs/day of gaming, at an average of 12c/kWh. If you game 6 or 7 hours a day, it's even worse.
These high power cards are a little ridiculous. 600W just for one video card?!!
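A quick sketch of the arithmetic behind that estimate; the ~350 W extra wall draw, the 4 hours/day of gaming and the 12c/kWh rate are assumptions plugged in for illustration, not figures from the review.

```python
def extra_monthly_cost(extra_watts, hours_per_day, usd_per_kwh=0.12, days=30):
    """Added electricity cost (USD/month) from a card drawing `extra_watts` more under load."""
    kwh = extra_watts / 1000 * hours_per_day * days
    return kwh * usd_per_kwh

# e.g. assuming a 295X2 pulls roughly 350 W more at the wall than a 980 Ti, 4 h/day:
print(f"${extra_monthly_cost(350, 4):.2f}/month")   # about $5/month at 12c/kWh
```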
Daroller - Monday, June 1, 2015 - link
I had a GTX690, and I run SLI TITAN X. I've been running dual GPU setups for as long as they've been available. Dual GPU IS a hindrance. You'd have to be blind, stupid, or a rabid fanboy to claim otherwise. The 295x2 isn't exempt from that just because you dislike NV and harbor a not so secret love for AMD.
Laststop311 - Monday, June 1, 2015 - link
Yeah, because using dual GPUs just sucks. It just adds a bunch more complexity and problems to everything. Always get one of the largest, fastest single GPUs you can get.
Kutark - Monday, June 1, 2015 - link
I guess if by crush you mean thermally crush. Then yes, you're absolutely correct. I mean, why not have a portable nuclear reactor nearby to power your video card!
mapesdhs - Wednesday, June 3, 2015 - link
Hope you enjoy your power bill, heat, worse stuttering, etc., and the numerous CF fails for all sorts of scenarios. I checked some forums, lots of moans about 15.5 for CF support.
poohbear - Monday, June 1, 2015 - link
Why didn't you include Witcher 3 in the benchmarks? It's the latest graphics-intensive game for sure & looks gorgeous!
kyuu - Monday, June 1, 2015 - link
It JUST came out. A significant portion of this review was probably carried out before it was even released. Not to mention it's already had two patches that affect performance substantially.
Oxford Guy - Monday, June 1, 2015 - link
It looks horribly watered-down compared with the earlier demo, probably because they wanted so much to keep the VRAM requirements really low.
chizow - Monday, June 1, 2015 - link
@Ryan Any word on when we might see some SLI results? I know this is generally dependent on limited review samples, but vendors will probably start sending you cards soon, yeah?
Ryan Smith - Tuesday, June 2, 2015 - link
Not any time soon.
NA1NSXR - Monday, June 1, 2015 - link
Those revised 970/980 prices are stingy.
Daroller - Monday, June 1, 2015 - link
HAHAHAHAHA website lagged out and triple posted. That's awesome. Go go Google Chrome!
naxeem - Monday, June 1, 2015 - link
I don't think the overclocking part is all right. On the stock air cooler (albeit at 80% fan speed, unless using an Accelero IV at 40%), Titan X cards easily get to 1300/1440 normal/boost clocks. The same cards on water got to 1375/1500 with a cool-ish 55°C at max load. That applies to two Titan Xs in SLI with a modified BIOS that allows for more power consumption and thus removes the artificial limit.
Since the chip is identical and the 980 Ti is actually a partially defective Titan X with 50% less RAM and the defective parts switched off, I highly doubt the clock potential differs, especially not in favor of the 980 Ti.
I would and do expect the 980 Ti to clock the same as the Titan X (losing some on chip quality, gaining some on half the VRAM).
FlushedBubblyJock - Saturday, June 13, 2015 - link
Nope. Other test sites show the opposite - the 980 Ti is an overclock monster and beats the TX.
truongpham - Monday, June 1, 2015 - link
Ryan, can you bench this one with Windows 10 and DX12?
Ryan Smith - Monday, June 1, 2015 - link
We won't be doing any complete Win10 benchmarking until that OS is finished and released. As for DX12, there are no games out yet that are using it; the handful of benchmarks are focused tech demos.
cknobman - Monday, June 1, 2015 - link
Nvidia must have seen some undisclosed AMD benchmarks, gone into panic mode, and rushed a release for the 980 Ti to get customers before the AMD launch.
While it's a great card, the problem is Nvidia screwed some of their own customers.
I take this as a sign that whatever AMD is coming out with must be pretty good. :)
galta - Monday, June 1, 2015 - link
Maybe, but it could prove to be of little importance.
You see, Win10 will be out on June 29th. Realistically speaking, DX12 games won't be real before Christmas or 2016.
It is more than enough time for a possible counterstrike from nVidia.
Having said that, unless one really really needs to upgrade now I would strongly recommend waiting for another month, just to check what Fiji is up to.
As of me, I have a pair of 980GTX Strix and have been with nVidia for a while, but I really hope AMD gets this one right.
Real competition is always good.
JayFiveAlive - Monday, June 1, 2015 - link
I've been waiting for this beast to drop... now to decide whether it's a good time to bite.
Current setup is a 2500K OC'd to 4.4GHz and a GTX 670, so kinda oldish... Was considering upgrading to a Skylake proc come Sept and this 980 Ti, but probably the Gigabyte variant... hmmm.
Peichen - Tuesday, June 2, 2015 - link
Why upgrade the CPU? A 2500K at 4.4GHz is still very fast and shouldn't affect the performance of a 980 Ti much. Maybe 10% less fps vs. a 6-core Extreme, but why spend $300-400 to get a 10% improvement?
mapesdhs - Wednesday, June 3, 2015 - link
Plus if he does need some more CPU oomph, just put in a 2700K. I've built six so far, every one of them happily runs at 5GHz with just a decent air cooler & one fan for quiet operation, though for final setups I use an H80 and two quiet fans. Some games will benefit from more than 4 cores, depends on the game (eg. PvP online FPS can involve a lot of host side scripting, eg. Project Reality, and the upcoming Squad).
True though, 2500K is still very potent, just built a 4.8 setup for a friend. She lives on an island, it'll probably be the quickest system for miles around. :D
douglord - Monday, June 1, 2015 - link
I need to know if the 980 Ti can output 10-bit color correctly. Is it ready for UHD Blu-ray?
dragonsqrrl - Monday, June 1, 2015 - link
To my knowledge only Quadros and FirePros output 10-bit color depth.
johnpombrio - Monday, June 1, 2015 - link
Any card that can do true RGB color schemes is NOT MEANT for normal users. It brings a lot of drawbacks for games and normal tasks. These types of cards are for graphics professionals only. Google it to see why.
mapesdhs - Wednesday, June 3, 2015 - link
Indeed, the way colourspaces interact with different types of monitor can result in some nasty issues for accurate colour presentation. For home users it's really not suitable, since so many normal apps & games aren't written to utilise such modes correctly. Besides, I doubt any 4K TVs could properly resolve 10 bits/channel anyway. Funny though that people are still asking about 10-bit colour when pro users were already using 12-bit more than 20 years ago. :D Also 16-bit greyscale for medical/GIS/etc.
johnpombrio - Monday, June 1, 2015 - link
Yikes! That overclock ability! I always buy EVGA's superclocked NVidia cards as they are super stable and have great benchmarks (as well as playing games well, heh). I might buy into this even tho I have a GTX 980.
As for AMD, NVidia has 76% of the discrete GPU graphics card market (and still rising) while AMD has lost 12% market share in the last 12 months alone. Whatever AMD has up for new products, it had better hurry and be a LOT better than NVidia's cards. AMD has tried the "rebadge existing GPU family cards, reduce the price, and bundle games" approach for too long and IT IS NOT WORKING. C'mon AMD, get back into the fight.
mapesdhs - Wednesday, June 3, 2015 - link
True, I kept finding EVGA's cards work really well. The ACX2 980 (1266MHz) is particularly good.
Nfarce - Monday, June 1, 2015 - link
Well I recently upgraded with a second 970 for SLI for 1440p gaming and have them overclocked to 980 performance. It's roughly 15% faster than this single card solution for $700 vs. $650 (7.5% increase in cost). But one thing is for certain: we are still a long time away from realistic 4K gaming with a G-sync 120Hz monitor when those come out. I would much prefer 1440p gaming with max quality and high AA settings and faster FPS matched to screen Hz than detuned 4K settings (even if AA is less meaningful at 2160p).
By the way: are you guys ever going to add Project Cars to your benchmarks? It has rapidly become THE racer to own. Grid Autosport is not really a good benchmark these days because it's just a rehash of the Grid 2 engine (EGO 3.0)...easy on GPUs. Many, including me, haven't touched Autosport since PCars was released and may never touch it again.
mapesdhs - Wednesday, June 3, 2015 - link
Project Cars is one game that runs badly in CF atm (driver issues), which would make the 295x2 look horrible. Might be better to wait until AMD has fixed the issue first.
agentbb007 - Monday, June 1, 2015 - link
A GTX Titan X for $649, DOH BART! Oh well, I've enjoyed my SLI Titan Xs for a few months so I guess that was worth the $700 premium. I keep falling for nVidia's Titan brand gimmick; I also bought the original Titan, luckily just one of them, and ended up selling it for about half what I paid.
Lesson learned, AGAIN: don't buy the Titan brand, wait for the regular GTX version instead.
mapesdhs - Tuesday, March 12, 2019 - link
2019 calling! I wonder if he bought the 2080 Ti or RTX Titan... :}
Laststop311 - Monday, June 1, 2015 - link
The performance difference between the 980 Ti and 980 is WAY larger than the performance difference between the 980 and 970, yet the price gap is larger between the 980 and 970. The 980 was stupidly overpriced at 550 and is still overpriced at 500. It needs to be at the 420-430 mark.
I would be upset if I had just paid 550 for a GTX 980 and now for only 100 more I could basically have Titan X performance.
chizow - Monday, June 1, 2015 - link
And what value do you place on the 9 months that 980 users have been enjoying that level of performance? Again, if you think the 970 is the better deal, it is there for you to buy at $300-330. The 980 was overpriced by maybe $50 at launch, but it still dropped the entire price and performance landscape at the time where 780Ti was still $650+, 290X was $550, 780 was $450 and 290 was $400. In that context, it wasn't so bad, was it?
In reality, Nvidia has no reason to drop the 980 as there is no pressure at all from AMD. All these price cuts are self-induced as they are simply competing with themselves and pre-emptively firing a shot across the bow at $650 with 980Ti.
Oxford Guy - Monday, June 1, 2015 - link
"In reality, Nvidia has no reason to drop the 980 as there is no pressure at all from a card with 3.5 GB of VRAM that, in part, runs at 28 GB/s and has XOR contention."fify
chizow - Monday, June 1, 2015 - link
"In reality, Nvidia has no reason to drop pricing on the 980, as there is no point in threatening the golden calf that may have single-handedly killed AMD graphics, 3.5GB VRAM and all."FTFY ;)
http://store.steampowered.com/hwsurvey/videocard/
NVIDIA GeForce GTX 970 2.81%
AMD Radeon R9 200 Series 0.94%
Oxford Guy - Tuesday, June 2, 2015 - link
I hope you're being paid for all this nonsense.
Michael Bay - Tuesday, June 2, 2015 - link
Oh the pain.
chizow - Tuesday, June 2, 2015 - link
Is it nonsense? I hope you are being paid for posting 3.5GB nonsense?
darkfalz - Tuesday, June 2, 2015 - link
I dunno. I can't really justify an upgrade from my 980 STRIX (which would then replace the beloved 680 in my HTPC) - I was hoping for at least a 40% improvement. Not really worth it for 20-30%. Better off getting another 980 and SLIing it.
darkfalz - Tuesday, June 2, 2015 - link
I'm not sure why they aren't offering Witcher III as well as Batman. Why would a 970 get you two games? Not a great incentive to buy.
SeanJ76 - Tuesday, June 2, 2015 - link
Yeah, this card barely surpasses my 770 SLI, and I mean BARELY! I think I'll pass and wait for another die.
Klimax - Tuesday, June 2, 2015 - link
Just a small bug in your article: the "GRID Autosport" page has one paragraph from the previous page.
"Switching out to another strategy game, even given Attila’s significant GPU requirements at higher settings, GTX 980 Ti still doesn’t falter. It trails GTX Titan X by just 2% at all settings."
As for the theoretical pixel test with the anomalous 15% drop from Titan X, there is a ready explanation:
Under specific conditions there won't be enough power to push those two Raster engines with cut down blocks. (also only three paths instead of four)
Ryan Smith - Wednesday, June 3, 2015 - link
Fixed. Thanks for pointing that out.
bdiddytampa - Tuesday, June 2, 2015 - link
Really great and thorough review as usual :-) Thanks Ryan!
Hrobertgar - Tuesday, June 2, 2015 - link
Today, Alienware is offering 15" laptops with an option for an R9-390x. Their spec sheet isn't updated, nor could I find updated specs for anything other than R9-370 on AMD's own website. Are you going to review some of these R9-300 series cards anytime soon?
Hrobertgar - Tuesday, June 2, 2015 - link
When I went to checkout (didn't actually buy - just checking schedule) it indicated 6-8 day shipping with the R9-390X.
3DJF - Tuesday, June 2, 2015 - link
Ummm....$599 for the R9 295X2?......where exactly? Every search I have done for that card over the last 4 months up to today shows a LOWEST price of $619.
Ryan Smith - Wednesday, June 3, 2015 - link
It is currently $599 after rebate over at Newegg.
Casecutter - Tuesday, June 2, 2015 - link
From most results the 980 Ti offers 20% more at 1440p than a 980 (GM204), and given the 980 Ti costs about 18-19% more than the original MSRP of the 980 ($550), it's really not any big thing.
Given GM200 has a 38% larger die and 38% more SUs than GM204, and you get a 20% increase? It's worse when a full Titan X is considered, which has 50% more SUs, and the Titan X gets perhaps 4% more in FpS over the 980 Ti. This points to the fact that Maxwell doesn't scale. Looking at power, the 980 Ti needs approx. 28% more, which is not the worst, but it is starting to indicate there are losses as Nvidia scaled it up.
chizow - Tuesday, June 2, 2015 - link
Well, I guess it's a good thing the 980 Ti isn't just 20% faster than the 980 then, lol.
CiccioB - Thursday, June 4, 2015 - link
This is obviously a comment by a frustrated AMD fan. Maxwell scales perfectly; you just didn't consider the frequency it runs at.
GM200 has 50% more of every resource than GM204, but those GPUs run at about 86% of GM204's frequency (1075 vs 1250). If you can do simple math, you'll see that if you take any 980 result and multiply it by 1.5 and then by 0.86 (or directly by 1.3, meaning 30% more), you land almost exactly on the numbers the 980 Ti benchmarks show.
Now take the new $500 price of the 980, do the same, and... yes, it is $650 for the 980 Ti.
Oh, the die size... let's see... 398mm^2 of GM204 * 1.5 = 597mm^2, which compares almost exactly with the calculated 601mm^2 of GM200.
Pretty simple. It shows everything scales perfectly in Nvidia's house. Seeing as custom cards are coming, we'll see GM200 going to 50% more than GM204 at the same frequency. Yet those cards will consume a bit more, as expected.
You cannot say the same for AMD's architecture, though: with smaller chips GCN is somewhat on par with or even better than Nvidia in perf/mm^2, but as soon as real crunching power is requested, GCN becomes extremely inefficient in terms of both perf/Watt and perf/mm^2.
If you were trying to plant a doubt about the quality of this GM200 or the Maxwell architecture in general, sorry, you chose the wrong architecture/chip/method. You simply failed.
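A back-of-the-envelope check of that arithmetic; the clocks, prices and die sizes below are simply the figures quoted in the comment above, taken at face value rather than measured.

```python
units_ratio = 1.5            # GM200 quoted as having 50% more resources than GM204
freq_ratio = 1075 / 1250     # quoted typical clocks: GM200 vs GM204

speedup = units_ratio * freq_ratio
print(f"expected scaling: {speedup:.2f}x")       # ~1.29x, i.e. roughly 30% faster

print(f"price check: ${500 * 1.3:.0f}")          # 980 at $500 scaled by ~30% -> ~$650
print(f"die size check: {398 * 1.5:.0f} mm^2")   # GM204 at 398 mm^2 * 1.5 -> ~597 mm^2
```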
Casecutter - Tuesday, June 2, 2015 - link
Nvidia places the Titan X in play just so that logic works... But when you pull the Titan X from the equation and work from the 980 (GM204), a 20% increase in FpS for almost 20% more money, using 28% more power, looks really humdrum.
Kutark - Wednesday, June 3, 2015 - link
You felt the need to post basically the same comment in 2 different places? Regardless, you're cherry picking data. Overall it's about a 30% increase in perf for about a 30% increase in price. It's still a "good" deal if you want a powerful single GPU.
uglyduckling81 - Tuesday, June 2, 2015 - link
I'm still shocked at how much the 295X2 kills it. It's so much more powerful than even a Titan X. Newegg had the 295X2 on sale for $550 two weekends ago as well. CrossFire driver issues aside, if you're in the market for the high end I just don't see how you could go past the much more powerful and cheaper 295X2. If I had been in the USA with that $550 sale going I would have snapped that up so fast. Hell, I would have bought several and sold a couple when I got home. Those cards are still $1600 in Australia.
mapesdhs - Wednesday, June 3, 2015 - link
Really? CF issues aside?? It's a freakin' CF card! What the heck is the point in buying the thing if CF support just doesn't work properly for so many games? And did AMD ever fix DX9/CF issues? Still sucks the last time I tested 7970 CF. Feel free to whack your power bill with the 295x2, spew out heat, etc. Every time I see a crazy extended power usage graph just so the enormous line for the 295x2 can be included, it blows my mind that people ever bother buying it. One person from OZ here commented that heat output is of primary concern where he lives, so chucking out so much heat from a 295x2 would be a real problem. I noticed the same thing with 580 SLI, sooo glad I eventually switched to a single 980.
CiccioB - Thursday, June 4, 2015 - link
The fact that a supposedly more powerful card is sold at a lower price should automatically raise some questions... it is not as if all CrossFire problems magically go away just because the 295X2 is a single card.
'Driver issues apart' is not an option: CrossFire and SLI performance depends heavily on driver quality. And, sorry, but AMD's dual-GPU cards have always been the worst choice since they were created.
Look at what the support for 7990 cards is like. 690 cards are still supported, as you can see in these very benchmarks.
Moreover, if you want a better dual configuration with support done as one would expect for the money spent, you can just buy 2x GTX 970 and live much more happily, consuming much less. The dual-GPU comparison here isn't even worth taking into account: the simplicity, scaling and smoothness of a single GPU like the Titan X or this GTX 980 Ti simply crush the dual-GPU competition without any doubt, even though they average a few FPS less.
NvidiaWins - Wednesday, June 3, 2015 - link
Compared to my EVGA 770 SuperClocked SLI, it only generates a few extra fps and scored just over 100 points higher in Firestrike (770 SLI graphics score: 16,837 / 980 Ti graphics score: 16,900). It's a great single card at a cheap price point, but little improvement over what I currently use.
Zak - Saturday, June 13, 2015 - link
Compare your dual 770 against dual 980ti and then we'll talk...
godrilla - Thursday, June 4, 2015 - link
It seems that Nvidia created the Titan X just to make the 980 Ti seem like a bargain so that gamers will jump at it. Pretty clever.
nadia28 - Thursday, June 18, 2015 - link
Yeah, that's what I thought too. They keep thinking of new marketing strategies to boost sales and this one works like a charm.
CHRAHL - Saturday, June 6, 2015 - link
Ryan! We never saw a review of the GTX 960; will one be published? There is consequently no data for it in Bench. Could you at least upload its performance to the Bench section?
IUU - Saturday, June 6, 2015 - link
Wow, Crysis 3 and Battlefield 4 hitting 80 fps at very high settings at 2560x1440. Clearly there's much room for better graphics at lower resolutions.
I would buy this card, but not if I knew the only benefit would be to run games at higher resolutions; that is, graphics still has some way to go, and this card could accommodate such a prospect.
FlushedBubblyJock - Wednesday, June 10, 2015 - link
the R9 295X2 has completely lost its luster and value, forget it
looper - Sunday, June 14, 2015 - link
This 980 Ti.... Is it the same physical size as my current 780?
NvidiaWins - Thursday, June 18, 2015 - link
I'll wait to see what this 980GTX "METAL" is all about before I order a new GPU. With 770 SLI I'm in no real rush, as the 980 Ti just barely surpassed my Firestrike score (by less than 100 graphics points).
johnpombrio - Thursday, June 25, 2015 - link
I am just going through the AMD Fury X reviews, as it came out today with a ton of reviews (June 24th, 2015). It is an excellent card and would have absolutely dominated its price point if this pesky GTX 980 Ti just had not come along a month before AMD's much-hyped launch. It cannot be such a coincidence that NVidia's card just happened to be so close in its benchmarks to AMD's card. It is also such a coincidence that AMD set the price of their card at exactly the same price as NVidia. So my theory is that NVidia managed to get their hands on a reference Fury X and dialed in their 980 Ti to match it. In the meantime, AMD was planning on charging a LOT more for their extremely well designed card (with its own built-in water cooler and HBM), expecting the Ti to be launched much later this year, but was forced to chop the price. If the price was cut, I don't expect the AIB manufacturers of the Fury X to be very pleased to have a lot less profit margin on the card. Three things may hold back the Fury X as well. One is that the Ti overclocks much better. Next is that the water cooler may or may not be welcomed by all, considering its size and possible installation issues. Finally, AMD has been getting complaints from many folks over drivers (or the lack thereof). Otherwise, a successful launch for both companies.
deteugma - Thursday, July 16, 2015 - link
It is extremely frustrating that none of the charts include the 970.