ImSpartacus - Tuesday, October 13, 2015 - link
Oh man...
Me2011 - Wednesday, October 14, 2015 - link
Rats from a sinking ship?
Me2011 - Wednesday, October 14, 2015 - link
Not to imply Phil is a rat...
Wreckage - Tuesday, October 13, 2015 - link
I wonder if NVIDIA will just buy AMD at this point.
Shadow7037932 - Tuesday, October 13, 2015 - link
Doubtable, as that will likely expose nVidia to antitrust/monopoly issues. I think the most likely candidate to buy them would be Samsung.
5150Joker - Tuesday, October 13, 2015 - link
Samsung is not in the business of buying failures.
ImSpartacus - Wednesday, October 14, 2015 - link
They'd prefer to just poach their talent. :3
Probably cheaper that way.
Mushkins - Wednesday, October 14, 2015 - link
Depends on what you want, the talent or the patents.
MobiusPizza - Thursday, October 22, 2015 - link
AMD would lose the x86 license from Intel if it were purchased by another entity, so I don't think anybody would buy AMD for the patents.
taisserroots - Saturday, October 24, 2015 - link
Not if it's a merger rather than a purchase. Also, AMD holds the 64-bit license, which is the only reason Intel can't make them pay for x86.
Human Bass - Tuesday, October 13, 2015 - link
Funny, I was just thinking these past few days that Samsung buying AMD would be a great, great thing. AMD is really stuck on 28nm, while Samsung is already doing 14nm. Combining the know-how on x86 processors and graphics with Samsung's resources, fabs and technology would bring some real competition to Intel.
Just the fact that Samsung could get AMD processors onto a sub-20nm process, and that AMD GPU/APU tech could be used in Samsung mobile chips, would be a win/win for everybody.
Gigaplex - Tuesday, October 13, 2015 - link
AMD will be fabbing on sub-20nm tech licensed from Samsung with their Zen architecture.
Mark_gb - Tuesday, October 13, 2015 - link
AMD will have 14/16nm CPUs, APUs, AND GPUs within a year, with or without Samsung.
And I for one would prefer that AMD remain an American company.
close - Wednesday, October 14, 2015 - link
AMD is not in the x86 business. And since it's so low-margin I don't think they want in even if the license was transferable. Although for AMD an exception might be made in order for Intel to avoid anti-trust lawsuits and because Intel is licensing x86-64 from AMD.
0razor1 - Wednesday, October 14, 2015 - link
You mean *Samsung*, not AMD?
ImSpartacus - Wednesday, October 14, 2015 - link
Anymore, it's tough to tell, lol...
RU482 - Thursday, October 15, 2015 - link
x86 processors are not low margin.
nunya112 - Wednesday, October 14, 2015 - link
Qualcomm is the only one who can really buy it, for several reasons:
To avoid antitrust issues, like he said.
To retain the x86 license; Intel is more inclined to give it to an American company.
They have the cash.
They've already done business with ATI previously; they bought the Adreno GPU technology.
Compound that with Qualcomm moving into the ARM and server market. Zen would be a good pickup, IF it's any good.
Morawka - Wednesday, October 14, 2015 - link
Don't see how, as Intel, VIA, ARM, PowerVR, and SiS all make GPUs; Nvidia just holds all the good patents.
5150Joker - Tuesday, October 13, 2015 - link
The only thing AMD could offer NVIDIA is patents to hold on to and maybe a couple of engineers. I don't think there's anything else NVIDIA would want at this point; they totally outclass AMD.
Gigaplex - Tuesday, October 13, 2015 - link
A gateway into x86 would be useful. NVIDIA tried that already but Intel refused licensing. It's not clear, though, whether AMD's x86 licenses would transfer in a buyout.
Kvaern2 - Wednesday, October 14, 2015 - link
Yes, it is clear. The cross-patent agreement is not transferable, to anyone.
http://www.sec.gov/Archives/edgar/data/2488/000119...
anubis44 - Wednesday, October 14, 2015 - link
You need to read this:
http://www.kitguru.net/components/graphic-cards/an...
Although technically the current cross-license agreement would terminate, Intel would almost certainly renegotiate with anyone buying AMD, or they'd risk losing rights to AMD64.
JoeMonco - Wednesday, October 14, 2015 - link
Wrong. You didn't read your own link very well:
"However, it should be noted that if the cross-license between AMD and Intel is terminated because a party gets acquired by a third company, licenses granted to another party will survive unless that other party gets acquired too (i.e., if AMD is taken over, Intel sustains rights to AMD’s IP), in accordance with the term 5.2d of the agreement. The same happens if one company gets bankrupt."
Penti - Thursday, October 15, 2015 - link
Both Intel and AMD are public companies, and a change in control of Intel would invalidate the agreement too. Funds and institutions own most of AMD and Intel. The only thing that I could see as out of the question is a merger or a buyout of the x86 business.
Mark_gb - Tuesday, October 13, 2015 - link
Then how is it that AMD keeps inventing things that Nvidia then feels the need to try to copy, but once Nvidia is done, the original AMD version is better on both AMD and Nvidia hardware? TressFX vs. HairWorks, for one.
Or how about AMD being the major player in GDDR5, and, as soon as that was done, starting work on HBM? Where was Nvidia on those? They sure are willing to use the AMD-created technology, though. It is a shame AMD does not make stuff like this free to everyone except Nvidia.
Michael Bay - Wednesday, October 14, 2015 - link
So, a useless gimmick and a freaking memory standard. Wow, such great innovation.
Le Québécois - Wednesday, October 14, 2015 - link
So what? You'd rather nVidia and everyone else not use the new HBM memory standard and keep using GDDR5?
Both AMD and nVidia make amazing new technologies every year or so. Some of those shouldn't be used because they come from AMD? Even when they are the new standard?
Michael Bay - Wednesday, October 14, 2015 - link
The point with broadly accepted standards is that adoption happens across the whole industry. So nV will use HMC regardless of what AMD does, especially considering how little it means to the end user right now.
Trying to parade a normal process as some kind of muh epic win for AMD is just desperation.
Azix - Wednesday, October 14, 2015 - link
Give AMD credit where credit is due. Much of these things are their work. Even DX12 as we know it likely has a lot to do with AMD's work. I understand you might love Nvidia, but just be honest.
Michael Bay - Thursday, October 15, 2015 - link
Oh please, DX was and always will be primarily driven by MS, not specific GPU vendors. It's MS who listened to developers' wishes and then went on to build a new architecture.
asniper - Wednesday, October 14, 2015 - link
Not even GDDR5; AMD had a hand in that as well.
CiccioB - Wednesday, October 14, 2015 - link
AMD invents things that Nvidia wants to copy???? This is the funniest thing I have ever read.
AMD has a history of creating (copying) technology and offering it as a free gift to the market in order to try to limit the gains its competitors make on their own custom technology, which they developed long before AMD ever thought it could be done.
PhysX vs. the Bullet physics library (where is the latter, BTW?)
CUDA vs. lots of failed language extensions until OpenCL (Apple's idea) came along
GameWorks is much more than HairWorks or TressFX
GeForce Experience vs. third-party apps that do the same
G-Sync came a year before the hardware on the market could support such technology (AMD just queued up when others created the hardware to support Adaptive Sync, which AMD then called FreeSync as their own implementation). Still, G-Sync works better.
New fast AA methods have always been copied by AMD.
Nvidia has been using memory compression since GDDR5 bandwidth started to be insufficient for high-end cards. AMD has always been using bigger (and more expensive) memory controllers instead. Now they have understood that memory compression helps.
GDDR5, like any other memory adoption, is NOT AMD's merit at all. It's just a matter of offering your own products as the first to use the new standard technology in order to give the real creator of it better and earlier testing devices. It just means that AMD gains a few months of exclusivity.
Nvidia has never needed to use the latest memory technology to be superior to AMD. They skipped GDDR4 altogether, for example. And HBM v1, which Nvidia proved it could equal with GDDR5 and better memory handling, keeping manufacturing costs lower. They did not even need it to lower their already low power consumption.
Nvidia has always worked with a scalar architecture, while AMD insisted on VLIW for six years, claiming it was a good architecture for GPGPU. Then in the end AMD had to create an architecture very similar to the one Nvidia was developing, improving its GPGPU capabilities but losing a lot of efficiency in 3D work, in terms of both power consumption and die space. It was the opposite when Nvidia was scalar vs. VLIW.
It was AMD claiming "small is better" when using VLIW, while slowly approaching the die area of Nvidia's professional GPUs and in the end greatly surpassing them for the same performance, clearly indicating that they were trailing Nvidia's technological choices, but failing to do better than them.
What are those great inventions AMD put on the table in the last 8 years that Nvidia copied?
I'm speaking about technology coming from AMD's R&D laboratories, not free specifications that have been (partially) implemented by others with the aim of stealing some of the competition's revenue from their custom, closed, expensive, but working solutions (working being fundamental for market adoption!).
I'm speaking about real technology that brought an advantage to AMD and that Nvidia had to copy in order not to be left behind.
As far as I can tell, I can't remember any. But maybe that's my fault. Waiting for a clear list.
Azix - Wednesday, October 14, 2015 - link
Delusional fanboy.
PhysX was bought.
CUDA is just another implementation of GPGPU. Nvidia cannot lay claim to it just because they put out a proprietary version.
GameWorks is not worth mentioning. It took existing effects and black-boxed them. It also runs like crap.
G-Sync was most likely Nvidia cutting in before Adaptive-Sync got approved as a standard, and even if not, the means already existed. Brute-forcing it with an FPGA is not innovation. The hardware already existed; the issue was certification for use with PC GPUs, etc. http://www.vesa.org/news/vesa-adds-adaptive-sync-t... - it's been possible since 2009.
There were already tons of AA methods. Nvidia's own methods just used a combination of things AMD had a decade before.
Nvidia chose to use higher-clocked VRAM and AMD went for higher bandwidth. Both already used compression, e.g. http://www.anandtech.com/show/2231/11 - they just happen to improve the compression when they can. Ignorant people like you assume Nvidia is the only one doing it because a lot of noise was made about the tech when Maxwell 2 launched, probably because those cards had crap bandwidth and the compression was an excuse.
Not really understanding your point regarding GDDR5. It was developed by AMD. http://www.vrworld.com/2008/11/22/100th-story-gddr...
You're just being crazy. There is no reason to go out of your way to deny everything AMD and praise everything Nvidia. It's just crazy.
What Nvidia did to lower their power consumption is partly why they are getting beaten in DX12. They made concessions in other places. AMD has been putting out chips with quite a bit more power than Nvidia's, but it was not taken advantage of due to DX11. I don't think we are quite at the point where the memory bandwidth difference between GDDR5 and HBM makes a significant difference. It's only 512GB/s vs. ~336GB/s. The Fury often matching or beating the 980 Ti at 4K might be the most we see from that currently, as well as maybe CrossFire comparisons.
At this point AMD's hardware is more forward-thinking than Nvidia's. I am not sure about the history of moving to VLIW, but I do know Nvidia took a step back by removing their hardware scheduler with Kepler and Maxwell (for the sake of efficiency). AMD did maintain smaller die sizes after moving away from VLIW when comparing launch years: Hawaii was much smaller than Kepler, and Fiji is smaller than big Maxwell 2.
So you want to cut out a lot of AMD's contributions just because they made them open? It only matters if it's proprietary? How do you steal revenue using free, open technology? Why does it HAVE TO bring an advantage to AMD? Is innovation not innovation?
I doubt anyone will give you a list. Your entire list was rubbish and not something that needs a countering list.
5150Joker - Wednesday, October 14, 2015 - link
AMD still loses in the latest DX12 benchmark. I see the 980 Ti at the top, what do you see? http://www.anandtech.com/show/9659/fable-legends-d...
AMD has hot, large dies that are inefficient; that is why they HAD to use HBM for their Fury line, just so they could cut down on the power draw, and even then it sucks down more power than Maxwell, which is just pathetic. What is more pathetic is that Maxwell beats Fiji in DX11 and DX12.
MobiusPizza - Thursday, October 22, 2015 - link
That's only because you are looking at 4K, which is bottlenecked by memory capacity. First-gen HBM is limited to 4GB, and hence you are not comparing the GPU architecture itself fairly. I can argue equally that the Fury X beats the 980 Ti at 1080p. So what's your point? It's apples to oranges.
bennyg - Thursday, October 15, 2015 - link
... AMD might well be more forward-thinking... but immediate past history shows Nvidia has sold many more products for many years in a row. Warm fuzzy feelings about moving the world forwards don't contribute to earnings or P/E multiples as much as superior products and sales do.
StevoLincolnite - Wednesday, October 14, 2015 - link
AMD holds a few x86 patents as well, which Intel needs.
medi03 - Thursday, October 15, 2015 - link
AMD's 390 is faster, and consumes about 50W (total system) more, than the slower yet similarly priced 970.
And that despite a much more modest R&D budget and all the wonderful "competitive" things that nVidia does with tessellation in GameWorks' libs.
nVidia spends a gazillion to develop proprietary G-Sync that adds $200 to monitor costs, just to be countered in a couple of months by DisplayPort 1.2a/FreeSync, which comes FREE OF CHARGE with most (all?) scaler chips out there. (Effectively, only lazy manufacturers won't enable it in their monitors.)
Totally outclass, yeah.
JoeMonco - Tuesday, October 13, 2015 - link
Why bother? They can buy AMD's patents on the cheap in a couple of years.
Ultracrusher - Tuesday, October 13, 2015 - link
Man, I have so much to say about this... being a computer hardware nerd for 25 years and having built over 3000 personal computers, I hate this change. Hear me out... First off, Intel needs AMD and vice versa. Without AMD being a powerhouse in the CPU business, Intel has no reason to move on price. That means the next-gen processors may not decrease the price of the current generation like in years past. The second thing is that AMD is so close to Chapter 11; just look at their stock, enough said. I would like to see AMD get purchased by Nvidia just for the purpose of keeping Intel's prices at bay. So what do you think?
Mark_gb - Tuesday, October 13, 2015 - link
AMD and Nvidia already have a cross-licensing agreement covering most graphics patents.
JoeMonco - Wednesday, October 14, 2015 - link
Sure, which is why, if they wanted to BUY the patents, they can just wait for the inevitable bankruptcy of AMD and buy them for pennies on the dollar.
anubis44 - Wednesday, October 14, 2015 - link
It's not going to happen, because AMD has over 40 years of Silicon Valley patents, an x86 license, AMD64 technology, and the entire Radeon Graphics division, worth as much as nVidia alone (currently a $12 billion market cap).
Waiting for AMD's bankruptcy? You might as well wait for the second coming of Christ. People have been predicting AMD's imminent bankruptcy for over 40 years, and they're still waiting.
anubis44 - Wednesday, October 14, 2015 - link
You mean from Microsoft, Qualcomm or Apple, when one of them buys AMD? Keep dreaming.
Mark_gb - Tuesday, October 13, 2015 - link
That will never happen. There MUST be competition.
5150Joker - Tuesday, October 13, 2015 - link
Sounds to me like this article is more damage control for AMD than anything else, by the way it's worded. You made it sound like this is some natural transition for both Jim Keller and Phil Rogers, with them wrapping their work up and moving on. A fellow just doesn't wrap up his work after 21 years and jump to his company's arch-rival; when something huge like this happens, it means there's a very dark cloud looming over AMD's future. But hey, at least they can still afford to pay their ineffectual PR + CEO.
JoeMonco - Tuesday, October 13, 2015 - link
Yep, AMD's in the throes of a death spiral.
RussianSensation - Thursday, October 15, 2015 - link
We've heard about AMD going bankrupt for 10 years now; let's keep going. Based on Q3 earnings, it's clear they will have enough cash to survive through 2016 as well. By that point 16nm GPUs will come online, Zen is getting closer (Q4 2016/Q1 2017), and they will most likely start getting $1B of extra revenue from the two custom design wins they announced before (Nintendo NX). The armchair financial experts at AT have been wrong for 10 years in a row, but every quarter we read about DOOM and how AMD is going bankrupt/into a death spiral any quarter now. The funny part is that not a single one of you has the balls to put your money where your mouth is. Anyone who is confident in their predictions would be buying puts or shorting the stock all the way to the bank. Most of it is just empty talk from people who would love nothing more than an Intel/NV monopoly.
Two executives leave and the company is going bankrupt? That's the most insane thing I've read in a long time, considering the same people have touted how no 1-2 people could ever do enough to turn AMD around, but if 1-2 people leave, AMD is dead? You guys need to stop taking your meds.
Yojimbo - Tuesday, October 13, 2015 - link
I agree. Two of their top technology officers leaving is definitely bad news for AMD. It's a brain drain.
medi03 - Thursday, October 15, 2015 - link
Jim Keller doesn't count. Finishing a project and then leaving is what he always does, no matter where he works.
RussianSensation - Thursday, October 15, 2015 - link
They don't like logical responses, since those go against the mantra that AMD will be bankrupt any day now.
hammer256 - Tuesday, October 13, 2015 - link
I really don't want to jump to any conclusions until the 15th...
I think if they execute well on their next iteration of GPUs and CPUs, they should be in reasonably competitive shape...
extide - Tuesday, October 13, 2015 - link
One can only hope...
JoeMonco - Tuesday, October 13, 2015 - link
Sure, people have said that for every new line of CPUs and GPUs. How many times must you go through this game before seeing the writing on the wall?
AS118 - Wednesday, October 14, 2015 - link
While I would have agreed with you in the past (Bulldozer was too big of a risk, and they really shouldn't have dropped the ball on single-core performance), Jim Keller designed the next CPU, and he was on board the last time AMD made a competitive processor.
As for the GPUs, HBM + a process shrink + Raja Koduri being in charge just might make things better. I can only hope, honestly. I don't want an x86 or dGPU monopoly. The only way I can see things continuing if AMD fails is if Intel started making video cards, and Nvidia bought out AMD and started making x86 CPUs or whatnot.
Or, well, someone buying somebody (probably AMD) and finding a way to make sure dGPUs and x86 CPUs are still a two-horse race somehow.
ThreeDee912 - Tuesday, October 13, 2015 - link
Zen is basically AMD's last hope. If it doesn't pan out, there's not much left to carry them. The GPUs are still somewhat competitive, but right now AMD's in a bit of a lull with no real new products while Zen is still getting ready for production.
coburn_c - Tuesday, October 13, 2015 - link
Pfft. So crappy that all you sheep bought your 2500K and then started going "oh, poor AMD"... You put them out of business and then realize it's a bad thing... Lovely.
SilthDraeth - Tuesday, October 13, 2015 - link
I've been buying AMD since I built my first computer 18 years ago. Performance has always been good enough. And hell, back then, when the first Athlons came out, and AMD started x64, and I ran Windows XP x64 Edition, AMD was actually superior to Intel. In fact, they were the first real threat. I firmly believe that had Intel not engaged in anti-competitive behavior back then, essentially locking AMD out of companies like Dell and HP, AMD would have had a lot more market share and income, and today would be a very different day than wondering if AMD is about to go bankrupt.
Gigaplex - Tuesday, October 13, 2015 - link
The illegal actions from Intel definitely did set AMD back, but even if that hadn't happened, AMD would have struggled to produce enough chips to meet demand, as their production capacity was only a fraction of Intel's. The fact that the Core architecture from Intel was such an improvement, and AMD's Bulldozer was a bad design from the start, suggests AMD may not have fared all that much better.
medi03 - Thursday, October 15, 2015 - link
That's more of an excuse ("but they can't make enough chips", I recall Dell said that) than the real reason, as we've seen since.
blppt - Wednesday, October 14, 2015 - link
Also worth noting is that AMD's chipsets, back when their CPUs were competitive with or beating Intel performance-wise, could be kind of flaky. I mean, there was an era when the best/most stable choice for AMD's cheaper, P4-beating CPUs was a VIA chipset, or nForce2. Not sure Dell and the like were willing to risk alienating Intel for a flaky chipset like VIA's KT series, no matter how awesome the Thunderbird, Thoroughbred, etc. were versus the P4.
shaolin95 - Friday, October 16, 2015 - link
Same for me. I had been holding on to AMD since I bought my first T-bird 1.4... I tried as much as I could, with the Phenom II 965 as my last one, until I tried the i7 920, and from there I never turned back. I need performance, not sentiment... AMD just couldn't keep it up, so it was time to move on. But indeed, their lack of progress has also made Intel lazy, and CPU performance has been moving so slowly it's not even interesting to me right now.
Yojimbo - Tuesday, October 13, 2015 - link
Yeah, supply corporate welfare to a poorly performing company with your consumer dollars. Sounds like an all-new type of market inefficiency.
prime2515103 - Wednesday, October 14, 2015 - link
Maybe TI should bring back National Semiconductor so they can bring back Cyrix, so we can buy their CPUs so they don't go out of business again.
phoenix_rizzen - Wednesday, October 14, 2015 - link
Didn't Cyrix merge with Centaur, and then get bought by Via?
Technically, there are three x86 CPU makers: Intel, AMD, and Via.
You just don't hear a lot about Via as they tend to create embedded/SFF builds for specific purposes, as opposed to general purpose desktops or gaming behemoths. Plus, their integrated graphics suck, and their chipset drivers are pretty much Windows-only.
AS118 - Wednesday, October 14, 2015 - link
I don't know where your hostility is directed. I've been buying 90% AMD CPUs for the past 10 years or so. Literally every computer in my house, except for one laptop we bought last year, has been AMD. I have also been buying only AMD GPUs since I switched sides when I got the 3870 (I was Nvidia-only before then).
There are a lot of people who are AMD loyalists (to keep competition going, if nothing else), but the problem is that more people buy Intel/Nvidia. The technology is often better, and the marketing definitely is.
While I want AMD to do better, I have to say that marketing-wise at least, some of their problems are self-inflicted.
at80eighty - Wednesday, October 14, 2015 - link
Pretty much. AMD needs to remain viable or Nvidia becomes a lazy company like Intel, with no real incentive to innovate. Yet this thread is swarming with the green machine, angry at any mention counter to Nvidia. Eh.
RussianSensation - Thursday, October 15, 2015 - link
Exactly. Most of the people claiming AMD is going bankrupt and that competition doesn't drive innovation have probably never taken university-level finance, accounting, and economics classes. With more than $700M of cash on hand and over $300M coming in 2016, AMD is far from going bankrupt. One has to wonder how people that uneducated/lacking a diverse knowledge base even get real-world jobs in other fields.
KenLuskin - Tuesday, October 13, 2015 - link
AMD has stolen numerous top people from Nvidia. All those people who think that this one guy determines the future of AMD are MORONS!
Jim Keller did indeed finish ZEN! His MO is to finish his job and then join another company. Did Apple implode after Keller left them?
HSA has finished most of its job. Now it's up to other companies to produce supporting software.
So, AnandTech is correct!
Nvidia needs Rogers, because they do NOT have an HSA equivalent, but they do have money to pay him.
AMD does NOT need Rogers, because HSA is finished!
To all MORONS, PLEASE put your money where your mouth is... take ALL your money and SHORT AMD ... OK?
Please post your short positions, so I can track YOUR BK!
Duckeenie - Wednesday, October 14, 2015 - link
What has HSA being completed got to do with anything? Phil Rogers was with ATI/AMD for 21 years; he doesn't have a history of moving on when projects finish or of switching jobs for a new challenge. People rarely get up and leave a job after such a long time unless they're worried about their future.
- A Moron.
Michael Bay - Wednesday, October 14, 2015 - link
Oh, so it's up to other companies to produce software for unproven hardware that's not on the market. Truly a great incentive.
Then again, if that's your level of understanding, HSA may have done its job as well.
Azix - Wednesday, October 14, 2015 - link
Software would be made by software developers. AMD is not going to build Photoshop or Sony Vegas to use HSA. There are a few other companies involved with HSA besides AMD, and Rogers' role will be filled. One person does not make the whole thing.
Michael Bay - Thursday, October 15, 2015 - link
Jeezus. Nobody will build ANYTHING for it until it can show a significant market share.
RussianSensation - Thursday, October 15, 2015 - link
"To all MORONS, PLEASE put your money where your mouth is... take ALL your money and SHORT AMD ... OK? Please post your short positions, so I can track YOUR BK!"
They can't and won't. First, most of them are broke middle class, and secondly, even if they had $10-20K to throw around, they have no balls, as it probably took them a year just to save that much.
D. Lister - Wednesday, October 14, 2015 - link
It is just eerie how every few months staunch AMD supporters just disappear from the comments section and are replaced by new names with the same zeal and very similarly worded rhetoric. :p
AS118 - Wednesday, October 14, 2015 - link
I wouldn't know; I don't drop by this site very often. That said, even as an AMD fan, I don't think it's cool to be a fanboy. AMD has lots of flaws, and some of the stuff they do makes me facepalm.
I feel that the last thing AMD or any failing company needs is suck-ups, honestly. Especially when you talk to them in a place where they can hear. I like AMD, so I give them tons of constructive criticism, especially over social media, but I feel like it falls on deaf ears.
I tell them stuff like "You need a low-power card like the 750 Ti for people who just have a 300W PSU and a stock CPU from Dell." If a friend asks me what card to get and they don't have a lot of money and don't want to muck around inside their case that much, I tell them to get a 750 Ti. Where's your counter to that?
Or "If you really wanted to market your 300 series cards and FreeSync, EVERY 300 card should've been FreeSync compatible. The 370 (Pitcairn) shouldn't even be in the lineup, and your marketing spiel should've been something simple like AMD 300 = FreeSync", and stuff like that.
Basically, just some real common-sense stuff, but I feel like while AMD has good tech, their ability to think of the end user and how their customers think and act in real life is very lacking. Also, their marketing in my eyes is very insufficient when it doesn't outright suck and make me wince.
beginner99 - Wednesday, October 14, 2015 - link
" Rogers’ departure comes as a bit of a surprise. Prior to leaving the company, Rogers’ had been with AMD (and ATI before it) for 21 years, serving as a fellow for the last 8 of those years. "Why is this a surprise? AMD for sure is financially constraint and hence NV for sure can offer higher salaries to top people than AMD can.
CiccioB - Wednesday, October 14, 2015 - link
It's probably not a question of salary; it's a question of the work and the opportunities.
If a technician can't be involved in interesting projects, he just loses his enthusiasm for the work he's doing and tries to change jobs.
If you see that your company, where you have worked for 21 years, isn't presenting interesting new opportunities, you just leave. Leaving after 21 years means making a big jump where you really do not know where you are landing (you leave long-time colleagues, friends, and the comfort of the place you have been working at for so long), but if someone is forced to do so, it just means that on this side of the jump nothing is good enough and you want to get to the other side.
LordanSS - Wednesday, October 14, 2015 - link
Right now you are most certainly typing your responses on an x86-64 platform (or an Android/Apple phone or tablet). x86-64 was AMD's doing.
Had Intel gotten their way, we'd all be using Itaniums, without backwards compatibility for most everything at the time. Or worse, still stuck with NetBurst.
GDDR5 and HBM are just a couple of the things that AMD/ATI have come up with in the recent past or were heavily invested in developing.
The final point is: we need competition. AMD/ATI need nVidia, and vice versa, to keep pushing things forward on the graphics and compute front. AMD has had problems on the CPU side, but I understand their bet on HSA. Too bad it took too long to come to fruition, and software support is minimal. Meanwhile, we see Intel being able to just coast along because they don't need to push hard to stay ahead. And that is bad for all of us.
blppt - Wednesday, October 14, 2015 - link
Granted, AMD developing x64 was as big a coup as they have ever managed versus Intel, but who's to say we wouldn't be better off nowadays if Itanium/IA64 had taken over? Sure, Intel's baby has pretty much died off now, but what if Itanium were now the standard (antitrust would likely have forced Intel to license the instruction set to AMD or some other competitor) and had gotten the same capital investment in R&D that x64 got?
Still, there's no question that AMD pretty much single-handedly crushed Itanium. That is an amazing achievement, considering how small they were versus the mighty Intel.
Me2011 - Wednesday, October 14, 2015 - link
HUGE loss for AMD. I know Phil personally and know what an incredibly brilliant man he is. This is a giant blow to AMD, although they will downplay the significance of it.
Magius - Wednesday, October 14, 2015 - link
Mr. Rogers left the neighborhood? :'(