Yojimbo - Friday, August 25, 2017 - link
Seasonal expectations from Q1 numbers would predict about 9 million desktop AIBs sold this past quarter. Instead, about 13 million were sold. That suggests that about 30% of desktop AIBs sold this past quarter are attributable to cryptocurrency mining, potentially more, since elevated prices could be expected to reduce gaming demand. Then AMD's 29.4% market share suggests that either NVIDIA sold significantly more GPUs to cryptocurrency miners than AMD did during the quarter, or that AMD sold remarkably few cards to gamers during the quarter.
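A back-of-the-envelope sketch of the arithmetic behind this estimate, using the approximate 9M and 13M figures quoted above (rough approximations, not exact JPR data):

```python
# Rough estimate of the mining-attributable share of Q2 2017 desktop AIB
# sales. Both inputs are the approximate figures quoted in the comment
# above, not exact JPR numbers.
expected = 9_000_000    # seasonal expectation extrapolated from Q1
actual = 13_000_000     # approximate actual Q2 shipments

excess = actual - expected          # units beyond seasonal demand
mining_share = excess / actual      # fraction plausibly attributable to mining

print(f"excess units: {excess:,}")               # 4,000,000
print(f"share of Q2 sales: {mining_share:.0%}")  # ~31%
```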
eva02langley - Friday, August 25, 2017 - link
I don't know which numbers you are looking at. AMD stole 2% share from Nvidia, and that's without taking into account the frenzy in the used card market. The 290, 290X, 390, 390X, RX 470, RX 480, RX 570, and RX 580 were all over the place at crazy prices, while Nvidia had only the GTX 1060 and GTX 1070 on offer.
Yojimbo - Friday, August 25, 2017 - link
I'm looking at the numbers posted in this article. I think I explained the analysis reasonably well. The 1.9% gain in market share is incredibly small if you consider the percentage of cards that were apparently sold to cryptocurrency miners. Also note that AMD's 29.4% market share is still historically low, and even 1.4% below their market share of Q2 just one year ago. If you have a specific question, just ask.

As far as used cards go, they are really beside my point. However, elevated prices for used AMD cards indicate that AMD cards previously sold to gamers were being siphoned off from gaming to crypto miners. During a cryptomining boom that accounted for a roughly 40% rise over seasonal expectations in a normally slow quarter for gaming, and with used card prices elevated because of that boom, do you really think more gamers bought used cards from cryptominers than cryptominers bought from gamers? That's completely illogical, especially when, as you yourself pointed out, GTX 1060 and GTX 1070 gaming alternatives were for sale at much less elevated prices. Consider further that the 1060 and 1070 were already outselling AMD's offerings even when the AMD cards were selling for much less, before the cryptomining craze.
The logic of my conclusion is already sound, but its accuracy is verified by the latest Steam survey. AMD's 500 series GPUs have yet to show up at all on the survey, and the percentage of 480 and 470 cards has actually declined steadily on the survey from a high point in April. The former shows that AMD has been selling few cards to gamers (the second alternative of my first post), and the latter shows that the used card market has been siphoning AMD cards away from gamers.
jjj - Friday, August 25, 2017 - link
Do note that it takes three months to react and increase supply on an advanced node, so Q2 just drained channel inventory; there was no way for AMD and Nvidia to increase supply. That's why share shifted little.
Yojimbo - Friday, August 25, 2017 - link
So what?
jjj - Friday, August 25, 2017 - link
And I am supposed to take you seriously now? Do the math before you act like a Trump.
The impact was late in the quarter, prices and shortages scaled differently for the two, and don't forget that gamers can be miners too.
Your theory does stand for Q3 and even Q4 if mining holds, but it's nonsense for Q2.
Yojimbo - Friday, August 25, 2017 - link
My question has nothing to do with "math". It has to do with the relevance of what you said... hence "so what?" You never indicated which part of my post you were referring to, or how your point was relevant to it.

As far as the "math", I did do it. It's written out in my post. It stands on its own, regardless of your assertions about Q2; i.e., even if you are correct that Q3 and Q4 will be more affected by cryptocurrency than Q2, it does not make any difference. What I wrote is not a "theory"; it is a statement based on the Q2 data. In Q2, AMD either sold remarkably few cards to gamers or had well under 50% of the cryptocurrency market. The Steam survey suggests that there may even have been a net outflow of current-generation AMD cards from the hands of gamers in the quarter, owing to the used card market. These conclusions are backed up by the common sense of who is and who is not willing to pay $375 for an RX 570.
Now quit acting like CNN and come into the world of facts and reality before you so stridently publicly declare things.
Samus - Sunday, August 27, 2017 - link
Every Microcenter I've walked into, people are buying AMD cards for mining. It's ridiculous this is making a comeback. If you aren't going to use it for gaming (at all), just buy an ASIC for $400 and it will outperform any VEGA.

Or just don't mine at all. The margins are so slim once you factor in the cost of investment, utility costs, time, and wallet and currency conversion fees. I've done this years ago; it just wasn't worth it, even working in a large team to mine collectively.
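As a rough illustration of how thin those margins can get, a back-of-the-envelope profitability check; every number below is a hypothetical placeholder in 2017-era ballpark, not measured data:

```python
# Back-of-the-envelope GPU mining profitability. All figures are
# hypothetical placeholders, roughly in 2017 territory.
hashrate_mh = 30.0         # MH/s, an RX 580-class card on Ethash (assumed)
power_w = 120.0            # wall power draw while mining (assumed)
eth_per_mh_day = 0.00045   # network payout per MH/s per day (varies constantly)
eth_price_usd = 300.0      # USD per ETH (illustrative)
electricity_kwh = 0.12     # USD per kWh (illustrative)

revenue = hashrate_mh * eth_per_mh_day * eth_price_usd   # USD per day
power_cost = power_w / 1000.0 * 24.0 * electricity_kwh   # USD per day

print(f"revenue ${revenue:.2f}/day")
print(f"power   ${power_cost:.2f}/day")
print(f"net     ${revenue - power_cost:.2f}/day, before pool/exchange fees "
      f"and before recouping the card itself")
```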
wumpus - Thursday, August 31, 2017 - link
ASICs aren't ready for Ethereum (one of the big places the action is, and what they are buying the cards for), and might never be. The real kick is memory bandwidth, and that is a lot trickier to get right than banging out shifts and lookups for SHA-256. FPGA boards might have some pretty good memory interfaces, but they aren't likely to be sufficiently cost effective to beat an off-the-shelf GPU.

Also, if anyone is making a profit off of mining VEGA, it is because of that HBM2 interface. I doubt it was intended for miners (Bitcoin was likely the miners' darling when VEGA's interface was frozen), but that is how it worked out.
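To make the memory-bandwidth point concrete, here is a toy sketch of an Ethash-style memory-hard mix. This is purely illustrative, not the real algorithm: the actual Ethereum DAG was around 2 GB in 2017 and the real mixing function differs. The point is that every round does a dependent pseudo-random read from a large table, so memory, not ALU work, is the bottleneck:

```python
# Toy memory-hard hash in the spirit of Ethash (NOT the real algorithm).
# Each round performs a dependent, pseudo-random read into a large table,
# so throughput is bound by memory bandwidth rather than raw ALU speed,
# unlike SHA-256's shifts and lookups.
import hashlib
import random

DAG_WORDS = 1 << 20  # toy table; the real 2017 DAG held far more entries
random.seed(0)
dag = [random.getrandbits(64) for _ in range(DAG_WORDS)]

def toy_mix(nonce: int, rounds: int = 64) -> int:
    h = int.from_bytes(hashlib.sha3_256(nonce.to_bytes(8, "little")).digest(), "little")
    for _ in range(rounds):
        h ^= dag[h % DAG_WORDS]              # dependent random memory read
        h = (h * 0x100000001B3) % (1 << 64)  # cheap arithmetic mix
    return h

print(hex(toy_mix(nonce=12345)))
```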
jjj - Friday, August 25, 2017 - link
Q2 is April-June, and Ethereum went nuts in June, so the impact on the quarter was limited.
Yojimbo - Friday, August 25, 2017 - link
You are directly contradicting JPR with that claim.

And how else do you explain the unseasonably strong GPU demand? 31% Q/Q gain in shipments in Q2. Historically there is about a 10% decline in Q2 compared with Q1.
Furthermore, NVIDIA disclosed about $200M in cryptocurrency sales to large miners in their Q2 earnings. They reported this income in their "OEM & IP" segment, from which the revenue can be estimated. That doesn't include small miners buying retail cards for mining. Both AMD and NVIDIA acknowledged a strong cryptocurrency mining impact in Q2 on their conference calls.
CaedenV - Friday, August 25, 2017 - link
I am very curious to see how commercial blockchain usage will change all of this. Bitcoin and Ethereum are interesting, but companies like large banks and MS are looking at blockchain tech for lots of other uses. I wonder if companies will need their own in-house servers to run these blockchains, or if there will be block rewards to the general public for doing the mining for them.
WinterCharm - Friday, August 25, 2017 - link
Blockchains depend on everyone having a copy of the ledger. If it were internal only, it would not be a very trustworthy blockchain.

This is why Ether is so important. It lets others leverage the massive public blockchain, making it insanely difficult to alter what was already written in the blockchain.
Yojimbo - Friday, August 25, 2017 - link
I think companies intend to offer blockchain services. They set up in places with cheap electricity and can potentially sell various security products and services along with the blockchain services.
nfriedly - Friday, August 25, 2017 - link
> JPR does not see a repeat of market cannibalization by used mining card

Ethereum is supposed to eventually switch to a proof-of-stake system, which wouldn't need GPUs for mining. So I think that crash will happen eventually, perhaps even this year (although that is becoming increasingly unlikely.)
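For context on why a proof-of-stake switch would sideline GPUs, a toy stake-weighted validator pick; this is illustrative only, not Ethereum's actual protocol, and the names and stake amounts are made up:

```python
# Toy stake-weighted validator selection (NOT Ethereum's real PoS design;
# names and stake amounts are hypothetical). Block rights scale with stake,
# not hash throughput, so mining hardware confers no advantage.
import random

stakes = {"alice": 32.0, "bob": 64.0, "carol": 16.0}

def pick_validator(stakes: dict, seed: int) -> str:
    rng = random.Random(seed)
    names = list(stakes)
    return rng.choices(names, weights=[stakes[n] for n in names], k=1)[0]

print(pick_validator(stakes, seed=2017))
```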
Yojimbo - Friday, August 25, 2017 - link
From what I've heard, that keeps getting pushed back. Some people seem to believe it will never actually happen. Any idea when they currently plan to actually make the switch?
vladx - Saturday, August 26, 2017 - link
It's been delayed to the second half of 2018; a difficulty-bomb delay is planned to that effect until then.
BrokenCrayons - Friday, August 25, 2017 - link
The market share graph merely proves the point that, in order to sell a large number of games, developers should be working toward ensuring said games run very well on Intel's iGPUs. Targeting dGPUs means eliminating 2/3rds of potential sales or more depending on what generation of dGPU we're talking about sticking into the system requirements.
FreckledTrout - Friday, August 25, 2017 - link
Not really, since a large portion of iGPU sales are for machines that would never play a game. You would need to use something game-centric like Steam stats to draw that conclusion. Hell, even some server chips have iGPUs nowadays.
Yojimbo - Friday, August 25, 2017 - link
Hmm, no I don't think it shows that at all. One must consider who the potential customers for the games are. What percentage of people with only an iGPU are looking to play newly released games? How many of those systems sold with only an iGPU are sitting in a cubicle somewhere, anyway? You can't count all iGPUs as "potential sales".

Additionally, almost all computers come with an iGPU. According to JPR the attach rate for GPUs this past quarter was 146%. So a good number of those iGPUs you are counting are actually paired with a dGPU.
If we look at Steam's hardware survey and look at the percentage of all users with DX 12 graphics who are using Intel iGPUs it is 6.62%. No AMD iGPU even shows up on the list because they've had such a small market share. The number of regular Steam users with DX 12 graphics capability, an AMD CPU, and no dGPU is minuscule. Btw, DX12 graphics accounted for 78% of all respondents in July. If we include all GPUs capable of DX11 or above graphics, then iGPUs account for 12% of those. To restate it for clarity, 88% of Steam survey respondents who had a DX11-capable GPU were using a dGPU.
So, the percentage of people looking to buy a newly released game (potential sales) who only have an iGPU is far smaller than you are claiming. It's probably less than the 12% of Steam survey respondents with a DX11 capable GPU that were using an iGPU.
Yojimbo - Friday, August 25, 2017 - link
I want to clarify something I said. The percentage of people looking to buy SOME newly released game who only have an iGPU may be greater than 12%. But the percentage of total games sales by people with only an iGPU is probably less than 12%. Those with dGPUs are buying more games (and probably paying more for their games: buying higher priced games and buying them as soon as they come out instead of waiting for the price to fall).

In any case, a demand for graphically-demanding games does exist. If companies started making their games to cater to less demanding hardware it would open up the market for other companies to swoop in and cater to the users willing to pay more money for better-looking games. This is supported by the fact that gamers have actually been increasingly willing to spend more money for GPUs as time goes on, not less.
someonesomewherelse - Saturday, October 14, 2017 - link
Or hope that the Ryzen APUs won't be too shitty, since AMD actually has relatively decent iGPUs and now, with Ryzen, decent CPUs too. As long as neither part is cut down too much, they should make OK-ish gaming chips for undemanding gamers.
jjj - Friday, August 25, 2017 - link
Was looking at prices today; Nvidia seems to have caught up with demand, but AMD is not even close.

This means that AMD will be losing massive share in gaming, and they need to push prices down even if that means that everything is OOS.
StrangerGuy - Monday, August 28, 2017 - link
AMD's only saving grace here is that clueless mining bandwagoners are still clearing out its GPU supply, even at prices that make zero sense, because they still believe the "AMD + mining = gud" and "NV + mining = bad" meme.
blppt - Tuesday, August 29, 2017 - link
That's all well and good for now, but AMD spends money endorsing game developers with their "Gaming Evolved" program, which is kind of pointless if nobody can buy your cards to game on.

Plus, eventually this mining craze will fall off, and if you piss off gamers into buying in-stock Nvidia cards, you hurt yourself that way as well. My guess is that AMD is having trouble manufacturing enough supply of the cards, not that they are intentionally doing anything to limit output right now.
JanW1 - Friday, August 25, 2017 - link
I'm surprised no one mentions the fact that GPU demand was _down_ 30% in Q1 2017 wrt Q4 2016, whereas shipments tend to trend flat between Q4 and Q1. So this really looks like mostly shipments were delayed from Q1 to Q2 2017.
Yojimbo - Friday, August 25, 2017 - link
Shipments do not tend to be flat between Q4 and Q1. They tend to be down close to 20%. They were down a bit more than what is seasonal. There was a glut of graphics cards in the channel during Q1, so supply was fine. Shipments weren't delayed. Therefore any "delay" would have to be on the side of demand. What would be your explanation for such an unusual "delay" to have taken place this particular Q1? It flies in the face of the notion of seasonal normality.
JanW1 - Friday, August 25, 2017 - link
Where did you get the -20% from? Q4 to Q1 was never below -10% over the past 10 years prior to 2017.

I'm just looking at the data in the article. First image:
"AIB growth from Q4 to Q1, 10 ave: -2.6%" (which is nowhere near the -20% you claim)
"AIB growth from Q1 to Q2, 10 ave: -9.8%"
That is the 10-year average baseline. Note that this is a cumulative -12.4% from Q4 to Q2.
Now the article discusses the +31% from Q2 2017, comparing it to the -9.8% average - a huge outlier. We're talking about a sudden +40% in graphics card production and sales wrt what was expected. Thing is, the -30% from Q4 to Q1 in 2017 is also a huge outlier, in the opposite direction. From Q4 2016 to Q2 2017, we get a cumulative +1%. Now all of a sudden, we are just talking about a +13% over what was expected. Not the same thing.
The article seems to argue that there was an unexplained huge drop in sales in Q1 and a completely unrelated insane cryptomining boom in Q2 for a single cryptocurrency (to support this, at least some parallel chart of ether supply or similar info would be needed).
I'm just saying that without more info, it would seem more plausible to assume that the Q1 and Q2 outliers are related. That could be either a simple technical reason of some sales being attributed to Q2 rather than Q1 in the data, delayed purchase decisions, whatnot.
Yojimbo - Friday, August 25, 2017 - link
I got it from something I remembered an analyst saying in a conference call; I believe 18% was the number. But that was about revenue, and specifically about NVIDIA's Q1 revenue. NVIDIA's Q1 is February, March, and April. JPR might be using calendar-year Q1, which is January, February, and March (not sure). OK, forget about that. I shouldn't have said it; my mistake. Assume a 3% decline from Q4 to Q1 on average, as in the chart.

There was a 30% decline this past Q4 to Q1. But it wasn't from a lack of supply. It was from a lack of demand. It's not reasonable to just assume that the Q1 and Q2 outliers are related. You need a reason. For what reason would large numbers of people be putting off purchases from Q1 to Q2? Why should it be different from seasonal norms? Anxiety about the economy? I don't think there was much. Expectation of a new product coming into the market? The only product thought to be coming was Vega. In Q1 people thought Vega might come in Q2. Then in Q2 people knew Vega would be coming in Q2 and Q3. Why would people who are waiting for new products purchase old products once they were sure a new product really was coming very soon? That doesn't make sense. Note that Vega Frontier Edition reviews didn't come out until Q2 was over.
Then consider that we do know why Q2 demand was strong. AMD and NVIDIA both said that cryptocurrency mining had a big impact in Q2. NVIDIA showed about $200M of revenue from sales to large crypto mining operations, which they listed in their "OEM & IP" segment. That's independent of seasonal norms; it's direct market information. And Q1 was not weak because of a lack of cryptocurrency demand, since strong cryptocurrency demand is not the norm. So there is good reason to believe Q1's weakness and Q2's strength are unrelated.
It's a good guess that there would still have been weakness in discrete desktop graphics card sales in Q2 without cryptocurrency sales, but we'll never know for sure. The idea that Q2 demand was strongly affected by cryptocurrency mining, however, is known from other sources; it's not just inferred from total unit shipments.
Nate Oh - Friday, August 25, 2017 - link
JPR argues for the insane cryptomining boom in Q2, which is what the article cites. For both Q1 (https://jonpeddie.com/press-releases/details/add-i... and Q2 2017 AIB reports, JPR does not go beyond saying that the -30% is "seasonally understandable," amidst a generally declining PC market. In the general graphics Q2 report (https://jonpeddie.com/press-releases/details/moder... JPR describes the trend as a return to seasonality, rather than a drastic outlier. What JPR does imply is that the post-recession cycle and the tablet incursion may be factors in why seasonality has not been normal.

We do not have access to the full paid JPR market reports, but it is highly unlikely that there is a hidden reason that JPR would not mention in the press releases and public reports. On the face of it, JPR has previously signaled the Q4 to Q1 drop is for mundane and/or seasonal reasons. And JPR in the Q2 2017 AIB report does not connect the Q1 to Q2 rise and the Q4 to Q1 drop together at all. Because they have not stated or implied that the two outliers are related, this makes it difficult for the article to assume so without data, especially in light of a report that is explicitly data-driven.
Nate Oh - Friday, August 25, 2017 - link
Apologies, the links were slightly borked by the automatic truncation.
https://jonpeddie.com/press-releases/details/add-i...
https://jonpeddie.com/press-releases/details/moder...
JanW1 - Saturday, August 26, 2017 - link
Sorry, still minor snag in my numbers. Should be:
"To be compared with the seasonal -12.4%." So we get +4.1% wrt expectations from Q4 to Q2 in 2017 (+21.5% in 2009).
JanW1 - Saturday, August 26, 2017 - link
Oh and this comment should be in response to my other one below.
EDIT BUTTON PUHLEEEAAZZE!
JanW1 - Saturday, August 26, 2017 - link
I'm not a graphics card market analyst, nor an economist. But naively looking at this from a supply-and-demand perspective, sales seem to be bound to go up after the huge drop in Q1. Part of it may be ramping up production to build stock prior to the launches of 1080Ti in March, Titan Xp in April and GT1030 in May, absorbing some production capacity in Q1 for cards sold in Q2. Another part may be, as jjj noted earlier, that supply for the other cards in NVidias portfolio, which are much higher volume products, takes months to adjust. If demand was down in Q1, production will not follow immediately. So basically, by the end of Q1 the supply channel was likely sitting on a stockpile of unsold graphics cards (just released, soon to be released and older models). Should they throw them in the bin? These cards are bound to be sold one way or another, mechanically increasing sales in the following quarters.

On the numbers, my math was wrong. At these fractions, -30% followed by +31% is not +1%, but rather 100%*(0.7*1.31-1)=-8.3%. To be compared with the seasonal -9.8%. That's nothing. Zero, zilch, nada. Look at 2009, where sales increased by 100%*(1.07*1.02-1)=9.1%, up 19% wrt the seasonal average. If there is anything historically unprecedented here, it's the drop in Q1 2017, not the return to seasonality in Q2.
I'm just saying that the way the numbers are portrayed in the article, it sounds like we are looking at a 40% increase in total demand wrt expectations, whereas we are really looking at +1.5%. The latter sounds less sexy but more realistic.
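A quick sketch of the compounding arithmetic this correction relies on; the inputs are the approximate quarter-over-quarter changes quoted in this thread:

```python
# Sequential percentage changes compound multiplicatively; they don't add.
# Inputs are the approximate Q/Q changes quoted in this thread.
def compound(*changes):
    """Cumulative fractional change from a sequence of fractional changes."""
    total = 1.0
    for c in changes:
        total *= 1.0 + c
    return total - 1.0

print(f"Q4'16 -> Q2'17: {compound(-0.30, 0.31):+.1%}")    # -8.3%
print(f"2009 Q4 -> Q2:  {compound(0.07, 0.02):+.1%}")     # +9.1%
print(f"10-yr seasonal: {compound(-0.026, -0.098):+.1%}") # ~-12.1% (adding the averages gives the thread's -12.4%)
```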
Yojimbo - Saturday, August 26, 2017 - link
"I'm not a graphics card market analyst, nor an economist."Neither am I.
"But naively looking at this from a supply-and-demand perspective, sales seem to be bound to go up after the huge drop in Q1."
In certain contexts that's true. Certainly if supply were an issue and there was pent-up demand, that would be very reasonable. In certain contexts buying decisions are based on some external timing, like funding availability: purchases are going to be made, it's just a matter of when. In a case like that, strong demand would be expected to follow weak demand. But from what I have seen in the consumer graphics card market, that doesn't tend to happen. As far as I know, that doesn't fit this case. Consumers in general are affected by the economy and anxiety about the economy. In terms of graphics cards, consumers are further affected by new game releases. I don't follow game releases that closely, but I don't think there was an unusual influx of popular and graphics-demanding titles in Q2 this year, nor an unusual dearth in Q1.
"Part of it may be ramping up production to build stock prior to the launches of 1080Ti in March, Titan Xp in April and GT1030 in May, absorbing some production capacity in Q1 for cards sold in Q2."
Again, supply was not the issue in Q1. There was plenty of supply. Lack of Q1 sales was entirely due to lack of demand. And excess supply doesn't create more demand, so building up supply in Q1 won't increase sales in Q2.
"Another part may be, as jjj noted earlier, that supply for the other cards in NVidias portfolio, which are much higher volume products, takes months to adjust. If demand was down in Q1, production will not follow immediately."
This is an argument for lower sales, not greater sales. If they cut production in Q1 because of weakened demand and it takes months to restart production to eventually get new cards out to retailers, then when demand unexpectedly picked up in Q2 they would not have been able to respond to it, reducing sales, not increasing them. It's possible that this happened, but it does nothing to explain either the low Q1 sales or the high Q2 sales, or any link between Q1 and Q2 sales.
"So basically, by the end of Q1 the supply channel was likely sitting on a stockpile of unsold graphics cards (just released, soon to be released and older models). Should they throw them in the bin? These cards are bound to be sold one way or another, mechanically increasing sales in the following quarters."
The cards aren't "bound to be sold". They will sell only if the demand exists. Sure retailers could cut prices to try to sell the cards through, as there would be increased demand at lower prices, but that isn't what happened. Prices actually went up, because demand was stronger than expected and outstripped supply. The supply and the demand are independent factors that both must meet in order to make a sale. You keep trying to explain everything entirely with supply.
"I'm just saying that the way the numbers are portrayed in the article, it sounds like we are looking at a 40% increase in total demand wrt expectations, whereas we are really looking at +1.5%. The latter sounds less sexy but more realistic."
And yet you argued against this interpretation yourself above, when you said "Another part may be, as jjj noted earlier, that supply for the other cards in NVidias portfolio, which are much higher volume products, takes months to adjust. If demand was down in Q1, production will not follow immediately. So basically, by the end of Q1 the supply channel was likely sitting on a stockpile of unsold graphics cards." The only reason they would cut production after weak demand in Q1 is if they expected Q2 to continue to be weak. If the natural expectation were a rebound, as you are suggesting it should be, they would expect the rebound and continue with normal production.

You and I may not be graphics card market analysts, but you can bet that those making production decisions at AMD and NVIDIA are. What you are doing is tacitly assuming dependence of Q2's strength on Q1's weakness. It doesn't work that way. From AMD's and NVIDIA's decisions and JPR's report, we can see that the actual experts don't believe it works that way. When demand was weak in Q1 they fully expected weakness to be maintained into Q2, or at least they didn't expect Q2 demand to "make up for" the weak Q1 demand. And this isn't just guesswork on their part. They gather a lot of data, and they analyze it both quantitatively and with their own judgment.
BurntMyBacon - Monday, August 28, 2017 - link
@Yojimbo
Is it possible that during the Ethereum ramp-up, demand for cheaper second-hand and older-generation graphics cards was higher than demand for current-generation cards (for miners), and somehow contributed to stalling demand for new cards in Q1? I realize that gamers don't really care what miners are doing (until it affects their prices) and that the sale of a second-hand card is often followed by the purchase of a new card, which could actually cause a temporary spike in sales due to lower total cost of entry. However, I've already read many public statements from people who claim to have sold their graphics cards to cash in on the boom and decided to make do with whatever else they had available until some later date (1080Ti, summer sales, Vega, Volta, etc.). It is conceivable that some of the drop in shipments may have come from this type of second hand sell and wait scenario.

If AMD / nVidia then decided to drop production for Q2 to reflect weaker Q1 sales, as long as they didn't cut too hard (which I doubt they would have with the largest predicted increase in shipments typically slated for Q3), they would end with a scenario of underestimating demand for Q2, but still being able to pull from their stockpile to offset some(most?) of that. Keep in mind that a 30% drop in shipments followed by a 31% increase in shipments does not bring you back to parity: (100(%)*0.7)*1.31=91.7%. While the trend is certainly better than it has been, the Q4 - Q2 trend is looking pretty similar to 2012 (-8.3% vs -8.9%). Your thoughts?
Yojimbo - Monday, August 28, 2017 - link
""It is conceivable that some of the drop in shipments may have come from this type of second hand sell and wait scenario."By "some" what do you mean? We are talking about swings of millions of units. I find it hard to believe that millions of gamers sold their graphics cards to miners in Q1 and then waited until Q2 to replace them. The Ethereum craze didn't start until Q2, anyway. Ether wasn't regularly above $50 until the end of April and wasn't regularly above $100 until the middle of May. I don't think Ethereum was much of an issue at all in Q1. If it were, Q1 new graphics cards sales wouldn't have been weak. Most mining is done by, and most mining cards are owned by, big mining farms that purchase in volume, not off ebay. When you take into account the size of the GPU mining market and you note that it accounts for millions of units, you must note that the bulk of those units are from large mining operations in the Far East and Eastern Europe. Once you throw those out, because they aren't getting supplied by the used card market, you no longer have that type of volume you need, even during the height of the craze in late Q2, let alone Q1 before the craze kicked off. Miners weren't buying a million+ used older generation cards in Q1. Furthermore, the number of those who would have sold their cards but delayed their purchase of a replacement into Q2 would have been well under the total.
"If AMD / nVidia then decided to drop production for Q2 to reflect weaker Q1 sales, as long as they didn't cut too hard (which I doubt they would have with the largest predicted increase in shipments typically slated for Q3), they would end with a scenario of underestimating demand for Q2, but still being able to pull from their stockpile to offset some(most?) of that."
This may be entirely true. But supply is not the issue here. The demand was low in Q1 and high in Q2. That's the issue.
"While the trend is certainly better than it has been, the Q4 - Q2 trend is looking pretty similar to 2012 (-8.3% vs -8.9%)."
If we zoom out far enough so we no longer have the resolution to be able to see the issue then the issue just magically disappears :P Sorry for the sarcasm.
BurntMyBacon - Thursday, August 31, 2017 - link
@Yojimbo: "I find it hard to believe that millions of gamers sold their graphics cards to miners in Q1 and then waited until Q2 to replace them."I was also considering (though I didn't state) a scenario where normal people who had little knowledge or interest in mining just happened to be selling their old card (there is a very large second hand market) and held off getting a new card when the card they wanted started shifting up in price. However:
@Yojimbo: "The Ethereum craze didn't start until Q2, anyway. Ether wasn't regularly above $50 until the end of April and wasn't regularly above $100 until the middle of May. I don't think Ethereum was much of an issue at all in Q1."
This kinda puts a damper on the whole theory. It does seem unlikely, given this information, that mining affected much in Q1, leaving the abnormal drop in Q1 sales a mystery unsolved.
@Yojimbo: "This may be entirely true. But supply is not the issue here. The demand was low in Q1 and high in Q2. That's the issue."
I think I'm understanding your perspective, but this statement bugs me. You can't truly separate supply and demand. If you have 50% higher than average demand but you produced 200% more units, you still have an oversupply despite the increased demand. Supply will continue to be considered high until inventory starts to clear. Likewise, if you produce 30% fewer units but your demand only decreased 5%, you still have an under-supply despite the decreased demand. High supply <the state of having more inventory than will sell at the set price> is always relative to demand <the amount of an item that will sell at a set price>, or vice versa, depending on the perspective you want to look at it from (a small sketch follows at the end of this comment).
@Yojimbo: "If we zoom out far enough so we no longer have the resolution to be able to see the issue then the issue just magically disappears :P Sorry for the sarcasm."
Sarcasm doesn't really bug me. It was just a theory based on limited information, and I did ask for your thoughts on it (thank you for that). That said, it really doesn't fit well in the context of what I stated. While it is true that zooming out does obscure details like swings in sales and short-term market trends, it does give a better perspective of the big picture. Zooming in too far without a consistent point of reference, as that first chart does, can also give the wrong picture. The shipment numbers in the first chart are not absolute numbers; they are relative to the previous quarter's shipments (they should have included Q3 and Q4, or just stuck to absolute numbers instead of relative percentages). The AIB line graph illustrates my point. While there is a massive percentage upswing in shipments for Q2'17, the preceding downswing in Q1'17 means the actual numbers shipped were still lower than in Q4'16 or even Q3'16. Again, I'm not necessarily disagreeing with your overall assessment, just trying to keep things in perspective.
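The relative-supply sketch referenced earlier in this comment, using the same hypothetical numbers from that paragraph:

```python
# Toy restatement: over/under-supply is defined relative to demand, not in
# absolute units. The scenarios mirror the hypotheticals in the comment above.
baseline = 100.0

scenarios = {
    "demand +50%, supply +200%": (baseline * 1.50, baseline * 3.00),
    "demand -5%,  supply -30%":  (baseline * 0.95, baseline * 0.70),
}
for name, (demand, supply) in scenarios.items():
    state = "oversupply" if supply > demand else "undersupply"
    print(f"{name}: supply/demand = {supply / demand:.2f} -> {state}")
```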
ABR - Saturday, August 26, 2017 - link
I agree; between the 30% drop the quarter before and the 31% increase now, there's basically no change at all. In other words, there's no content behind the story.
someonesomewherelse - Saturday, October 14, 2017 - link
Except for the fact that an RX 570 now costs ~400 EUR and isn't even in stock, while in Q1 it was 200-something EUR and in stock.
r3loaded - Friday, August 25, 2017 - link
Can't wait for the magic mathematical money miners to bugger off. The prices on GTX 1060s and RX 580s are ludicrous, all because criminals want another platform for ransomware and selling drugs online.
Yojimbo - Friday, August 25, 2017 - link
Large financial institutions are very interested in blockchain and cryptocurrency. It's a much bigger deal for the world than Destiny 2.