lmcd - Monday, January 1, 2018 - link
It's ironic that this "dream combo" part is finally coming out now that AMD's own CPU cores are good enough that the combo isn't as comparatively desirable.
Also, I really hope that Enduro support (if that's still the current name) is extended to Linux, or at least access to both devices. Would be cool to see a multiseat-ready, consumer-level, single-socket system.
nathanddrews - Tuesday, January 2, 2018 - link
In reality, it's still pretty exciting if this means official FreeSync 2 support for Intel CPUs (w/AMD graphics, that is).
Hinton - Tuesday, January 2, 2018 - link
What was it about FreeSync 2 that required an AMD CPU? I looked at AMD's page, but I couldn't see it in the wall of text.
Manch - Wednesday, January 3, 2018 - link
Nothing - hence why it's called FreeSync. NVIDIA won't support it because it would chip away at their G-Sync sales.
nathanddrews - Wednesday, January 3, 2018 - link
FreeSync and FreeSync 2 are AMD's proprietary implementations of VESA's Adaptive Sync and HDMI's upcoming Game Mode VRR, and they require an AMD GPU. It's "free" because it doesn't require any additional proprietary hardware (specifically, no G-Sync module in the display). NVIDIA could implement G-Sync in the same manner, as they have proven with laptop G-Sync using eDP.
e36Jeff - Sunday, January 7, 2018 - link
Freesync & Freesync 2 are not proprietary. It is true that AMD developed it, however it is open-source and royalty free. Any company can implement it, including nvidia.neblogai - Wednesday, January 3, 2018 - link
FreeSync or FreeSync 2 isn't guaranteed, and I'm not sure it's even possible. The issue should be similar to Adaptive Sync on NVIDIA Optimus: you have to have a path connecting the display directly to the dGPU. On these parts, displays can be expected to be connected to the Intel iGPU, which serves as the energy-efficient GPU and passes frames through to the display when the dGPU is in use - thus no FreeSync.
ImSpartacus - Friday, January 5, 2018 - link
That's a really good point. It probably won't support FreeSync for that reason.
Zingam - Friday, January 5, 2018 - link
Does it melt down nicely?
Nehemoth - Monday, January 1, 2018 - link
And now I just want to know how many MH/s I can get mining the most popular algorithms.
jjj - Monday, January 1, 2018 - link
So maybe Intel gets 24 CUs while AMD uses a fully enabled die with 28 CUs and 25-30% higher clocks, for a similar solution with roughly 50% higher FLOPs.
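A quick sanity check of that arithmetic - a minimal sketch using the speculated CU counts and clock uplift from the comment, not confirmed specs:

```python
# Back-of-the-envelope FLOPs ratio for the 28 CU vs 24 CU comparison.
# The CU counts and 25-30% clock uplift are speculation from the
# comment above, not confirmed specifications.
intel_cus, amd_cus = 24, 28

for clock_uplift in (1.25, 1.30):
    ratio = (amd_cus / intel_cus) * clock_uplift
    print(f"{clock_uplift:.2f}x clocks -> {ratio:.2f}x FLOPs")
# Prints ~1.46x and ~1.52x, i.e. "roughly 50% higher FLOPs" as claimed.
```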
MajGenRelativity - Tuesday, January 2, 2018 - link
It *could* be like that, but a fully enabled die with 25% higher clocks would consume more power.
shabby - Monday, January 1, 2018 - link
Wonder how this conversation went...
Intel: knock knock
Amd: umm yes?
Intel: can you build us a gpu that we can integrate onto our cpu?
Amd: umm no?
Intel: we have a dump truck full of money...
Amd: come in!
basroil - Tuesday, January 2, 2018 - link
It's probably easier than that: the CPU and GPU teams are likely completely separate with different budgets (which explains the process module differences, etc.), and Intel just chatted up the GPU head directly. AMD's GPU team (ATI) never competed with Intel, after all!
Hinton - Tuesday, January 2, 2018 - link
Intel tried to compete with AMD on the GPU front, and failed - at least for the market segment this is aimed at.
Manch - Wednesday, January 3, 2018 - link
"Intel just chatted up the GPU head directly"Yup, and made this deal
Then he quit
Now he works for Intel.....
Hinton - Tuesday, January 2, 2018 - link
I don't think it went like that. To many people, NVIDIA is a bigger name than AMD.
I don't know why AMD isn't making hay out of the fact that they're in the Xbox and PlayStation.
But the deal this time could very easily be "Intel Partners with AMD to Create the Best Combined GPU/CPU". Intel pushing this makes AMD look good.
HStewart - Wednesday, January 3, 2018 - link
Actually it continues from:
Amd: come in!
Intel: We have a temporary solution using Kaby-G
Amd: Oh why?
Intel: Raja - new discrete GPU coming in the future.
Amd: Oh :(
webdoctors - Monday, January 1, 2018 - link
Unless the pricing is reasonable, this'll be a DOA product, or one catering solely to OEMs. 70 W for an AMD video card is a tough barrier.
Mil0 - Tuesday, January 2, 2018 - link
Well, think of this as a Vega/Polaris Nano. My estimate is that this will get pretty close to RX 570 performance - or rather, the Intel part will probably land a little more in the direction of the 560, and AMD's own 28 CU part with a higher TDP will probably beat the 570.
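For context, a rough single-precision throughput comparison behind that RX 560/570 ballpark - the Kaby Lake-G clock below is an assumption, since only the 24 CU count had been reported:

```python
# GCN GPUs: 64 shaders per CU, 2 FLOPs (one FMA) per shader per clock.
# The ~1.1 GHz clock for the 24 CU part is an assumption; the RX 560
# and RX 570 figures are their reference boost clocks.
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

print(f"24 CU @ 1.1 GHz (assumed): {tflops(24, 1.1):.1f} TFLOPs")    # ~3.4
print(f"RX 560, 16 CU @ 1.275 GHz: {tflops(16, 1.275):.1f} TFLOPs")  # ~2.6
print(f"RX 570, 32 CU @ 1.244 GHz: {tflops(32, 1.244):.1f} TFLOPs")  # ~5.1
# On paper the 24 CU part lands between the 560 and 570, as estimated.
```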
Arnulf - Tuesday, January 2, 2018 - link
It's a product made specifically for Apple.
06GTOSC - Tuesday, January 2, 2018 - link
It's clearly not for budget notebooks. But the power, space, and cost savings versus a separate GPU card should make it worth it if it's priced right.
06GTOSC - Tuesday, January 2, 2018 - link
And it's definitely solely for OEMs. People building their own systems likely wouldn't go for it, unless they want something like this for an HTPC.
Cliff34 - Monday, January 1, 2018 - link
I thought the idea of this CPU was for laptops, so you can get longer battery life. How come they are putting this out for desktops? If someone wanted better graphics performance, couldn't they just get a discrete graphics card?
tyger11 - Monday, January 1, 2018 - link
Most computers sold don't have discrete graphics cards. This meets the needs of corporate users who need a little bit more graphics oomph, without having to go to the expense of adding a discrete card.
Old_Fogie_Late_Bloomer - Tuesday, January 2, 2018 - link
This is probably purpose-built for a Retina iMac.
tipoo - Monday, January 1, 2018 - link
Hmm, that's a pretty high target TDP. It could be dialed down, but I was hoping this would go into the next 15" rMBP, which would need more like 60 W total TDP.
Ian Cutress - Monday, January 1, 2018 - link
You can obviously declock/bin to get different power consumption.
tipoo - Monday, January 1, 2018 - link
True - Iris Pro was 84 W on the desktop while Apple got a 47 W TDP version, and they were nearly single-handedly the ones pushing for Iris Pro's creation. I suppose the desktop end of this could be a "may as well", while Apple asked for the laptop MCM nearly exclusively again. We'll see.
vFunct - Tuesday, January 2, 2018 - link
Maybe this is for a Mac Mini or cheaper iMac, and there's probably another version for laptops?
crashtech - Tuesday, January 2, 2018 - link
Teasing us with the probability of a socketed solution isn't very nice! Seems almost certain this will be BGA.
Hinton - Tuesday, January 2, 2018 - link
But you aren't interested in a socketed version?
Judas_g0at - Tuesday, January 2, 2018 - link
Has to be for a new Apple product.
Elstar - Tuesday, January 2, 2018 - link
While that is certainly possible, I really doubt it. Apple has been trending away from discrete GPUs in their product lineup for years. Remember: battery life and total product size are important too (much to the frustration of "power users").
mr_tawan - Tuesday, January 2, 2018 - link
With this level of TDP, I think that if Apple uses one of these, it would be in the iMac Pro (or even Mac Pro).
tipoo - Tuesday, January 2, 2018 - link
I don't think I agree with that claim. They brought dGPUs back to the base 15" rMBPs, and while the 13s are too small for them without compromises, they do pick the best-performing IGPs with eDRAM for those. Apple is one of the few to care about the baseline of graphics performance, and has for years.
The iMac 4K also went back from Iris Pro to a dedicated Radeon Pro when it could.
I think this is a shoo-in for the next iMac and 15" rMBP (at lower wattage).
Elstar - Tuesday, January 2, 2018 - link
Don't confuse the short-term product mix with the long-term trend. If Skylake/Kaby Lake had the L4 eDRAM option, I really doubt that Apple would have "brought back" the dGPU option.
tipoo - Tuesday, January 2, 2018 - link
There's no word of 8th-gen processors with Iris Pro either - ergo, a shoo-in for Apple. A 60 W TDP-down version of this sounds perfect for the rMBP, and the 100 W version for the iMac 4K.
Elstar - Tuesday, January 2, 2018 - link
In the short term, I doubt it. The consumer iMac and MacBook Pros were just revised six months ago and Apple tends to wait at least a year between product updates. Also, unlike a laptop, Apple isn't desperate for space in the iMac, so I don't see the appeal there. Either have a dGPU or don't, but optimizing the two dies onto the same package doesn't make any sense for the iMac.
zepi - Tuesday, January 2, 2018 - link
Apple is forced to use dGPUs in iMacs because Intel doesn't offer fast enough iGPUs. And Intel is moving away from eDRAM because it doesn't scale very well with smaller gate pitches, and it can't easily be integrated with the main chip because it needs a different process than the main CPU.
zepi - Tuesday, January 2, 2018 - link
And also because Intel knows that the GPUs they make themselves are so horribly inefficient in terms of perf/W.
They'd be eager to compete if they knew how, but obviously they are unwilling to spend the money and effort to really strike against NVIDIA/AMD in the GPU space in the short term.
Zingam - Tuesday, January 2, 2018 - link
What is this travesty?
Zingam - Tuesday, January 2, 2018 - link
Well, it might make sense in the case of a NUC or Mac mini, but only if they cut out the iGPU!
Zingam - Tuesday, January 2, 2018 - link
How about no iGPU, more cores, and more CUs, like the XBO X? A TDP that makes it suitable for mini form factors - something that can perform like a desktop but is miniature and as silent as possible, portable but with no screen, and cheaper than laptops.
Current mini PCs are horribly overpriced and underperforming. The XBO X shows that it doesn't need to be like that.
I can't imagine why most people would rather put a huge tower on their desk than a NUC-sized PC.
mr_tawan - Tuesday, January 2, 2018 - link
Having a mid-tower on the desk gives a 'professional' vibe to the overall office, I guess.
HStewart - Wednesday, January 3, 2018 - link
Actually, a lot of professional offices have gone mobile, since laptops can dock with multiple monitors and, when needed, be carried into a board room or onto a plane.
In my case I am using a ThinkPad 530.
HStewart - Wednesday, January 3, 2018 - link
I disagree about the Xbox One X - maybe it is good for gaming, but it isn't upgradable at all and is aimed primarily at one purpose: gaming. It is good for another purpose that so far is not really cost-efficient in the PC world, though, and that is UHD Blu-ray playback.
If you're primarily interested in UHD Blu-ray for home theater, the under-$200 Xbox One S still steals the show. I have one, but I've heard it's still not as good as a dedicated UHD Blu-ray player. The only reason I got it - and this may be wrong - is that I'd hope the player gets upgraded more over time, since it is mostly software.
jrs77 - Tuesday, January 2, 2018 - link
A shame that this won't become available as a desktop part. This would've been a nice upgrade to my currently used i7-5775C. And the upcoming Raven Ridge APUs actually look to be worse.
silverblue - Tuesday, January 2, 2018 - link
I don't think they're competing products, so no issue.
iwod - Tuesday, January 2, 2018 - link
Maybe it is not Apple after all?
At first, without any details, we thought this was a mobile part designed to simplify the dual-fan (CPU + GPU) cooling system.
Now that it is a desktop part, I am not sure where this fits in. The iMac doesn't need this space saving, and Apple could fairly easily take the dual cooling system designed for the iMac Pro and put it into the iMac.
It doesn't fit the Mac mini, since Apple doesn't even use high-TDP parts in that range.
So it is likely a new iMac that, in typical Apple fashion, is even thinner (God I hope not), or a new Mac mini that, in typical Apple fashion, is even less upgradable than the previous one.
Or maybe it is not Apple at all.
tipoo - Tuesday, January 2, 2018 - link
Iris Pro was also a 'desktop part' the way Intel sold it, but Apple still used a 47 W version where the desktops were 84 W, and was almost single-handedly the one asking for it.
I can see this being similar: put a 50 W version in Apple's laptops, while Intel sells whatever else it can.
HStewart - Wednesday, January 3, 2018 - link
My guess is the following:
a new iMac
new MacBook Pros, or possibly even the MacBook Air
a new NUC aimed at gaming
Possibly other vendors too, and maybe even one day a different dGPU - but this will likely evolve into Intel's own dGPU designed by Raja.
It is odd that this thing looks like the Celeron motherboard in my Lenovo 100S, which is horrible to upgrade, especially the memory:
http://www.ebay.com/itm/like/222752096033
timbotim - Tuesday, January 2, 2018 - link
Tweezing? On a tech site? Highlighting, if it has to be aesthetics-related - but simply, why not "noticing"?
paul sss - Tuesday, January 2, 2018 - link
Or for a gaming machine, think lottery / gambling.
Philybeef - Tuesday, January 2, 2018 - link
My guess is that the posting with the other processors is a mistake and that this is a custom package for the next generation of Mac computers; Apple is probably going with an older CPU architecture to save money.
Elstar - Wednesday, January 3, 2018 - link
Saving money isn't everything. When you're as big as Apple, the ability to supply parts in volume starts to matter more than whether the parts are the "latest tech".
TinyAudio - Tuesday, January 2, 2018 - link
Hades Canyon, the replacement for the Skull Canyon NUC?
HStewart - Wednesday, January 3, 2018 - link
Yes, I heard this arrangement is primarily for a gaming NUC - likely also Apple products.
OrphanageExplosion - Tuesday, January 2, 2018 - link
"According to some other media..." <- how about crediting the other media and providing links so we can check out the source? Standard journalism, right?Cryio - Tuesday, January 2, 2018 - link
An Intel CPU has 24 CUs while AMD's own APUs have 12 CUs? Wow.
Manch - Wednesday, January 3, 2018 - link
Unconfirmed, and diff TDP, diff product target.
Mr. Fox - Tuesday, January 2, 2018 - link
Wow, looks like a real abortion. Not sure why anyone in their right mind would do something idiotic like this. BGA filth on steroids, in search of suckers that like to waste money on disposable gonad-free tech garbage.
06GTOSC - Tuesday, January 2, 2018 - link
Will the onboard AMD GPU have its own memory? Or is it pulling from system memory like the onboard Intel GPU?
tipoo - Tuesday, January 2, 2018 - link
It will have HBM2.
fasterquieter - Tuesday, January 2, 2018 - link
So these have both integrated and package graphics? Does that mean the integrated graphics just sit idle? Seems wasteful.
Kwarkon - Tuesday, January 2, 2018 - link
They would not write about it if it was not on purpose. Simply put, you can either go for performance (pGPU) or battery life (iGPU).
tipoo - Wednesday, January 3, 2018 - link
Same as every existing dGPU system outside of a Xeon. The IGP will still be used for battery savings in the laptop version.
HStewart - Wednesday, January 3, 2018 - link
In low-power situations, running on just the integrated graphics is more efficient than running on the discrete GPU, which even in this package is still external to the CPU.
Besides that, it is probably the same basic Intel core, with just the AMD GPU added to the package. Who knows, we may have NVIDIA GPUs in a similar package one day.
Unless you are doing serious graphics work, in games and such, the AMD GPU sits idle. This is done to save power and battery life.
Pinn - Tuesday, January 2, 2018 - link
mylene loves ryan
kucingorange - Wednesday, January 3, 2018 - link
What's the deal with the 100 W TDP? With a Core i3 and an AMD R7 250 it's easy to hit 150 W peak power, so why not let the chip run at a 140 W limit - or even 220 W under heavy combined CPU and GPU load - while keeping around 95 W under normal heavy CPU use and 30 W at idle? I think that would be optimal.
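A quick check on that discrete-combo comparison - a sketch using nominal TDPs, since real peak draw can exceed them:

```python
# Nominal figures for the parts named in the comment. Actual peak
# draw can exceed TDP, so the quoted ~150 W peak is plausible.
i3_tdp = 54        # typical desktop Core i3 TDP, W
r7_250_power = 65  # Radeon R7 250 typical board power, W
package_tdp = 100  # reported TDP of the combined Intel+AMD package, W

combo = i3_tdp + r7_250_power
print(f"Discrete combo nominal: {combo} W")                 # 119 W
print(f"Headroom vs 100 W package: {combo - package_tdp} W")
```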
HStewart - Wednesday, January 3, 2018 - link
This is not a desktop CPU/GPU - it is likely specially designed for Apple systems, for lower power consumption.
HStewart - Wednesday, January 3, 2018 - link
To me this looks like a temporary solution, probably aimed at Apple for either the iMac or MacBook Pro. It is important to remember that at the same time this announcement came out, Raja came to Intel.
It looks like a mobile 8th-generation CPU included on a special board with an AMD GPU - I would not doubt that there could be an Intel/NVIDIA combination in the future.
This is probably a special OEM board, only for specific situations - one I heard of is that Intel has a new gaming NUC using this board (it is not actually a socketed CPU). It actually reminds me of the card inside my Lenovo IdeaPad 100 - basically an Atom-based processor.
It would have been nice to also list it with the 8th-generation mobile processors. If it is a desktop processor, then it is yet another socket configuration.
HStewart - Wednesday, January 3, 2018 - link
It looks like a higher-clocked 8650U at 3.1 GHz instead of 1.9, and it also has the integrated UHD 630 instead of the 620, which explains the higher watts. I wonder if the CPU can be combined with the AMD GPU in a notebook. For non-gamers, that could be a good combination.
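Rough support for the "explains the higher watts" point: dynamic power scales roughly with frequency times voltage squared, so even ignoring any voltage bump, the clock difference alone is substantial (the 3.1 vs 1.9 GHz base clocks are the comment's figures):

```python
# P_dynamic ~ C * f * V^2: at constant voltage, power still scales
# roughly linearly with clock. Base clocks are from the comment above.
f_new, f_old = 3.1, 1.9  # GHz
print(f"Base clock ratio: {f_new / f_old:.2f}x")  # ~1.63x
# Real power grows faster than this, since higher clocks usually
# also require higher voltage (the V^2 term).
```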
pavag - Wednesday, January 3, 2018 - link
4 cores? In 2018? Is it 2008 again?
Also, Intel has atrocious graphics driver support.
HStewart - Wednesday, January 3, 2018 - link
This appears to be based on lower-power mobile cores - it is not really a desktop processor like Coffee Lake. The AMD GPU takes up most of the power consumption - the exact split is currently unknown, but most likely more than 50% goes to the AMD GPU, probably between 65 and 75%, since the quad-core U parts this chip is based on are 15 W - except running at a higher frequency here.
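A minimal sketch of that power-budget guess, assuming the 15 W-class quad-core die runs somewhere around 25-35 W at the higher clocks (that CPU range is an assumption, not a published figure):

```python
# If the CPU slice of the 100 W package draws 25-35 W, the remainder
# (GPU plus HBM) takes 65-75% of the budget, matching the estimate.
package_tdp = 100  # W
for cpu_watts in (25, 30, 35):
    gpu_share = (package_tdp - cpu_watts) / package_tdp
    print(f"CPU at {cpu_watts} W -> GPU+HBM share: {gpu_share:.0%}")
```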
HStewart - Wednesday, January 3, 2018 - link
Also, this is a temporary solution for the gaming industry - until Raja comes up with the next generation of Intel GPUs.
dirtyvu - Thursday, January 4, 2018 - link
Is this going to be at all affordable? Because of all the cryptominers, GPUs and other PC parts are at all-time highs. A Vega 56 is going for around $600-700 when it should be around $400, because it's actually outpacing the GTX 1080 in terms of mining (it's almost as fast as the 1080 Ti for mining). A Vega 64 is going for $700-800+. RAM is so expensive now because of the miners. The NVIDIA 1070 is also good for mining and going for high amounts. For some reason, the regular 1080 is a dog for mining.
Zingam - Friday, January 5, 2018 - link
Does it melt down nicely?