AMD held a press briefing today on their upcoming 8000M graphics chips, which they are calling "second generation GCN architecture" parts. We'll have more on that in a moment, but while we were expecting (dreading) a full rebranding prior to the call, it appears we were at least partially mistaken: there will be at least one completely new GPU in the 8000M lineup. (If you want additional background material, you can see the previous generation high-end 7000M announcement from April 2012 for reference.)

I'm not going to get too far into the marketing aspects, as we've heard most of this information before: AMD has improved Enduro Technology, they're continuing to improve their drivers, and APP Acceleration supports a few more applications. There have been a few major titles released in the past couple of months with AMD Gaming Evolved branding (Far Cry 3 is arguably the most notable of the offerings, with Hitman: Absolution and Sleeping Dogs also scoring well amongst critics and users), and BioShock Infinite is one upcoming release I'm looking forward to playing.

Cutting straight to the chase, AMD has released limited information on the core specifications for some of their 8000M GPUs, but they coyly note that at least one more GPU announcement will be forthcoming in Q2 2013 (8900M by all appearances). Today is a soft launch of high-level details, with more architectural information and product details scheduled for January 7, 2013 at CES. In case you're wondering, AMD did not share codenames for the newly announced mobile GPUs beyond the overall family name of "Solar" for the mobile chips (replacing the outgoing "London" series), but we do know from other sources that the 384 core part is codenamed "Mars" while the larger 640 core part is codenamed "Neptune". Here are the details we have right now:

AMD Radeon HD 8500M, 8600M, 8700M, and 8800M Specifications

                    HD 8500M       HD 8600M       HD 8700M       HD 8800M
Stream Processors   384            384            384            640
Engine Clock        650MHz         775MHz         650-850MHz     650-700MHz
Memory Clock        2.0GHz/4.5GHz  2.0GHz/4.5GHz  2.0GHz/4.5GHz  4.5GHz
Memory Type         DDR3/GDDR5     DDR3/GDDR5     DDR3/GDDR5     GDDR5
FP32 GFLOPS         537            633            537-691        992
FP64 GFLOPS         33             39             33-42          62
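As a quick sanity check on those GFLOPS figures: GCN's peak FP32 rate is 2 FLOPs (one fused multiply-add) per stream processor per clock, with FP64 running at 1/16 that rate on these consumer parts. Here's a minimal sketch of the arithmetic; note that AMD's quoted numbers only work out if you assume clocks somewhat above the listed engine clocks, which is consistent with the boost clocks discussed further down.

```python
# Peak throughput for a GCN part: 2 FLOPs (one fused multiply-add)
# per stream processor per clock; FP64 runs at 1/16 the FP32 rate
# on these consumer chips.

def gcn_gflops(stream_processors, clock_mhz, fp64_rate=1.0 / 16):
    fp32 = 2 * stream_processors * clock_mhz / 1000.0
    return fp32, fp32 * fp64_rate

# 8800M at its listed 700MHz maximum engine clock comes up short...
fp32, fp64 = gcn_gflops(640, 700)
print(f"8800M @ 700MHz: {fp32:.0f} FP32 / {fp64:.0f} FP64 GFLOPS")  # 896 / 56

# ...while 775MHz reproduces AMD's quoted 992/62 exactly, suggesting
# the table figures are computed at (unlisted) boost clocks.
fp32, fp64 = gcn_gflops(640, 775)
print(f"8800M @ 775MHz: {fp32:.0f} FP32 / {fp64:.0f} FP64 GFLOPS")  # 992 / 62
```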

Obviously there are a lot of missing pieces right now, but what we immediately notice is that the core count on the 8500M/8600M/8700M means we're definitely looking at a new GPU. The only other time we've seen AMD do 384 cores is with Trinity, but that's a VLIW4 architecture, so that's not what we're seeing here. Given that the currently shipping Southern Islands chips ("London" on the mobile side) top out at 640 cores for Cape Verde, 1280 for Pitcairn, and 2048 for Tahiti, AMD has likely created a fourth SI derivative that drops down to two CU arrays, each with three CUs, as the sketch below illustrates. (You can read more about the GCN/SI architecture in our earlier GPU coverage.) Performance is something of a wildcard with the new 384 core parts, and the choice of DDR3/GDDR5 memory will also influence the final result. We'll find out in the coming months how the 8500M/8600M/8700M stack up against NVIDIA's midrange "GT" offerings, which interestingly also use 384 cores.
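For reference, GCN groups its shaders into Compute Units (CUs) of 64 stream processors each, so the published core counts map directly onto CU counts. A small sketch of that mapping follows; the "two arrays of three CUs" layout for Mars is our inference, not something AMD has confirmed.

```python
# GCN organizes stream processors (SPs) into Compute Units of 64 SPs each.
SP_PER_CU = 64

chips = {
    "Mars (8500M/8600M/8700M)": 384,   # 6 CUs; likely two arrays of three
    "Cape Verde (8800M)": 640,         # 10 CUs
    "Pitcairn": 1280,                  # 20 CUs
    "Tahiti": 2048,                    # 32 CUs
}

for name, sps in chips.items():
    print(f"{name}: {sps} SPs = {sps // SP_PER_CU} CUs")
```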

Also worth a quick note: AMD is not discussing TDPs at this point, which is common practice for both AMD and NVIDIA. We expect the new "Mars" parts to be more power efficient than the outgoing Thames/Turks cores, thanks to the shrink to a 28nm process. However, AMD and NVIDIA typically stick to common power targets for laptops dictated by their OEM partners, which often means they'll play with clock speeds in order to hit a specific TDP. That's why all of the clock speeds listed in the above table carry a qualifying "up to" prefix (which I omitted).

The final announced card is the one where we appear to have more of a rebrand/optimization of a previous generation chip. The 8800M has the same 640 core count as Cape Verde/7800M, only with modified clocks this time. The earlier 7800M chips could clock as high as 800MHz, so the maximum core clock is actually down a bit, but they only ran their memory at up to 1GHz (4GHz effective) GDDR5, where the 8800M moves to 4.5GHz. If AMD determined memory bandwidth was more important than shader performance for that particular GPU, the new 8800M would make sense; the quick calculation below puts numbers on that tradeoff. Also note that AMD isn't including boost clock speeds in the above chart; under the right circumstances, all of the new chips can run at higher clocks than the reference clock.
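To quantify the bandwidth gain: peak memory bandwidth is simply the effective data rate times the bus width. Cape Verde's desktop parts use a 128-bit memory interface; whether the 8800M keeps that width is our assumption here, but under it the move from 4.0GHz to 4.5GHz GDDR5 is worth roughly 8GB/s.

```python
# Peak memory bandwidth = effective data rate (GT/s) * bus width / 8 bits per byte.
# Cape Verde's desktop parts use a 128-bit interface; we assume 8800M matches.

def peak_bandwidth_gb_s(effective_ghz, bus_width_bits=128):
    return effective_ghz * bus_width_bits / 8

print(f"7800M @ 4.0GHz GDDR5: {peak_bandwidth_gb_s(4.0):.0f} GB/s")  # 64 GB/s
print(f"8800M @ 4.5GHz GDDR5: {peak_bandwidth_gb_s(4.5):.0f} GB/s")  # 72 GB/s
```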


[Chip shots from AMD's slides: Radeon 7800M (left), Radeon 8800M (right)]

AMD isn't calling the 8800M a rebrand, but we're looking at the same core count as Cape Verde and the same 28nm process technology, so we wouldn't expect a substantial change in performance. There's also the above chip shot as a point of reference: if the 8800M were substantially different from Cape Verde, the images provided in AMD's slides must be incorrect, as the new and old chips look identical. Minor tweaks to power use, caching, or other elements notwithstanding, we're probably dealing with a die respin at most. But there's nothing inherently wrong with rebranding; AMD and NVIDIA have both been doing it for some time now. Don't expect every "upgraded" GPU to be better, though: a 7400M isn't faster than a 6700M, and likewise we expect the 7700M and 7800M to be faster options than the 384 core 8500M/8600M/8700M and competitive with the 8800M (the quick check after the table below bears this out). Here's a recap of the same core specs as above for the current 7700M/7800M parts:

AMD Radeon HD 7700M/7800M Specifications

                    HD 7730M     HD 7750M     HD 7770M     HD 7850M     HD 7870M
Stream Processors   512          512          512          640          640
Engine Clock        575-675MHz   575MHz       675MHz       675MHz       800MHz
Memory Clock        1.8GHz       4.0GHz       4.0GHz       4.0GHz       4.0GHz
Memory Type         DDR3         GDDR5        GDDR5        GDDR5        GDDR5
FP32 GFLOPS         589-691      589          691          864          1024
FP64 GFLOPS         36.8-43.2    36.8         43.2         54           64
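Interestingly, the 7700M/7800M GFLOPS figures fall out of the listed engine clocks exactly, with no boost inflation, which makes the generational comparison straightforward. A minimal check using the same 2 FLOPs per SP per clock arithmetic as above:

```python
# FP32 GFLOPS = 2 FLOPs (one FMA) per stream processor per clock.
specs = {  # part: (stream processors, engine clock in MHz)
    "7750M": (512, 575),
    "7770M": (512, 675),
    "7850M": (640, 675),
    "7870M": (640, 800),
}

for part, (sps, mhz) in specs.items():
    print(f"{part}: {2 * sps * mhz / 1000:.0f} FP32 GFLOPS")
# Prints 589, 691, 864, 1024, matching AMD's table to the digit.
# On paper, then, a 7770M (691) matches the fastest 8700M (537-691),
# and the 7870M (1024) slightly outruns the 8800M (992).
```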

I'll refrain from speculating much more about the performance of unreleased parts, but AMD indicated their 8870M should be substantially faster than NVIDIA's current GT 650M GDDR5 (which isn't too surprising considering clocks and core counts), and the 8770M should likewise deliver a healthy 20%+ bump in performance relative to the 7670M. I'd rather see comparisons with the GTX 670MX and HD 7770M, respectively, but I suspect those wouldn't be quite as impressive. Anyway, you can see AMD's comparison charts in the complete slide deck gallery below. Availability of the new GPUs is slated for Q1 2013.

Comments

  • Stuka87 - Wednesday, December 19, 2012 - link

    AMD would have to be nuts to open source the whole driver. There is a lot of code in there that they do NOT want nVidia to see. Same reason nVidia won't open source their whole driver.

    When you open source something, you are basically giving it away. Any trade secrets you have in that code are no longer secrets once released.
  • HibyPrime1 - Wednesday, December 19, 2012 - link

    The hope is that the gains from releasing their driver source would outweigh the losses to NVIDIA, if there were to be losses.

    At this point, I'm sure the majority of the software tricks either party uses are well known by both sides. If this were 10 years ago I'd say the trade secrets should be kept secret, but I doubt there's much more than minor stuff that NVIDIA doesn't already know.

    In one of the more public bits of secret keeping (if that makes sense), AMD/ATI managed to keep Eyefinity under wraps right up until launch day. It didn't take long for NVIDIA to do the same thing in software. So even keeping the secret closed up until launch day only gave AMD a few months (IIRC) of a head start.

    That said, even if there were major trade secrets in the code, I doubt they would be of much use. The two companies use very different hardware and quite likely have very different requirements.
  • CeriseCogburn - Wednesday, December 19, 2012 - link

    It's amazing how stupid the amd fan nuts are - they whined for years without PhysX and screamed proprietary, pretended nVidia didn't offer the port through NGOHQ and support it while amd killed it, then when amd finally put some money where their mouth was on open source, they LOCKED UP WINZIP tight into their little nazi proprietary bundle, becoming the most gigantic hypocrite possible.

    Now, the little fluttering brainflies want another liars show - perhaps AMD should just tell them they will soon open source everything, then direct the idiot minions to attack nVidia because they have not done so already...

    Yes, that will work - then when amd files bankruptcy and caves in from their cheap penny pinching idiot worshipful bagbrains, they can lock up the code forever in some deep and mysterious hostile takeover agreement... and their tinpot dictated believe any lie ever fans can blame the evil "BAIN!" that did freedom and open source code in...

    There, that should be sent directly to amd HQ, they'll probably accuse me of corporate espionage for stealing their PR campaign outlines.
  • g101 - Sunday, March 17, 2013 - link

    Oh look, the lifeless dipshit is still here.

    You troll every single AMD article with meaningless BS. How does it feel when nvidia releases products after AMD and has shitty, shitty 'physx' (very little resemblance to actual physics and *nothing* that couldn't be run easily on a modern CPU) titles like borderlands 2 ? It looks like it was drawn by a kid with a smudgy marker. This wondrous display of fidelity and 'pisics gaise!' was going on while AMD was working on superior and innovative products with the larger developers, all while consistently outperforming nvidia for less.

    Then...the compute comparison comes into play...

    Sorry, little boy. The green-with-jealousy team brought nothing to the table this year except over-priced, no overhead, green-light crap.

    Get an education, N0vida will not be able to pay you for being part of their water-army-comment-ridiculously-deluded-little-child group much longer...AMD now owns the console market. PS4: AMD APU. xbox720: AMD APU. Steambox: AMD APU, possibly with a small AMD discrete card. Wii: AMD APU.

    I guess nvidia will have to make their own titles?
  • klmccaughey - Wednesday, December 19, 2012 - link

    They just can't open source them; it would be commercial suicide. I would love it, but I just can't see it happening.
  • CeriseCogburn - Tuesday, December 25, 2012 - link

    Wait a minute!
    All the amd fanboys for the past uncountable years have:

    DEMANDED that nVidia open source its PhysX!!!

    I'm sure they still demand it, even though their queen traitor amd locked up winzip tight with proprietary code, something amd and all its minions and masters preached they would never, ever do!

    I haven't seen a single amd fanboy apologize, EVER.
    Now that's as shameful as it gets.
    You can't do worse.
  • SydneyBlue120d - Tuesday, December 18, 2012 - link

    I was talking about this:
    http://www.phoronix.com/scan.php?page=news_item&am...
  • MrSpadge - Tuesday, December 18, 2012 - link

    Of the small minority running Linux, even fewer need serious 3D acceleration; otherwise they'd be running Windows by now.
  • Inteli - Tuesday, December 18, 2012 - link

    Actually, I just switched my laptop to Linux, and it's a lot less resource intensive. I imagine I would get better performance in games, so I could see more people using Linux for 3D-intensive applications.
