Introduction

If desktop graphics hardware can be more than a little confusing, deciphering the performance of mobile graphics parts can be (and has historically been) an absolute nightmare. Way back in the day it was at least fairly easy to figure out which desktop chip was hiding in which mobile kit, but both AMD and NVIDIA have largely severed ties between mobile and desktop branding. They may not want to readily admit that, and for certain models they still rely heavily on the cachet associated with their desktop hardware, but it's by and large true. So to help you make sense of mobile graphics, we present the first in what will hopefully be a regular series of guides.

I started putting guides like this one together back at my alma mater, NotebookReview, and they've always been well-received. It's really not hard to understand why: while NVIDIA and AMD are usually fairly forthcoming with the specs of their desktop kit, they've historically been cagey about their notebook graphics hardware. As a result, sites like this one have had to sift through information about different laptops, compare notes with other sites and readers, and eventually compile the data, while forums light up with questions like "can this laptop play xyz?"

Thankfully, the advent of DirectX 11 has drastically simplified my job. When shader models or even entire DirectX versions were split across product lines, confusion followed suit, but with DirectX 11 pretty much everybody is on board with the same fundamental feature set, and AMD and NVIDIA both support their respective technologies across the board. Intel remains the odd man out, as you'll see.

We'll break things down into three categories. The first is integrated graphics, which interestingly has gone entirely on-package and even on-die over the past year; it's surprising how quickly that change occurred. Coupled with NVIDIA's exit from the chipset business, we're strictly looking at Intel and AMD here. The second and third are dedicated to AMD's and NVIDIA's mobile lines. Wherever possible we'll also link you to a review that demonstrates the performance of the graphics hardware in question. Finally, note that when we talk about the number of shaders, CUDA cores, or EUs on a given part, these numbers are ONLY comparable to other parts from the same vendor; 96 of NVIDIA's CUDA cores are not comparable to, say, 160 shaders from an AMD Radeon, as the quick sketch below illustrates.
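To see why raw unit counts don't translate across vendors, here's a back-of-the-envelope throughput comparison in Python. This is a minimal sketch under stated assumptions: NVIDIA's Fermi-era mobile parts run their CUDA cores at twice the core clock and retire one fused multiply-add (two FLOPs) per core per shader cycle, while AMD's VLIW5 parts retire two FLOPs per shader per core clock; the example clocks are representative picks, not official specs for any particular model.

    # Illustrative only: theoretical single-precision throughput, not
    # real-world performance. Clocks are representative examples.

    def nvidia_gflops(cuda_cores, core_clock_mhz):
        # Fermi-era mobile GPUs run shaders at 2x the core clock ("hot
        # clock"); each CUDA core retires one FMA (2 FLOPs) per cycle.
        shader_clock_mhz = core_clock_mhz * 2
        return cuda_cores * 2 * shader_clock_mhz / 1000.0

    def amd_gflops(shaders, core_clock_mhz):
        # AMD's VLIW5 stream processors retire 2 FLOPs per core clock.
        return shaders * 2 * core_clock_mhz / 1000.0

    print(nvidia_gflops(96, 672))   # ~258 GFLOPS from 96 "cores"
    print(amd_gflops(160, 650))     # ~208 GFLOPS from 160 "shaders"

Despite having far fewer "cores," the NVIDIA part lands in the same theoretical ballpark, and real-world results depend just as much on memory bandwidth, drivers, and architectural efficiency.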

Integrated Graphics

"Too Slow to Play" Class: Intel HD Graphics (Arrandale), Intel Atom IGP, AMD Radeon HD 4250
Specs aren't provided because in this case they aren't really needed: none of these integrated graphics parts are going to be good for much more than the odd game of Unreal Tournament 2004. Intel had a devil of a time getting its IGP act together prior to the advent of Sandy Bridge, while AMD's Radeon HD 3000/3100/3200/4200/4225/4250 core (yes, it's all basically the same core) is really showing its age. Thankfully, outside of Atom's IGP, all of these are on their way out. As for gaming on Atom, there's always the original StarCraft.

Intel HD 3000 (Sandy Bridge)
12 EUs, Core Clock: Varies
With Sandy Bridge, Intel was finally able to produce an integrated graphics part able to rival AMD's and NVIDIA's budget entries. In fact, in our own testing we found the HD 3000 largely able to keep up with AMD's dedicated Radeon HD 6450 and, to a lesser extent, the 6470, while NVIDIA's current mobile lineup generally doesn't extend that low (the GT 520M and GT 520MX being the likely exceptions). That said, there are still some caveats to the HD 3000: while Intel's questionable driver quality is largely behind it, you may still experience the odd compatibility issue from time to time (Fallout 3 had an issue when Sandy Bridge launched), and more punishing games like Mafia II and Metro 2033 will be largely out of its reach. Clocks also vary greatly across the line, as the rough comparison below illustrates: the base clock is 650MHz for mainstream parts, 500MHz for low voltage parts, and just 350MHz for ultra low voltage parts, while turbo clocks get even weirder, ranging anywhere from 900MHz to 1.3GHz depending on the processor model. Still, it's nice to not have to roll your eyes anymore at the suggestion of doing some casual gaming on Intel's integrated hardware. (Sandy Bridge Review)
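Since every HD 3000 tier ships the same 12 EUs, base-clock ratio is a crude first-order proxy for relative throughput. A minimal sketch, assuming roughly linear scaling with clock (which ignores turbo residency and memory bandwidth):

    # Rough relative HD 3000 throughput, normalized to the 650MHz
    # mainstream base clock. All tiers have 12 EUs, so clock ratio is a
    # crude first-order proxy; actual turbo behavior varies by CPU model.
    tiers = {
        "mainstream": 650,
        "low voltage": 500,
        "ultra low voltage": 350,
    }
    for name, mhz in tiers.items():
        print(f"{name}: {mhz / 650:.0%} of mainstream base throughput")
    # Ultra low voltage comes in around 54% of mainstream, which is why
    # ULV ultraportables are noticeably slower for gaming on paper.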

AMD Radeon HD 6250/6310 (Brazos)
80 Shaders, 8 TMUs, 4 ROPs, Core Clock: 280MHz (6250), 500MHz (6310)
In Brazos, AMD produced a workable netbook-level processor core and grafted last generation's Radeon HD 5450/5470 core onto it. The result is an integrated graphics processor with a decent amount of horsepower for low-end casual gaming, but in some cases it's going to be hamstrung by the comparatively slow Bobcat processor cores. That's perfectly fine, though, as Brazos is generally a more desirable alternative to Atom + NG-ION netbooks, offering more processor performance and vastly superior battery life. Just don't expect to do any but the most casual gaming on a Brazos-powered netbook. (HP dm1z Review)

AMD Radeon HD 6380G/6480G/6520G/6620G (Llano)
160/240/320/400 Shaders (6380G/6480G/6520G/6620G), 12/16/20 TMUs (6480G/6520G/6620G), 4 ROPs (6380G/6480G) or 8 ROPs (6520G/6620G), Core Clock: 400-444MHz
Llano isn't available in force yet, but we have a good idea of how the 6620G performs and expect IGP performance to scale down roughly in line with the model numbers (the back-of-the-envelope math below shows why). The long and short of Llano is that the processor half pales in comparison to Sandy Bridge, but the graphics hardware is monstrous. Gamers on an extreme budget are likely to be well-served by picking up a notebook with one of AMD's A6 or A8 processors in it, with Llano promising near-midrange mobile graphics performance. (Llano Mobile Review)
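As a rough illustration of that scaling, here's the same VLIW5 throughput arithmetic applied to Llano's IGPs. The per-SKU clocks are an assumption on our part: AMD quotes a 400-444MHz range, so we simply bracket each part with both endpoints.

    # Theoretical VLIW5 throughput (2 FLOPs per shader per clock) for
    # Llano IGPs. Exact per-SKU clocks are assumptions; AMD quotes a
    # 400-444MHz range, so both endpoints are shown.
    llano_shaders = {"6380G": 160, "6480G": 240, "6520G": 320, "6620G": 400}
    for model, shaders in llano_shaders.items():
        low = shaders * 2 * 400 / 1000.0
        high = shaders * 2 * 444 / 1000.0
        print(f"{model}: {low:.0f}-{high:.0f} GFLOPS")
    # 6380G: 128-142, 6480G: 192-213, 6520G: 256-284, 6620G: 320-355.
    # Shader count dominates, so the model numbers track theoretical
    # performance fairly well.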
