Medium Detail Gaming Comparison

We’re going to focus on the main gaming target for this level of hardware: our Medium presets. The Low preset tends to look pretty lousy in several of the games (StarCraft II being the worst), while our High settings are too much for the GT 540M/HD 6630M in about half of the titles, even at 1366x768. Medium detail on the other hand should give us 30+ FPS in nearly all of the games we’re testing while still providing decent visuals. Here are the results.

[Benchmark charts: Battlefield: Bad Company 2, Civilization V, DiRT 2, Left 4 Dead 2, Mafia II, Mass Effect 2, Metro 2033, STALKER: Call of Pripyat, StarCraft II: Wings of Liberty, and Total War: Shogun 2]

In terms of performance, the 6630M matches up nicely against the GT 540M. It wins a few games by a small margin, loses a few by a similarly small margin, and ties in the remainder. If we were just looking at gaming performance, there’s not much to discuss here. NVIDIA still has CUDA and PhysX, and AMD has their Accelerated Parallel Processing SDK, but given the level of performance we’re talking about, you won’t need or use these extras most of the time. Unfortunately for AMD, it’s not just about gaming performance.

We tested ten games (though we don’t have results from the Alienware M11x R3 in Civilization V and Total War: Shogun 2), and the newest titles are Civ5 (13 months old) and TWS2 (6 months old). You’d hope that with relatively known games, compatibility wouldn’t be an issue, but we ran into two problems during testing.

The first wasn’t a consistent problem, but we did have a couple of crash-to-desktop issues when benchmarking DiRT 2 using Dynamic Switchable Graphics mode. The other issue was a lot worse, and it showed up in StarCraft II when running at anything more than the Low default settings. As noted earlier, SC2 looks pretty lousy at its minimum settings, and Medium detail (and possibly even High) should be more than playable on the HD 6630M. In dynamic mode, however, not only was performance quite a bit lower than expected at our Low settings, but at Medium and above there were tons of missing textures. The workaround is to switch to manually controlled switching, which fixed both the performance and the rendering errors, but then why even have dynamic switching in the first place?

While we’re on the subject, let’s also take a look at a few other comparisons. Running “stock” on the CPU, the Acer 3830TG is noticeably slower overall than when using ThrottleStop to lock the CPU at 2.1GHz. With ThrottleStop, the average score across ten games improves by 20%, so the CPU throttling has a major impact in games. Civ5, ME2, and TWS2 show the least difference, with Mass Effect 2 being the only game that ran slightly faster at the stock CPU settings; Mafia 2 runs 10% faster, Metro 2033 and STALKER are 13% faster, L4D2 jumps up 28%, DiRT 2 is 49% faster, and the ever-CPU-limited StarCraft II runs 58% faster. It’s worth noting that 2.1GHz is 75% higher than 1.2GHz, so on average it looks like the 3830TG runs at around 1.3GHz in StarCraft II.
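The clock-speed estimate above is simple back-of-the-envelope arithmetic, and a quick sketch makes it explicit. The assumption (ours, not a measurement) is that StarCraft II is almost entirely CPU-bound at these settings, so frame rate scales roughly linearly with CPU clock:

```python
# Infer the effective (throttled) CPU clock from the speedup observed
# when the clock is locked with ThrottleStop. Assumes frame rate scales
# linearly with clock speed, which only holds for a CPU-bound game.

def effective_clock(locked_clock_ghz: float, speedup: float) -> float:
    """If locking the clock at locked_clock_ghz made the game run
    (1 + speedup) times faster, the stock run averaged this clock."""
    return locked_clock_ghz / (1.0 + speedup)

# ThrottleStop locked at 2.1GHz ran SC2 58% faster than stock.
estimate = effective_clock(2.1, 0.58)
print(f"Implied average stock clock in SC2: ~{estimate:.2f}GHz")  # ~1.33GHz
```

That lands right around the 1.3GHz figure quoted above, i.e. much closer to the 1.2GHz throttled floor than to the 2.1GHz the chip should sustain.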

Flipping over to the other GT 540M-equipped laptops and using the best result from the Acer 3830TG, the Alienware M11x R3 averages out to 3.5% faster, but it only has a clear lead in two games (DiRT 2 and SC2) while it’s slightly slower in L4D2 and Mafia 2. The Dell XPS 15 comes in 9% faster than the 3830TG on average, with the biggest lead again coming from SC2; Mafia 2 once again gives the 3830TG a lead, so the latest NVIDIA drivers appear to be playing a role.

Finally, since we now have results from the HD 6630M with a Llano A8-3500M as well as the i5-2410M, we thought it would be interesting to see just how much performance you give up by gaming with the slower Llano CPU. If you’ll recall, in our first look at a Llano notebook, we expressed concern that the CPU would hold back GPU performance. Out of our ten tested titles, only Mass Effect 2 ran faster on Llano with a dGPU. Most games (at Medium settings, where the GPU becomes more of a bottleneck) are close: BFBC2 and Civ5 deliver exactly the same performance, Mafia 2 is 2% faster with the Intel CPU, and DiRT 2, L4D2, Metro, and STALKER are all within 5-10%. Not surprisingly, StarCraft II is the big outlier, running over twice as fast on the Intel i5-2410M. We’re still not looking at similar driver builds, as the Llano laptop is running a Catalyst 11.6 derivative (8.862.110607a-120249E) compared to 11.1 on the Sony, but even so the average performance increase with an Intel CPU comes in at 12% (mostly thanks to SC2).

Since there were two games where we didn’t have a perfect experience, we decided to look at some more recent titles.

91 Comments

  • JarredWalton - Tuesday, September 20, 2011 - link

    Thanks for the input -- I've corrected the 6700M mistakes now. Somehow I got it stuck in my brain that 6700M was rebadged 5700M, but that's only the 6300M and 6800M. Thanks also for the updates on Linux--good to know how it does/doesn't work.
  • rflynn88 - Tuesday, September 20, 2011 - link

    Correction required:

    The 6700M series chips do support dynamic switching. I'm not sure if the 6300M series does. The HP dv6t can be optioned out with a Core i5/i7 Sandy Bridge chip and the 6770M with dynamic switching. There was actually a big issue with the dynamic switching not working on the dv6t for OpenGL applications, which was only recently remedied with a BIOS update.
  • JarredWalton - Tuesday, September 20, 2011 - link

    Corrected -- see above note. I mistakenly confused the 6700M situation with the 6300M/6800M, which are rebadged 5000M parts and would not have the decoupled PCI-E interface to allow for dynamic switching.
  • BernardP - Tuesday, September 20, 2011 - link

    What I would like to do with Optimus is disable it completely and have my notebook always use the Nvidia graphics, even on the desktop. I don't care about battery life, as my notebook is almost never running on battery.

    After searching on the web, I have found no way to disable Optimus. Anybody here have a solution?
  • bhima - Tuesday, September 20, 2011 - link

    Yes. Open up your NVIDIA settings. Change your Global Settings preferred graphics processor to the NVIDIA card. Voila!
  • BernardP - Tuesday, September 20, 2011 - link

    As simple as that? I'm hopeful...

    Please allow me a bit of residual doubt, considering, for example, the following discussion where there is a mention of this suggested setting.

    http://superuser.com/questions/282734/how-to-disab...

    However, I'll try your suggestion on a new Alienware laptop with Optimus one of my friends just bought.
  • JarredWalton - Tuesday, September 20, 2011 - link

    The above change won't disable Optimus so much as always try to run programs on the dGPU. To my knowledge, there is no way to disable it completely, since with Optimus there is no direct link between the dGPU and the display outputs. All data has to get copied over the PCI-E bus to the IGP framebuffer.
  • BernardP - Wednesday, September 21, 2011 - link

    That's also my understanding, hence my doubts...
  • seapeople - Tuesday, September 20, 2011 - link

    I can see not caring about battery life, but you don't care about heat and/or noise either? More power use = more heat = constant fan turning on and off vs silent operation using only the integrated GPU.

    Less heat also equates to longer hardware lifetime.

    I just don't understand why you would want a fat GPU idling while you browse AnandTech instead of the low power Intel GPU built into your computer?
  • BernardP - Wednesday, September 21, 2011 - link

    Because I want a not-so-fat NVidia GPU, such as GT520M or GT525M, to pair with a high quality 1920x1080 IPS screen.... and then, use NVidia scaling to set up a custom resolution that will allow my old presbyopic eyes to see what's on the screen. For a 15.6 in. screen, 1440x900 is about right, and with the NVidia custom resolution tools there is no significant loss of quality.

    AMD graphics drivers don't allow setting up custom resolutions. Can Intel graphics drivers do it?

    And I know that one can make things bigger onscreen with Windows settings, but it doesn't work all the time for everything. There is no substitute for setting up custom resolutions.
