New for GTX 800M: Battery Boost

While the core hardware features have not changed in most respects – Maxwell and Kepler are both DX11 parts that implement some but not all of the DX 11.1 features – there is one exception. NVIDIA has apparently modified the hardware in the new GTX 800M chips to support a feature they’re calling Battery Boost. The short summary is that with this new combination of software and hardware features, laptops should be able to provide decent (>30 FPS) gaming performance while delivering 50-100% more battery life during gaming.

This could be really important for laptop gaming, as many people have moved to tablets and smartphones simply because a laptop doesn’t last long enough off AC power to warrant consideration. Battery Boost isn’t going to suddenly solve the problem of a high-end GPU and CPU using a significant amount of power, but instead of one hour (or less) of gaming we could actually be looking at a reasonable 2+ hours. Regardless, NVIDIA is quite excited to see where things go with Battery Boost, and we’ll certainly be testing the first GTX 800M laptops to provide some of our own measurements. Let’s get into some of the details of the implementation.

First, Battery Boost will require the use of NVIDIA’s GeForce Experience (GFE) software. You can see the various settings in the above gallery, though the screenshots are provided by NVIDIA so we have not yet been able to test this ourselves. Battery Boost builds on existing GFE features like game profiles and optimizations, but it adds some additional tweaks. Each GFE game profile on a laptop with Battery Boost will now have separate plugged-in and on-battery settings, along with the ability to set a target frame rate (with 30 FPS commonly recommended as a nice balance between smoothness and reduced power use).
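To make the per-game, per-power-source idea concrete, here’s a minimal sketch of what such a profile might look like. GFE doesn’t expose a public configuration format, so the structure and field names below are purely hypothetical illustrations of the concept, not NVIDIA’s actual implementation.

```python
# Hypothetical illustration only: GFE does not expose a public config format,
# so these field names are invented to show the concept described above.
battery_boost_profile = {
    "game": "Borderlands 2",
    "plugged_in": {
        "quality_preset": "high",   # full performance settings on AC power
        "frame_rate_target": None,  # uncapped
    },
    "on_battery": {
        "quality_preset": "medium", # dialed-back settings on battery
        "frame_rate_target": 30,    # the commonly recommended 30 FPS target
    },
}

def active_settings(profile, on_battery):
    """Return the settings block that matches the current power source."""
    return profile["on_battery"] if on_battery else profile["plugged_in"]
```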

NVIDIA went into quite a bit of detail explaining how Battery Boost is more than simply targeting a lower average frame rate. That’s certainly a large part of the power savings, but it’s more than just capping the frame rate at 30 FPS. NVIDIA provided some figures from a test laptop running Borderlands 2: the baseline measurement was 86 minutes of battery life; turning on frame rate targeting at 30 FPS improved battery life by around 25% to 107 minutes, while Battery Boost further improves on that result by another 22% and delivers 131 minutes of gameplay.
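As a quick sanity check on those numbers, the percentages work out as follows (this is just arithmetic on the figures NVIDIA quoted, nothing more):

```python
baseline = 86          # minutes: Borderlands 2 with no power-saving features
frame_cap_only = 107   # minutes: 30 FPS frame rate target alone
battery_boost = 131    # minutes: full Battery Boost

print(f"Frame cap alone:     +{frame_cap_only / baseline - 1:.0%}")      # ~+24%
print(f"Battery Boost extra: +{battery_boost / frame_cap_only - 1:.0%}") # ~+22%
print(f"Total improvement:   +{battery_boost / baseline - 1:.0%}")       # ~+52%
```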

NVIDIA didn’t reveal all the details of what they’re doing, but they did state that Battery Boost absolutely requires a new 800M GPU – i.e. it’s not a purely software-driven solution. It’s an “intelligent” solution that has the drivers monitoring all aspects of the system – CPU, GPU, RAM, etc. – to reduce power draw and reach maximum efficiency. I suspect some of the “secret sauce” comes by way of capping CPU clocks, since most games generally don’t need the CPU running at maximum Turbo Boost to deliver decent frame rates, but what else might be going on is difficult to say. It also sounds as though Battery Boost requires certain features in the laptop firmware to work, which again would limit the feature to new notebooks.
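Since NVIDIA hasn’t disclosed the actual algorithm, the following is pure speculation on my part – a toy sketch of what a frame-rate-targeted power governor could look like in principle: when frames finish well under the ~33ms budget there’s headroom to drop clocks (and power), and when the target is being missed the clocks come back up.

```python
TARGET_FPS = 30
TARGET_FRAME_TIME = 1.0 / TARGET_FPS  # ~33 ms budget per frame

def govern(frame_time, clock_level, min_level=0, max_level=10):
    """Toy governor: trade clock speed (and thus power) against frame time.

    This is NOT NVIDIA's algorithm (which is undisclosed); it only illustrates
    the general idea of steering CPU/GPU clocks toward the lowest level that
    still hits the frame rate target.
    """
    if frame_time < 0.9 * TARGET_FRAME_TIME:
        return max(min_level, clock_level - 1)  # plenty of headroom: slow down
    if frame_time > TARGET_FRAME_TIME:
        return min(max_level, clock_level + 1)  # missing the target: speed up
    return clock_level                          # on target: hold steady
```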

Besides being system wide and intelligent, NVIDIA has two other “talking points” for Battery Boost. It will be automatic – unplug and the Battery Boost settings are enabled; plug in and you switch back to the AC performance mode. That’s easy enough to understand, but there’s a catch: you can’t have a game running and have it switch settings on-the-fly. That’s not really surprising, considering many games require you to exit and restart if you want to change certain settings. Basically, if you’re going to be playing a game while unplugged and you want the benefits of Battery Boost to be active, you’ll need to unplug before starting the game.
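Detecting the AC-to-battery transition itself is straightforward from software; the sketch below uses the third-party psutil library purely as a stand-in (how GFE actually does this isn’t documented), and mirrors the “pick the profile at launch, not mid-game” behavior described above.

```python
import psutil  # third-party library; used here only as an illustration

def on_battery():
    """Return True when the system is running on battery power."""
    status = psutil.sensors_battery()
    return status is not None and not status.power_plugged  # None => no battery

# Battery Boost-style behavior: the profile is chosen when the game launches,
# since settings don't switch on the fly while a game is already running.
profile = "on_battery" if on_battery() else "plugged_in"
print(f"Launching with the {profile} settings")
```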

As for being user tunable, the above gallery already more or less covers that point – you can customize the settings for each game within GFE. I did comment to NVIDIA that it would be good to add target frame rate to the list of customization options on a per-game basis, as there are some games where you might want a slightly higher frame rate and others where lower frame rates would be perfectly adequate. NVIDIA indicated this would be something they can add at a later date, but for now the target frame rate is a global setting, so you’ll need to manually change it if you want a higher or lower frame rate for a specific game – and understand of course that higher frame rates will generally increase the load on the GPU and thus reduce battery life.

There’s one other aspect to mobile gaming that’s worth a quick note. Most high-end gaming laptops prior to now have throttled the GPU clocks when unplugged. This wasn’t absolutely necessary but was a conscious design decision. In order to maintain higher clocks, the battery and power circuitry would need to be designed to deliver sufficient power, and this often wasn’t considered practical or important. Even with a 90Wh battery, the combination of a GTX-class GPU and a fast CPU could easily drain the battery in under 30 minutes if allowed to run at full performance. So the electrical design power (EDP) of most gaming notebooks until now has capped GPU performance while unplugged, and even then battery life while gaming has typically been less than an hour. Now with Battery Boost, NVIDIA has been working with the laptop OEMs to ensure that the EDP of the battery subsystem will be capable of meeting the needs of the GPU.
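The “under 30 minutes” figure is easy to sanity check: runtime is just battery capacity divided by system draw. The wattages below are my own illustrative assumptions, not measured numbers.

```python
def runtime_minutes(battery_wh, system_draw_w):
    """Rough runtime estimate, ignoring conversion losses and discharge limits."""
    return battery_wh / system_draw_w * 60

print(runtime_minutes(90, 180))  # 30.0 min at ~180 W for an unthrottled GPU + CPU (assumed)
print(runtime_minutes(90, 45))   # 120.0 min at ~45 W for a capped, efficient load (assumed)
```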

Your personal opinion of Battery Boost and whether or not it’s useful will largely depend on what you do with your laptop. Presumably the main reason for getting a laptop instead of a desktop is the ability to be mobile and move around the house or take your PC with you, and Battery Boost should help improve the mobility aspect for gaming. If you rarely or never game while unplugged, it won’t necessarily help in any way, but then it won’t hurt either. I suspect many of us simply don’t bother trying to game while unplugged because it drains the battery so quickly, and potentially doubling your mobile gaming time will certainly help in that respect. It’s a “chicken and egg” scenario: people don’t game on battery because it’s not viable, and there’s not much focus on improving mobile gaming because people don’t play while unplugged. NVIDIA is hoping that by taking the first step toward improving mobile battery life, they can change what people expect from gaming laptops going forward.

Comments

  • ThreeDee912 - Wednesday, March 12, 2014 - link

    Very minor typo on the first page.
    "The second GTX 860M will be a completely new Maxell part"

    I'm assuming "Maxell" should be "Maxwell".

    /nitpick
  • gw74 - Wednesday, March 12, 2014 - link

    why do I live in a world where Thunderbolt eGPUs for laptops are still not a thing, yet astronomically expensive, mediocre-performing gaming laptops are still a thing?
  • willis936 - Wednesday, March 12, 2014 - link

    Because outward-facing bandwidth is scarce and expensive. Even with Thunderbolt 2 (which has seen very underwhelming adoption from OEMs) the GPU will spend a great deal of time waiting around to be fed.
  • lordmocha - Sunday, March 16, 2014 - link

    the few videos on YouTube show that it is possible to run graphics cards over TB1 (x4 PCIe) and achieve 90%+ of the card’s performance when connected to an external monitor, and 80%+ when feeding back through the Thunderbolt link to the internal monitor

    so basically eGPUs could really be a thing right now, but no one is making a gaming-targeted PCIe Thunderbolt enclosure (Sonnet’s needs an external PSU if you are trying to put a GPU in it)
  • rhx123 - Wednesday, March 12, 2014 - link

    Because GPU Makers can sell mobile chips for a huge increase over desktop chips.
    A 780M, which is roughly comparable to a desktop 660 nets Nvidia a hell of a lot more cash.
    A 660 can be picked up for arround £120 these days, whereas on Clevo reseller site in my country a 770-780M upgrade costs £158.

    Nvidia knows that a large percentage of very high-end gaming laptops (780M) just sit on desks and are carried around very infrequently, which could easily be undercut piecewise with an eGPU.

    My 13-inch laptop, 750 Ti ExpressCard eGPU, and the PSU for the eGPU (an Xbox 360 one) can still easily fit into a backpack for taking round to a friend’s, and it all cost much less than any similar-performing laptop available at the time – and when I don’t need that GPU power I have an ultraportable 13-inch at my disposal.
  • CosmosAtlas - Wednesday, March 26, 2014 - link

    Because Intel does not give permission for making Thunderbolt eGPUs. I was waiting for a Thunderbolt-based Vidock; however, it will never happen because of this.
  • willis936 - Wednesday, March 12, 2014 - link

    So I take it NVIDIA hasn’t hinted at the possibility of G-Sync chips being included in laptop panels? I think they’d make the biggest impact in laptops, where sub-60 FPS is practically a given in newer titles.
  • JarredWalton - Thursday, March 13, 2014 - link

    I asked about this at CES. It’s something NVIDIA is working on, but there’s a problem in that the display is being driven by the Intel iGPU, with Optimus working in the background and rendering the frames. So NVIDIA would have to figure out how to make the Intel iGPU drive the LCD properly – and not give away their tech, I suppose. I think we’ll see a solution some time in the next year or two, but G-Sync is still in its early stages on desktops, so it will take time.
  • Hrel - Wednesday, March 12, 2014 - link

    I'm surprised you guys didn't say anything about the 850M not supporting SLI. I was expecting a paragraph deriding that decision by Nvidia. I'm really upset. I would have loved to see how energy efficient SLI could get. Lenovo has had that laptop with 750M in SLI for about a year now and I've thought that was kinda stupid.

    But considering how power efficient Maxwell is maybe that could actually be a good idea now.

    Maybe they'll still do it with underclocked GTX 860M's.

    Hm, I bet that’s what Nvidia wanted: to discourage OEMs from buying two cheaper GPUs per laptop instead of ONE hugely expensive one. Prevent them from SLI’ing the best GPU in their lineup.

    Yep, pissed. I'm pissed.
  • Hrel - Wednesday, March 12, 2014 - link

    best GPU for SLI* in their lineup.
