Other Features: GameStream, ShadowPlay, Optimus, etc.

Along with Battery Boost, the GTX class of 800M GPUs will also support NVIDIA’s GameStream and ShadowPlay technologies, again through NVIDIA’s GeForce Experience software. Unlike Battery Boost, these are almost purely software-driven solutions, so they are not strictly limited to 800M hardware. However, the performance requirements are high enough that NVIDIA is limiting their use to GTX GPUs: all GTX 700M and 800M parts will support these features, along with the GTX 680M, 675MX, and 670MX. In short, all GTX Kepler and Maxwell parts will support GameStream and ShadowPlay; the Kepler/Maxwell requirement exists because both technologies make use of NVIDIA’s hardware video encoder (NVENC).

If you haven’t been following NVIDIA’s software updates, the quick summary is that GameStream allows the streaming of games from your laptop/desktop to an NVIDIA SHIELD device. Not all games are fully supported/optimized, but there are over 50 officially supported games, and most Steam games should work via Steam’s Big Picture mode. I haven’t really played with GameStream yet, so I’m not in a position to say much more on the subject right now, but if you don’t mind playing with a gamepad it’s another option for going mobile – within the confines of your home – and it can give you much longer unplugged time. GameStream does require a good WiFi connection (at least 300Mbps on the 5GHz band, though I believe you can try it with slower connections), and the list of GameStream-Ready routers can be found online.
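To put that WiFi requirement in perspective, here’s a quick back-of-the-envelope budget for streaming a 1080p60 game. All of the figures below (the ~15 Mbps encoded bitrate in particular) are my own illustrative assumptions, not NVIDIA’s published numbers:

```python
# Rough budget for streaming a 1080p60 game over WiFi.
# The encoded bitrate and link rate are illustrative assumptions.

width, height, fps = 1920, 1080, 60
raw_bps = width * height * fps * 24          # uncompressed 24-bit RGB video
encoded_bps = 15_000_000                     # ~15 Mbps H.264, a plausible target
link_bps = 300_000_000                       # 300 Mbps 802.11n link rate

compression_ratio = raw_bps / encoded_bps
frame_budget_ms = 1000 / fps                 # time per frame at 60 fps

print(f"raw video:        {raw_bps / 1e9:.2f} Gbps")
print(f"compression:      ~{compression_ratio:.0f}x via hardware H.264 encode")
print(f"link headroom:    {link_bps // encoded_bps}x the encoded bitrate")
print(f"per-frame budget: {frame_budget_ms:.1f} ms for encode + network + decode")
```

The takeaway is that a 300 Mbps link has plenty of headroom for the encoded stream itself; the requirement is really about keeping latency and jitter low enough that encode, transmission, and decode all fit inside the ~16.7 ms per-frame budget.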

On a related note, something I'd really like to see is support for GameStream extended to more than just SHIELD devices. NVIDIA is already able to stream 1080p content in this fashion, and while it might not match the experience of a GTX 880M notebook running natively, it would certainly be a big step up from lower-end GPUs and iGPUs. Considering the majority of the work is done on the source side (rendering and encoding the game) and the target device only has to decode a video stream and provide user I/O, it shouldn't be all that difficult. Take it a step further and we could have something akin to the GRID Gaming Beta coupled with a gaming service (Steam, anyone?), and you could potentially get five or six hours of "real" gaming on any supported laptop! Naturally, NVIDIA is in the business of selling GPUs, and I don't see them releasing GameStream for non-NVIDIA GPUs (e.g. Intel iGPUs) any time soon, if ever. Still, it's a cool thought, and perhaps someone else can accomplish this. (And yes, I know there are already services trying to do cloud gaming, but they have various drawbacks; being able to do my own "local cloud gaming" would definitely be cool.)

ShadowPlay targets a slightly different task, namely capturing your best gaming moments. When enabled in GFE, at any point you can press Alt+F10 to save up to the last 20 minutes (user configurable within GFE) of gameplay. Manual recording is also supported, with Alt+F9 used to start/stop recording and a duration limited only by the amount of disk space you have available. (Both hotkeys are customizable as well.) The performance impact of ShadowPlay is typically around 5%, and at most around 10%, with a maximum resolution of 1080p (higher resolutions are automatically scaled down to 1080p).
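The “save the last 20 minutes” behavior is a classic rolling replay buffer: encoded video is continuously appended while anything older than the window is discarded, so hitting the hotkey just flushes whatever is currently buffered. A minimal sketch of the idea (the class and names are my own, purely for illustration):

```python
import collections

class ReplayBuffer:
    """Rolling buffer that keeps only the most recent window_s seconds of
    captured frames, similar in spirit to ShadowPlay's shadow mode."""

    def __init__(self, window_s):
        self.window_s = window_s
        self.frames = collections.deque()  # (timestamp, frame_data) pairs

    def push(self, timestamp, frame_data):
        self.frames.append((timestamp, frame_data))
        # Discard frames that have aged out of the window.
        while self.frames and timestamp - self.frames[0][0] > self.window_s:
            self.frames.popleft()

    def save(self):
        # On the hotkey press, the buffered frames are flushed to disk;
        # here we just return them.
        return [frame for _, frame in self.frames]

# Simulate 30 seconds of capture at 1 frame/second with a 20-second window.
buf = ReplayBuffer(window_s=20)
for t in range(30):
    buf.push(t, f"frame-{t}")
saved = buf.save()  # only the last ~20 seconds survive
```

In the real feature the buffered data is hardware-encoded H.264 rather than raw frames, which is why the steady-state overhead stays in the 5–10% range.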

We’ve mentioned GeForce Experience quite a few times now, and NVIDIA is particularly proud of all the useful features they’ve managed to add to GFE since it first entered open beta at the start of 2013. Initially GFE’s main draw was the ability to apply “optimal” settings to all supported/detected games, but obviously that’s no longer the only reason to use the software. I’m not usually much of a fan of “automagic” game settings, but GFE does tend to provide appropriate defaults, and you can always adjust any settings you don’t agree with. AMD is trying to provide a similar feature via their Raptr gaming service, but by using a GPU farm to automatically test and generate settings for all of their GPUs, NVIDIA is definitely ahead for the time being.

NVIDIA being ahead of AMD applies to other areas as well, to varying degrees. Optimus has enjoyed broad support on nearly every laptop equipped with an NVIDIA GPU for a couple of years now, and the number of edge cases where Optimus doesn’t work as expected is quite small – I can’t remember the last time I had any problems with the feature. Enduro tends to work okay on the latest platforms as well, but honestly I haven’t received a new Enduro-enabled laptop in about a year, and there have been plenty of times where Enduro – and AMD’s drivers – have been more than a little frustrating. PhysX and 3D Vision also tend to get used and supported more than the competing solutions, but I’d rate those as less important in general.

91 Comments


  • ThreeDee912 - Wednesday, March 12, 2014 - link

    Very minor typo on the first page.
    "The second GTX 860M will be a completely new Maxell part"

    I'm assuming "Maxell" should be "Maxwell".

    /nitpick
  • gw74 - Wednesday, March 12, 2014 - link

Why do I live in a world where Thunderbolt eGPUs for laptops are still not a thing, while astronomically expensive, mediocre-performing gaming laptops are still a thing?
  • willis936 - Wednesday, March 12, 2014 - link

Because outward-facing bandwidth is scarce and expensive. Even with Thunderbolt 2 (which has seen very underwhelming adoption from OEMs), the GPU will spend a great deal of time waiting around to be fed.
  • lordmocha - Sunday, March 16, 2014 - link

The few videos on YouTube show that it is possible to run graphics cards over Thunderbolt 1 (PCIe x4) and achieve 90%+ of the card's performance when connected to an external monitor, and 80%+ when feeding back through Thunderbolt to the internal monitor.

So basically eGPUs could really be a thing right now, but no one is making a gaming-targeted PCIe-to-Thunderbolt enclosure. (Sonnet's needs an external PSU if you are trying to put a GPU in it.)
  • rhx123 - Wednesday, March 12, 2014 - link

Because GPU makers can sell mobile chips at a huge markup over desktop chips.
A 780M, which is roughly comparable to a desktop 660, nets Nvidia a hell of a lot more cash.
A 660 can be picked up for around £120 these days, whereas on a Clevo reseller site in my country a 770M-to-780M upgrade costs £158.

Nvidia knows that a large percentage of very high-end gaming laptops (780M) just sit on desks and are carried around very infrequently, a market that could easily be undercut price-wise by an eGPU.

My 13-inch laptop, 750 Ti ExpressCard eGPU, and PSU for the eGPU (an Xbox 360 power brick) can still easily fit into a backpack for taking round to a friend's, and it cost much less than any similar-performing laptop available at the time. When I don't need that GPU power, I have an ultraportable 13-inch at my disposal.
  • CosmosAtlas - Wednesday, March 26, 2014 - link

Because Intel does not grant approval for Thunderbolt eGPU products. I was waiting for a Thunderbolt-based ViDock, but it will never happen because of this.
  • willis936 - Wednesday, March 12, 2014 - link

So I take it NVIDIA hasn't hinted at the possibility of G-Sync modules being included in laptop panels? I think they'd make the biggest impact in laptops, where sub-60 fps is practically a given on newer titles.
  • JarredWalton - Thursday, March 13, 2014 - link

I asked about this at CES. It's something NVIDIA is working on, but there's a problem in that the display is driven by the Intel iGPU, with Optimus working in the background and rendering the frames. So NVIDIA would have to figure out how to make the Intel iGPU drive the LCD properly, without giving away their tech, I suppose. I think we'll see a solution some time in the next year or two, but G-Sync is still in its early stages on desktops, so it will take time.
  • Hrel - Wednesday, March 12, 2014 - link

I'm surprised you guys didn't say anything about the 850M not supporting SLI. I was expecting a paragraph deriding that decision by Nvidia. I'm really upset; I would have loved to see how energy efficient SLI could get. Lenovo has had a laptop with 750M SLI for about a year now, and I've thought that was kinda stupid.

    But considering how power efficient Maxwell is maybe that could actually be a good idea now.

    Maybe they'll still do it with underclocked GTX 860M's.

Hm, I bet that's what Nvidia wanted: to discourage OEMs from buying two cheaper GPUs per laptop instead of ONE hugely expensive one. Prevent them from SLI'ing the best GPU in their lineup.

    Yep, pissed. I'm pissed.
  • Hrel - Wednesday, March 12, 2014 - link

    best GPU for SLI* in their lineup.
