Performance Expectations

In their presentation and FAQ, NVIDIA provided estimates of performance relative to an Ivy Bridge Core i5 ULV with HD 4000. Before we get to those numbers, we want to quickly set the stage for what NVIDIA is showing. Despite the similarity in name and features, as we discovered last year, the ULV chips tend to run into TDP limits when you try to load both the CPU and the iGPU. The CPU cores, for instance, can use around 12-15W, and a full load on the iGPU can add another 10-15W; stuff both into a 17W TDP and you’re going to get throttling, which is exactly what happens.
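As a back-of-the-envelope illustration of why the ULV parts throttle, here is a quick sketch using the wattage ranges quoted above (the specific midpoint values are our own illustrative picks, not measured figures):

```python
# Rough power-budget check for a 17W ULV package, using midpoints
# of the ranges in the text (illustrative numbers only).
cpu_load_w = 14    # CPU cores under load: roughly 12-15W
igpu_load_w = 12   # iGPU under full load: roughly 10-15W
tdp_w = 17         # ULV package TDP

combined = cpu_load_w + igpu_load_w
over_budget = combined - tdp_w
print(f"Combined draw ~{combined}W vs {tdp_w}W TDP: ~{over_budget}W over budget")
# With both fully loaded, the package has to throttle the CPU, the iGPU,
# or both to stay within its TDP.
```

Even at the low end of both ranges (12W + 10W = 22W), the combined draw exceeds the 17W limit, so throttling under simultaneous CPU+iGPU load is unavoidable.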

Looking at HD 4000 performance with Core i5 ULV and Core i7 quad-core, you can see that the quad-core part is anywhere from 0% to around 60% faster. On average it’s 35% faster at our 2012 “Value” settings and 26% faster at our 2012 “Mainstream” settings. As for the 700M performance relative to Core i5 ULV, NVIDIA provides the following estimates based on benchmarks at moderate detail and 1366x768 in Battlefield 3, Crysis 2, Just Cause 2, DiRT 3, and F1 2011:

Besides the above slide, NVIDIA provided some performance estimates using results from the 3DMark 11 Performance benchmark, and the results are even more heavily in favor of NVIDIA. In their FAQ, NVIDIA states that even the lowly GeForce 710M is three times faster than ULV HD 4000, while the GT 720M is 3.3x faster, the GT 730M and 735M are 4.8x faster (hmmm…do we really need a GT 735M?), the GT 740M is 5.3x faster, the GT 745M is 5.8x faster, and the GT 750M is 6.3x faster. Of course, those numbers are from NVIDIA, going up against the much slower ULV variant of Ivy Bridge, and using 3DMark 11—which isn’t quite as important as actual gaming performance.
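To put NVIDIA’s claimed multipliers in concrete terms, here is a short sketch that projects 3DMark 11 Performance scores from a baseline; the HD 4000 baseline score below is a placeholder, not a measured result, so substitute your own number:

```python
# NVIDIA's claimed 3DMark 11 Performance multipliers vs. ULV HD 4000 (from the FAQ).
multipliers = {
    "GeForce 710M": 3.0,
    "GT 720M": 3.3,
    "GT 730M/735M": 4.8,
    "GT 740M": 5.3,
    "GT 745M": 5.8,
    "GT 750M": 6.3,
}

hd4000_score = 600  # hypothetical ULV HD 4000 baseline; plug in a real measured score
for part, mult in multipliers.items():
    print(f"{part}: ~{round(hd4000_score * mult)} ({mult}x)")
```

The projections scale linearly with whatever baseline you choose, which is exactly why synthetic multipliers like these should be sanity-checked against actual gaming benchmarks.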

I suspect the GT3 and GT3e configurations of Haswell will be substantially faster than IVB’s HD 4000 and may come close to the lower end of NVIDIA’s range…at least on the standard voltage Haswell chips. For ULV, I’ve heard performance estimates that GT3 Haswell will be 30-50% faster than GT2 IVB, and GT3e could be roughly twice as fast, but that should still leave NVIDIA with a healthy lead. Anyway, we’d suggest taking all of these numbers with a grain of salt for now. The real comparison for most is going to be Haswell and 700M, and while we have a pretty good idea where 700M and HD 4000 performance fall (since the 700M parts are Kepler and Fermi updates), Haswell’s iGPU is likely to be a different beast.

Closing Thoughts

On the whole, Kepler has been amazingly successful for NVIDIA, particularly in the mobile world. The bar for midrange mobile dGPUs was raised significantly, with the GT 640M LE and above parts typically offering anywhere from 25% to 75% better performance than the previous generation, all while reducing power use. It was NVIDIA’s version of Intel’s Core 2 launch, and the vast majority of notebooks with dGPUs seem to be using NVIDIA hardware these days. Much of that can also be attributed to NVIDIA’s driver team, where Optimus support and usability still trump AMD’s Enduro alternative. AMD is still working to improve their drivers, but they're still not at the same level as NVIDIA's mobile drivers.

Not surprisingly, it looks like every laptop with an NVIDIA dGPU these days also comes with Optimus support, and NVIDIA says they’ll be in three times as many Ultrabooks and ultraportables in 2013 compared to 2012—which isn’t too hard, since off the top of my head the only two Ultrabooks with NVIDIA dGPUs I can name are the Acer M5 and the ASUS UX32VD. NVIDIA also says they have over 30 design wins for touchscreen laptops, but again, considering Windows 8 almost requires a touchscreen to really be useful, that’s expected. We will likely see a limited number of laptops launching with Ivy Bridge CPUs and 700M dGPUs over the coming weeks, with ASUS specifically listed in NVIDIA’s 700M FAQ with their X450 (GT 740M) and N46 (also GT 740M); Lenovo is also a launch day partner with several options: the Y400 with GT 750M, and the Z400/Z500 with GT 740M.

The real launch is likely to coincide with Intel’s Haswell update later in Q2 2013. When that comes along, we're likely to see some additional 700M updates from NVIDIA on the high end (again, echoing what happened with the 600M and 680M launches). Just don't count on seeing a mobile variant of Titan/GK110 for a while yet; I'd peg that level of performance as something we won't see in laptops until we have two more process shrinks under our belts (i.e. when TSMC is at 16nm).

Comments

  • Kevin G - Monday, April 1, 2013 - link

    What I'd like to see is an ExpressCard version of the low end parts. I've been working with numerous business class laptops with this expansion slot and I've run into the situation where I could use an additional display. I've used USB adapters but they've been less than ideal. I fathom a low clock GK208 chip and a 64 bit wide memory bus could be squeezed into an ExpressCard form factor. I'd expect it to perform around the level of Intel HD4000 but that'd still be far superior to USB solutions.
  • arthur449 - Monday, April 1, 2013 - link

    While some ExpressCard slots give access to the PCI-E bus, the problem is that the laptop's BIOS/UEFI has to support the device in its whitelist. In almost every situation where people have modded their laptops and attached them to external GPUs, they had to flash a custom ROM to remove compatibility restrictions put in place to limit the amount of compatibility testing the vendor had to conduct.
  • rhx123 - Monday, April 1, 2013 - link

    A surprisingly small number of laptops needed modification to remove the whitelist on the ExpressCard slot, and where whitelisting does exist it is possible to work around it with software pre-Windows.
    I did not have to remove a whitelist on my Lenovo X220T.
  • JarredWalton - Monday, April 1, 2013 - link

    Cooling would require the majority of the GPU to exist outside of the slot if you go this route. I don't think you could properly route heat-pipes through the relatively thin slot opening with a radiator/fan on the outside. Once you go external, the number of people really interested in the product drops quite a bit, and you'd still need to power the device so on most laptops without a dGPU I expect the external ExpressCard option would also require external power. At that point, the only real value is that you could have an external GPU hooked up to a display and connect your laptop to it for a semi-portable workstation.
  • Kevin G - Monday, April 1, 2013 - link

    It would be crazy to put any of these chips into an ExpressCard form factor without reducing power consumption. I was thinking of dropping the clock down to 400MHz and cutting power consumption further with a corresponding drop in voltage. It wouldn't have to break any performance records, just provide full acceleration and drive an external display.

    In hindsight, the GK208 may be too power hungry. The 28 nm Fermi parts (GF117?) should be able to hit the power and thermal allocations for ExpressCard without resorting to an external chassis.
  • Wolfpup - Tuesday, April 2, 2013 - link

    I like the IDEA of a connection to an external dock that allows ANY video card to be used (heck, why not go for SLI?), but notebooks would have to be able to support it; sounds like lots don't, plus tons of notebooks don't have ExpressCard slots anymore (plus I'm not sure if the bandwidth would start being a bottleneck or not). (Or obviously Thunderbolt could theoretically pull this off too...IF you could just boot with any GPU installed and have the external GPU active by the time Windows boots at least.)
  • rhx123 - Monday, April 1, 2013 - link

    You can make an external graphics card if you want, I have a 650Ti desktop card attached through ExpressCard.

    It's powered by an XBox PSU.

    http://imgur.com/239skMP
  • rhx123 - Monday, April 1, 2013 - link

    It can drive the internal laptop display through Optimus.
  • Flunk - Monday, April 1, 2013 - link

    Disappointing, this is a really small bump. Mostly a re-labelling of existing parts. Although I suppose it is to be expected seeing as almost all Geforce GT 640m LE-650ms can be clocked up to 1100Ghz with a little bit of bios hacking.
  • JarredWalton - Monday, April 1, 2013 - link

    Besides the fact that nothing runs at 1100GHz (or Ghz, whatever those are), I dare say you've exaggerated quite a bit. Many laptops with even moderate dGPUs run quite warm, and that's with the dGPUs hitting a max clock of around 900MHz (GT 650M with DDR3 and a higher clocked core as opposed to GDDR5 with a lower clocked core). If you manage to hack the VBIOS for a laptop to run what is supposed to be a 500MHz part at 1GHz or more, you're going to overload the cooling system on virtually every laptop I've encountered.

    In fact, I'll go a step further and say that with very few exceptions, overclocking of laptops in general is just asking for trouble, even when the CPU supports it. I tested old Dell XPS laptops with Core 2 Extreme CPUs that could be overclocked, and the fans would almost always be at 100% under any sort of load as soon as you started overclocking. Long-term, that sort of thing is going to cause component failures far more quickly, and on laptops that cost well over $2000 I think most would be quite angry if it failed after a couple years.

    If you understand the risks and don't really care about ruining a laptop, by all means have at it. But the number of laptops I've seen running stock that have heat dissipation issues urges extreme caution.
