Performance Expectations

In their presentation and FAQ, NVIDIA provided performance estimates relative to an Ivy Bridge Core i5 ULV part with HD 4000. Before we get to those numbers, we want to quickly set the stage for what NVIDIA is showing. Despite the similarity in name and features, as we discovered last year, the ULV chips tend to run into TDP limits when you try to hit both the CPU and the iGPU at the same time. The CPU cores, for instance, can draw around 12-15W, and a full load on the iGPU can add another 10-15W; stuff both into a 17W TDP and you're going to get throttling, which is exactly what happens.
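To put some rough numbers on that, here's a minimal sketch of the power-budget math, assuming the package scales both blocks back proportionally to stay inside the TDP (real power management is more nuanced, and the wattages are just the rough figures quoted above):

```python
# Rough power-budget sketch for an Ivy Bridge ULV package (illustrative only).
CPU_FULL_LOAD_W = 13.5    # assumed: middle of the ~12-15W range for the CPU cores
IGPU_FULL_LOAD_W = 12.5   # assumed: middle of the ~10-15W range for a loaded iGPU
TDP_W = 17.0              # 17W package limit for ULV Ivy Bridge

def sustained_fraction(cpu_w: float, igpu_w: float, tdp_w: float) -> float:
    """Fraction of full speed each block can sustain if both are scaled
    back proportionally to keep the package inside its TDP."""
    demand = cpu_w + igpu_w
    return 1.0 if demand <= tdp_w else tdp_w / demand

scale = sustained_fraction(CPU_FULL_LOAD_W, IGPU_FULL_LOAD_W, TDP_W)
print(f"Demand: {CPU_FULL_LOAD_W + IGPU_FULL_LOAD_W:.1f}W vs. a {TDP_W:.0f}W TDP")
print(f"Both CPU and iGPU throttle to roughly {scale:.0%} of full speed")
# -> ~26W of demand against a 17W budget, so everything runs at ~65%.
```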

Looking at HD 4000 performance with Core i5 ULV and quad-core Core i7 parts, the quad-core chip is anywhere from 0% to around 60% faster. On average, it's 35% faster at our 2012 “Value” settings and 26% faster at our 2012 “Mainstream” settings. As for 700M performance relative to Core i5 ULV, NVIDIA provides the following estimates based on benchmarks at moderate detail and 1366x768 in Battlefield 3, Crysis 2, Just Cause 2, DiRT 3, and F1 2011:

Besides the above slide, NVIDIA provided some performance estimates using results from the 3DMark 11 Performance benchmark, and those results are even more heavily in favor of NVIDIA. In their FAQ, NVIDIA states that even the lowly GeForce 710M is three times faster than ULV HD 4000, while the GT 720M is 3.3x faster, the GT 730M and 735M are 4.8x faster (hmmm…do we really need the GT 735M?), the GT 740M is 5.3x faster, the GT 745M is 5.8x faster, and the GT 750M is 6.3x faster. Of course, those numbers come from NVIDIA, are measured against the much slower ULV variant of Ivy Bridge, and use 3DMark 11—which isn’t quite as important as actual gaming performance.

I suspect the GT3 and GT3e configurations of Haswell will be substantially faster than IVB’s HD 4000 and may come close to the lower end of NVIDIA’s range…at least with the standard voltage Haswell chips. For ULV, I’ve heard performance estimates that GT3 Haswell will be 30-50% faster than GT2 IVB, and GT3e could be roughly twice as fast, but that should still leave NVIDIA with a healthy lead. Anyway, we’d suggest taking all of these numbers with a grain of salt for now. The real comparison for most people is going to be Haswell versus 700M, and while we have a pretty good idea where 700M and HD 4000 performance fall (since the 700M parts are Kepler and Fermi updates), Haswell’s iGPU is likely to be a different beast.
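To line up the various claims in one place, here's a small sketch comparing NVIDIA's stated 3DMark 11 multipliers with the rumored Haswell estimates above. Everything is expressed relative to ULV HD 4000; the NVIDIA figures are vendor claims and the Haswell figures are hearsay rather than measurements:

```python
# All figures relative to ULV HD 4000 = 1.0x (3DMark 11 Performance).
nvidia_700m_claims = {
    "GeForce 710M": 3.0,
    "GT 720M": 3.3,
    "GT 730M/735M": 4.8,
    "GT 740M": 5.3,
    "GT 745M": 5.8,
    "GT 750M": 6.3,
}

# Rumored Haswell iGPU estimates quoted above (speculation, not benchmarks).
haswell_rumors = {
    "Haswell GT3 (ULV)": (1.3, 1.5),  # "30-50% faster than GT2 IVB"
    "Haswell GT3e": (2.0, 2.0),       # "roughly twice as fast"
}

print("Relative to ULV HD 4000 (1.0x):")
for name, mult in nvidia_700m_claims.items():
    print(f"  {name:<18} {mult:.1f}x  (NVIDIA 3DMark 11 claim)")
for name, (low, high) in haswell_rumors.items():
    spread = f"~{low:.1f}x" if low == high else f"{low:.1f}-{high:.1f}x"
    print(f"  {name:<18} {spread}  (rumor)")

# Even the top rumored Haswell figure (~2x) trails the slowest 700M claim (3x),
# though real games will almost certainly show smaller gaps than 3DMark 11 does.
```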

Closing Thoughts

On the whole, Kepler has been amazingly successful for NVIDIA, particularly in the mobile world. The bar for midrange mobile dGPUs was raised significantly, with the GT 640M LE and above typically offering anywhere from 25% to 75% better performance than the previous generation while also reducing power use. It was NVIDIA’s version of Intel’s Core 2 launch, and the vast majority of notebooks with dGPUs seem to be using NVIDIA hardware these days. Much of that can also be attributed to NVIDIA’s driver team, where Optimus support and usability still trump AMD’s Enduro alternative. AMD continues to work on their drivers, but they're still not at the same level as NVIDIA's mobile offerings.

Not surprisingly, it looks like every laptop with an NVIDIA dGPU these days also comes with Optimus support, and NVIDIA says they’ll be in three times as many Ultrabooks and ultraportables in 2013 compared to 2012—which isn’t too hard, since off the top of my head the only two Ultrabooks with NVIDIA dGPUs I can name are the Acer M5 and the ASUS UX32VD. NVIDIA also says they have over 30 design wins for touchscreen laptops, but again, considering Windows 8 almost requires a touchscreen to really be useful, that’s expected. We will likely see a limited number of laptops launching with Ivy Bridge CPUs and 700M dGPUs over the coming weeks; ASUS is specifically listed in NVIDIA’s 700M FAQ with the X450 and N46 (both using GT 740M), and Lenovo is also a launch day partner with several options: the Y400 with GT 750M, and the Z400/Z500 with GT 740M.

The real launch is likely to coincide with Intel’s Haswell update later in Q2 2013. When that comes along, we're likely to see some additional 700M updates from NVIDIA on the high end (again, echoing what happened with the 600M and 680M launches). Just don't count on seeing a mobile variant of Titan/GK110 for a while yet; I'd peg that level of performance as something we won't see in laptops until we have two more process shrinks under our belts (i.e. when TSMC is at 16nm).

Comments

  • crypticsaga - Monday, April 1, 2013 - link

    What possible reason would Intel have for pushing a product like that? In fact, if some sources are correct, they're trying to do the exact opposite by bottlenecking even internal dGPUs through limiting the available PCIe lanes in Broadwell.
  • shompa - Monday, April 1, 2013 - link

    That problem has been solved for over a year with Thunderbolt. Use a Thunderbolt PCIe enclosure with a graphics card.
  • fokka - Monday, April 1, 2013 - link

    afaik this is not entirely correct, since even thunderbolt is too slow to properly utilize a modern graphics card.
    this is not surprising, since thunderbolt is based on 4x pci-e 2.0 (2GB/s) and current desktop-class graphics cards use 16x pci-e 3.0 (~16GB/s), which is about eight times as fast (see the bandwidth sketch after the comments).

    so i wouldn't say the problem is completely solved throughput-wise, but thunderbolt sure was an important step in the right direction.
  • MojaMonkey - Monday, April 1, 2013 - link

    No, shompa is correct, it has been solved with Thunderbolt and I'm personally using a GTX 680 connected externally. Works great.
  • Wolfpup - Monday, April 1, 2013 - link

    Ugh. You're either ignorant or reaaaaally generous with the hyperbole. "20+ lbs notebooks"? Really?

    In real life, mid-range notebooks/GPUs do fine for gaming, and high end notebooks/GPUs do...REALLY fine. When you can max out today's games at 1080p, that isn't "performing poorly", and is orders of magnitude better than Intel's video.

    If YOU guys don't want high end notebooks, fine, but I don't see how they're hurting you.
  • lmcd - Tuesday, April 2, 2013 - link

    My cheap A8m (Trinity) can play Rage at high texture resolution at the panel's native res (1366x768), just for starters. And that's at the $400 level right now, I think.
  • superjim - Wednesday, April 10, 2013 - link

    I can confirm this. An A8-4500M does really well for $400 or below at 1366x768. Now if the A10 would come down to $400, we'd really be in good shape.
  • xenol - Monday, April 1, 2013 - link

    I had a laptop with discrete graphics that lasted for over 9 hours on battery, while surfing the web. It was a laptop with an early form of Optimus (you had to manually switch), but still, you can have graphical performance xor battery life if you don't need the performance. But asking for both? Now that's silly.

    As for your issue with marketing the 680M as it is when it can't outperform a midrange desktop card... You do realize that this is a different market segment? Also, you should tell "shame on you" to all the display companies who mislead customers into thinking they're buying a panel that can do 16 million colors (last I checked, 18 bits is not 16 million) or one with a 1,000,000:1 contrast ratio (which you'd need to be in a pitch black room looking at a black/white checkerboard pattern to see).
  • Wolfpup - Monday, April 1, 2013 - link

    "Modest performance increase"? I wouldn't call my GTX 680m a "modest performance increase" over Intel video lol

    Are you KIDDING?!? Notebook hardware is ALWAYS worse than desktop. This applies obviously to CPUs too, which you're inexplicably not complaining about. You always pay more to get the same performance. That doesn't mean it's "dishonest" or the like.

    And quite obviously integrated video can never catch up with a discrete part so long as they make high end discrete parts, so the time is "never", not "near".

    ****
    Regarding the article...Optimus...eh, Nvidia's driver team is impressive as always, but literally the first program I ran that I wanted to run on the GPU wouldn't run on the GPU...thankfully my notebook lets you turn off Optimus.
  • JarredWalton - Monday, April 1, 2013 - link

    Which program did you run that you wanted on the GPU? Please be very specific.
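For reference, the Thunderbolt bandwidth comparison raised in the comments above works out as follows. This is a quick sketch using the standard per-lane PCIe rates; the premise that first-generation Thunderbolt is fed by a PCIe 2.0 x4 link comes from the comment itself:

```python
# Effective PCIe bandwidth per lane (one direction), after encoding overhead.
# PCIe 2.0: 5 GT/s with 8b/10b encoding    -> 0.5 GB/s per lane
# PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 8 * (128 / 130) / 8}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Total one-direction bandwidth of a PCIe link, in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

thunderbolt = link_bandwidth("2.0", 4)    # first-gen Thunderbolt: PCIe 2.0 x4
desktop_slot = link_bandwidth("3.0", 16)  # desktop GPU slot: PCIe 3.0 x16

print(f"Thunderbolt (PCIe 2.0 x4):   {thunderbolt:.1f} GB/s")
print(f"Desktop slot (PCIe 3.0 x16): {desktop_slot:.1f} GB/s")
print(f"The desktop slot is ~{desktop_slot / thunderbolt:.1f}x faster")
# -> 2.0 GB/s vs. ~15.8 GB/s, i.e. roughly the 8x gap noted in the comment.
```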
