Performance Expectations

In their presentation and FAQ, NVIDIA provided estimates of performance relative to an Ivy Bridge Core i5 ULV chip with HD 4000. Before we get to those numbers, we want to quickly set the stage for what NVIDIA is showing. Despite the similarity in name and features, as we discovered last year, the ULV chips tend to run into TDP limits when you're trying to load both the CPU and the iGPU. The CPU cores, for instance, can use around 12-15W, and a full load on the iGPU can add another 10-15W; stuff both into a 17W TDP and you're going to get throttling, which is exactly what happens.
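
To make the power-budget arithmetic concrete, here's a minimal sketch using the rough per-component figures above; the specific wattages chosen for the CPU and iGPU loads are illustrative assumptions, not measurements.

```python
# Illustrative power-budget math for a 17W ULV part, using the rough
# figures cited above (12-15W for the CPU cores, 10-15W for the iGPU).
TDP_LIMIT_W = 17

cpu_load_w = 14    # assumed full CPU load (illustrative)
igpu_load_w = 12   # assumed full iGPU load (illustrative)

combined_w = cpu_load_w + igpu_load_w
if combined_w > TDP_LIMIT_W:
    # Demand exceeds the package budget, so the chip has to throttle
    # the CPU, the iGPU, or both to stay within its TDP.
    overshoot_w = combined_w - TDP_LIMIT_W
    print(f"{combined_w}W of demand against a {TDP_LIMIT_W}W budget: {overshoot_w}W over, so clocks drop.")
```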

Looking at HD 4000 performance with Core i5 ULV and Core i7 quad-core, you can see that the quad-core part is anywhere from 0% to around 60% faster. On average it’s 35% faster at our 2012 “Value” settings and 26% faster at our 2012 “Mainstream” settings. As for the 700M performance relative to Core i5 ULV, NVIDIA provides the following estimates based on benchmarks at moderate detail and 1366x768 in Battlefield 3, Crysis 2, Just Cause 2, DiRT 3, and F1 2011:

Besides the above slide, NVIDIA provided some performance estimates using results from the 3DMark 11 Performance benchmark, and those results are even more heavily in favor of NVIDIA. In their FAQ, NVIDIA states that even the lowly GeForce 710M is three times faster than ULV HD 4000, while the GT 720M is 3.3x faster, the GT 730M and 735M are 4.8x faster (hmmm…do we really need a GT 735M?), the GT 740M is 5.3x faster, the GT 745M is 5.8x faster, and the GT 750M is 6.3x faster. Of course, those numbers are from NVIDIA, going up against the much slower ULV variant of Ivy Bridge, and using 3DMark 11—which isn't quite as important as actual gaming performance.
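
To keep those claims straight, here's a quick sketch that simply restates NVIDIA's stated multipliers; the HD 4000 baseline score is a made-up placeholder for illustration, not a measured result.

```python
# NVIDIA's claimed 3DMark 11 Performance speedups vs. ULV HD 4000 (per their FAQ).
claimed_speedup = {
    "GeForce 710M": 3.0,
    "GT 720M": 3.3,
    "GT 730M": 4.8,
    "GT 735M": 4.8,
    "GT 740M": 5.3,
    "GT 745M": 5.8,
    "GT 750M": 6.3,
}

hd4000_ulv_score = 600  # hypothetical baseline score, purely for illustration

for gpu, multiplier in claimed_speedup.items():
    estimate = int(hd4000_ulv_score * multiplier)
    print(f"{gpu}: ~{multiplier:.1f}x HD 4000 (claimed) -> roughly {estimate}")
```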

I suspect the GT3 and GT3e configurations of Haswell will be substantially faster than IVB's HD 4000 and may come close to the lower end of NVIDIA's range…at least on the standard voltage Haswell chips. For ULV, I've heard performance estimates that GT3 Haswell will be 30%-50% faster than GT2 IVB, and GT3e could be roughly twice as fast, but that should still leave NVIDIA with a healthy lead. Anyway, we'd suggest taking all of these numbers with a grain of salt for now. The real comparison for most people is going to be Haswell against 700M, and while we have a pretty good idea where 700M and HD 4000 performance fall (since the 700M parts are Kepler and Fermi updates), Haswell's iGPU is likely to be a different beast.

Closing Thoughts

On the whole, Kepler has been amazingly successful for NVIDIA, particularly in the mobile world. The bar for midrange mobile dGPUs was raised significantly, with the GT 640M LE and above parts typically offering anywhere from 25% to 75% better performance than the previous generation while also reducing power use. It was NVIDIA's version of Intel's Core 2 launch, and the vast majority of notebooks with dGPUs seem to be using NVIDIA hardware these days. Much of that can also be attributed to NVIDIA's driver team, where Optimus support and usability still trump AMD's Enduro alternative. AMD is still working to improve their drivers, but they're not yet at the same level as NVIDIA's mobile drivers.

Not surprisingly, it looks like every laptop with an NVIDIA dGPU these days also comes with Optimus support, and NVIDIA says they'll be in three times as many Ultrabooks and ultraportables in 2013 compared to 2012—which isn't too hard, since off the top of my head the only two Ultrabooks with NVIDIA dGPUs I can name are the Acer M5 and the ASUS UX32VD. NVIDIA also says they have over 30 design wins for touchscreen laptops, but again, considering Windows 8 almost requires a touchscreen to really be useful, that's expected. We will likely see a limited number of laptops launching with Ivy Bridge CPUs and 700M dGPUs over the coming weeks, with ASUS specifically listed in NVIDIA's 700M FAQ with their X450 (GT 740M) and N46 (also GT 740M); Lenovo is also a launch day partner with several options: the Y400 with GT 750M, and the Z400/Z500 with GT 740M.

The real launch is likely to coincide with Intel’s Haswell update later in Q2 2013. When that comes along, we're likely to see some additional 700M updates from NVIDIA on the high end (again, echoing what happened with the 600M and 680M launches). Just don't count on seeing a mobile variant of Titan/GK110 for a while yet; I'd peg that level of performance as something we won't see in laptops until we have two more process shrinks under our belts (i.e. when TSMC is at 16nm).

Comments

  • StevoLincolnite - Monday, April 1, 2013 - link

    Not really. Overclocking is fine if you know what you're doing.
    Years ago I had a Pentium M 1.6GHz notebook with a Mobility Radeon 9700 Pro.
    Overclocked that processor to 2.0GHz+, and the graphics card's core clock was almost doubled.
    Ran fine for years; eventually the screen died due to sheer age, but I'm still using it as a file server hooked up to an old monitor to this day, with about a half dozen external drives hanging off it.
  • JarredWalton - Monday, April 1, 2013 - link

    Hence the "with very few exceptions". You had a top-end configuration and overclocked it, but that was years ago. Today with Turbo Boost the CPUs are already pushing the limits most of the time in laptops (and even in desktops unless you have extreme cooling). GPUs are doing the same now with GPU Boost 2.0 (and AMD has something similar, more or less). But if you have a high-end Clevo, you can probably squeeze an extra 10-20% from overclocking (YMMV).

    But if we look at midrange offerings with GT 640M LE...well, does anyone really think an Acer M5 Ultrabook is going to handle the thermal load or power load of a GPU that's running twice as fast as spec over the long haul? Or what about a Sony VAIO S 13.3" and 15.5" -- we're talking about Sony, who is usually so worried about form that they underclock GPUs to keep their laptops from overheating. Hint: any laptop that's really thin isn't going to do well with GPU or CPU overclocking! I know there was a Win7 variant of the Sony VAIO S that people overclocked (typically 950MHz was the maximum anyone got stable), but that was also with the fans set to "Performance".

    Considering the number of laptops I've seen where dust buildup creates serious issues after six months, you're taking a real risk. The guys who are pushing 950MHz overclocks on 640M LE are also the same people that go and buy ultra-high-end desktops and do extreme overclocking, and when they kill a chip it's just business as usual. Again, I reiterate that I have seen enough issues with consumer laptops running hot, especially when they're over a year old, that I suggest restraint with laptop overclocking. You can do it, but don't cry to NVIDIA or the laptop makers when your laptop dies!
  • transphasic - Monday, April 1, 2013 - link

    Totally agreed. I had a Clevo/Sager laptop with the 9800M GTX in it, and after only two years it died due to the NVIDIA GPU getting fried to a crisp. The heat buildup from internal dust accumulation was what destroyed my $2700 laptop after only two years of use.
    Ironically, I was thinking about overclocking it prior to it dying on me. Looking back, it's a good thing I didn't. Overclocking is risky, and the payoff just isn't worth it unless you're ready to take on the financial risk involved.
  • Drasca - Tuesday, April 2, 2013 - link

    I've got a Clevo X7200, and I just cleaned out a wall of dust after discovering it was thermal throttling hardcore. I've got to hand it to the internals and cooling of this thing, though; it was still running like a champ.

    This thing's massive cooling is really nice.

    I can stably overclock the 485M GPU from 575MHz to 700MHz without playing with voltages. No significant difference in temps, especially compared to when it was throttling. It runs at 61C.

    I love the cooling solution on this thing.
  • whyso - Monday, April 1, 2013 - link

    It depends, really. As long as you don't touch the voltage, the temperature does not rise much. I have a 660M and it reaches 1085/2500 without any problems (ASIC rating of 69%). Overclocked vs. non-overclocked is basically a 2 degree difference (72 vs. 74 degrees). Better than a stock desktop 650.

    Also, considering virtually every 660M I have seen boosts up to 950/2500 from 835/2000, I don't think the 750M is going to be much of an upgrade. Many 650M parts have a boost clock of 835MHz on the core, so there really is no upgrade there either (maybe 5-10%). GK107 is fine with 64GB/sec of bandwidth.
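
For context on that last figure, the 64GB/sec number falls out of GK107's 128-bit memory interface and the effective GDDR5 data rate. Here's a quick sketch of the math, assuming the 2000/2500MHz memory clocks quoted in the comment above correspond to 4000/5000MT/s effective:

```python
# Memory bandwidth = (bus width in bytes) x (effective transfer rate).
# GK107 uses a 128-bit bus; 4000 MT/s effective is assumed to correspond to the
# "2000" memory clock quoted above, and 5000 MT/s to the "2500" overclock.
def gddr5_bandwidth_gb_s(bus_width_bits: int, effective_mt_s: int) -> float:
    bytes_per_transfer = bus_width_bits / 8
    return bytes_per_transfer * effective_mt_s / 1000  # MB/s -> GB/s

print(gddr5_bandwidth_gb_s(128, 4000))  # 64.0 GB/s at the stock setting
print(gddr5_bandwidth_gb_s(128, 5000))  # 80.0 GB/s at the quoted 2500MHz overclock
```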
  • whyso - Monday, April 1, 2013 - link

    Whoops, sorry, I didn't see the 987 clocks; nice jump there.
  • JarredWalton - Monday, April 1, 2013 - link

    Funny thing is that in reading comments on some of the modded VBIOS stuff for the Sony VAIO S, the modder says, "The Boost clock doesn't appear to be working properly so I just set it to the same value..." Um, think, please, Mr. Modder. The Boost clock is what the GPU is able to hit when certain temperature and power thresholds are not exceeded; if you overclock, you've likely inherently gone beyond what Boost is designed to do.

    Anyway, a 2C difference for a 660M isn't a big deal, but you're also looking at a card with a default 900MHz clock, so you went up in clocks by 20% and had a 3% temperature increase (and no word on fan speed). Going from 500MHz to 950MHz is likely going to be more strenuous on the system and components.
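
Here's a minimal sketch of the Boost behavior described in the comment above; the clocks reuse the 835/950MHz figures quoted earlier in the thread, and the temperature and power limits are placeholders, not NVIDIA's actual thresholds.

```python
# Simplified model of GPU Boost: the GPU only holds its boost clock while
# temperature and board power stay under their limits. The limit values here
# are placeholders for illustration, not NVIDIA's real parameters.
BASE_CLOCK_MHZ = 835
BOOST_CLOCK_MHZ = 950
TEMP_LIMIT_C = 70
POWER_LIMIT_W = 45

def current_clock(temp_c: float, power_w: float) -> int:
    if temp_c < TEMP_LIMIT_C and power_w < POWER_LIMIT_W:
        return BOOST_CLOCK_MHZ  # headroom available: run at the boost clock
    return BASE_CLOCK_MHZ       # a limit was hit: fall back toward the base clock

print(current_clock(62, 38))  # 950 -> boosting
print(current_clock(74, 38))  # 835 -> pulled back to the base clock
```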
  • damianrobertjones - Monday, April 1, 2013 - link

    "and their primary competition in the iGPU market is going to be HD 4000 running on a ULV chip!"

    Wouldn't that be the HD 4600? Also, it's a shame that no one really tests the HD 4000 paired with something like Vengeance RAM, which improves performance.
  • HisDivineOrder - Monday, April 1, 2013 - link

    So if the "core hardware" is the same from Boost 1 and 2, then nVidia should go on and make Boost 2.0 be something we all can enable in the driver.

    Or... are they trying to get me to upgrade to new hardware to activate a feature my card is already fully capable of supporting? Haha, nVidia, you so crazy.
  • JarredWalton - Monday, April 1, 2013 - link

    There may be some minor difference in the core hardware (some extra temperature or power sensors?), but I'd be shocked if NVIDIA offered an upgrade to Boost 1.0 users via drivers -- after all, it looks like half of the performance increase from 700M is going to come from Boost 2.0!
