GeForce 700M Models and Specifications

With that brief introduction out of the way, here are the specs of the newly announced 700M family. If I had to guess, I expect we’ll see revised high-end 700M parts sometime later this year based on tweaked GK106 and GK104 chips (perhaps a GTX 780M with the performance of the GTX 680MX in the power envelope of the GTX 680M), but we’ll have to wait and see what happens.

| | GeForce GT 750M | GeForce GT 745M | GeForce GT 740M |
|---|---|---|---|
| GPU and Process | 28nm GK107 or GK106 | 28nm GK107 | 28nm GK107 |
| CUDA Cores | 384 | 384 | 384 |
| GPU Clock | Up to 967MHz plus Boost | Up to 837MHz plus Boost | Up to 980MHz plus Boost |
| Memory Eff. Clock | Up to 5.0GHz | Up to 5.0GHz | Up to 5.0GHz |
| Memory Bus | Up to 128-bit | Up to 128-bit | Up to 128-bit |
| Memory Bandwidth | Up to 80GB/s | Up to 80GB/s | Up to 80GB/s |
| Memory | Up to 2GB GDDR5 or DDR3 | Up to 2GB GDDR5 or DDR3 | Up to 2GB GDDR5 or DDR3 |

Compared to the previous generation GTX 660M, GT 650M, GT 645M, and GT 640M (not to mention the GT 640M LE), the new chips all have the same core feature set, but now with GPU Boost 2.0 and higher memory clocks. I wish NVIDIA would just drop support for DDR3 on their higher-end chips, and likewise the “up to” clauses aren’t particularly helpful, but both are necessary evils of working with OEMs that sometimes have slightly different requirements. Overall, performance of these new 700M parts should be up 15-25% relative to the previous models, thanks to higher GPU and memory clock speeds.

You’ll note that the core clocks appear a little crazy, but this is based largely on how OEMs choose to configure a specific laptop. With both GDDR5 and DDR3 variants available, NVIDIA wants to keep the performance of chips with the same name within 10% of each other. Thus, we could see a GT 740M with 2.5GHz GDDR5 and a moderate core clock, another GT 740M with 2.0GHz GDDR5 and a slightly higher core clock, and a third variant with 1800MHz DDR3 matched to a 980MHz core clock. Presumably, most (all?) currently planned GT 750M and GT 745M laptops use GDDR5 memory, which is why we don’t see the higher core clocks there. As for the Boost clocks, in practice Boost can raise the GPU core speed 15% or more over the stock value, with most games realizing a 10-15% performance increase as a result.
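The memory side of that balancing act is simple arithmetic. Here’s a minimal sketch (my own numbers, not NVIDIA’s; the GDDR5 and DDR3 rate conversions and the helper function are my assumptions for illustration):

```python
# Back-of-the-envelope check, not NVIDIA data. Assumptions: GDDR5 moves
# four bits per pin per clock, so "2.5GHz GDDR5" means a 5.0GT/s effective
# data rate, while DDR3 speed grades are already quoted at their effective
# rate ("1800MHz DDR3" = 1.8GT/s).

def bandwidth_gbps(bus_bits: int, effective_gtps: float) -> float:
    """Bandwidth in GB/s = (bus width / 8 bits per byte) * transfers per second."""
    return bus_bits / 8 * effective_gtps

# The three hypothetical GT 740M configurations from the paragraph above,
# all on the full 128-bit bus from the spec table:
for label, rate in [("2.5GHz GDDR5", 5.0),
                    ("2.0GHz GDDR5", 4.0),
                    ("1800MHz DDR3", 1.8)]:
    print(f"GT 740M with {label}: {bandwidth_gbps(128, rate):.1f}GB/s")
# Prints 80.0, 64.0, and 28.8GB/s; note the first matches the table's
# "Up to 80GB/s," and the DDR3 variant gets the highest 980MHz core clock
# to claw back part of its bandwidth deficit.
```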

One final item of interest: while the GT 750M appears to have the same configuration as the other GPUs (384 cores, 128-bit memory interface), at least in the chip shots provided, the GT 750M uses a different GPU core. Based on the above images, the GT 750M uses GK106, only as what would be called a “floor sweeper” model: any GK106 chip with too many defective cores to be used elsewhere can end up configured essentially the same as GK107. Presumably there will also be variants that use GK107 (or potentially GK208, just like the other parts), but NVIDIA wouldn’t confirm or deny this.

| | GeForce GT 735M | GeForce GT 730M | GeForce GT 720M | GeForce 710M |
|---|---|---|---|---|
| GPU and Process | 28nm GK208 | 28nm GK208 | 28nm Fermi | 28nm Fermi |
| CUDA Cores | 384 | 384 | 96 | 96 |
| GPU Clock | Up to 889MHz plus Boost | Up to 719MHz plus Boost | Up to 938MHz with Boost | Up to 800MHz with Boost |
| Memory Eff. Clock | Up to 2.0GHz | Up to 2.0GHz | Up to 2.0GHz | Up to 1.8GHz |
| Memory Bus | Up to 64-bit | Up to 64-bit | Up to 64-bit | Up to 64-bit |
| Memory Bandwidth | Up to 16GB/s | Up to 16GB/s | Up to 16GB/s | Up to 14.4GB/s |
| Memory | Up to 2GB DDR3 | Up to 2GB DDR3 | Up to 2GB DDR3 | Up to 2GB DDR3 |

Moving on to the lower end of the 700M range, we have the GT 730M and 710M that have already shown up in a few laptops. Joining them are the GT 735M and GT 720M, which are similar chips with higher clocks. All of these chips have 64-bit memory interfaces, which will obviously curtail performance a bit, but NVIDIA is targeting Ultrabooks and other thin form factors here, so performance and thermals need to be kept in balance; more on this in a moment.
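To quantify what the narrower bus gives up, the same bandwidth arithmetic as before applies (again, my own sketch, assuming the DDR3 speeds in the table are effective data rates):

```python
# Same arithmetic as the earlier sketch: halving the bus width halves
# bandwidth. The DDR3 rates below come from the spec table above.

def bandwidth_gbps(bus_bits: int, effective_gtps: float) -> float:
    """Bandwidth in GB/s = (bus width in bytes) * (transfers per second)."""
    return bus_bits / 8 * effective_gtps

print(bandwidth_gbps(64, 2.0))   # GT 735M/730M/720M: 16.0 GB/s
print(bandwidth_gbps(64, 1.8))   # GeForce 710M: 14.4 GB/s
print(bandwidth_gbps(128, 2.0))  # same DDR3 on a 128-bit bus: 32.0 GB/s
```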

The GT 735M and 730M at least are “new” parts that we haven’t seen previously in the Kepler family. The word is that some OEMs were after even more economical alternatives than the GT 640M LE, and the option to go with a 64-bit interface opens up some new markets. It’s basically penny-pinching on the part of the OEMs, but we’ve complained about BoM cost-saving measures plenty, so we won’t get into it here. NVIDIA did mention that they’ve spent some additional time tuning the drivers for performance over a 64-bit bus on these chips, and their primary competition in the iGPU market is going to be HD 4000 running on a ULV chip and, in the near future, HD 4600 with Haswell. They’ll also compete with AMD APUs and dGPUs, obviously, but NVIDIA is more interested in showing laptop vendors and users what they gain by adding an NVIDIA dGPU to an Intel platform.

Comments
  • JarredWalton - Monday, April 1, 2013 - link

    *Crickets* (again)

    And this is why I don't buy into such claims; I had some issues when Optimus first launched, but I haven't had any I can specifically pinpoint in the last year or more. If anyone has something under Windows that doesn't play right with Optimus, please post it here so I can verify the issue. Otherwise, with no concrete evidence, it's just FUD.
  • Wolfpup - Tuesday, April 2, 2013 - link

    VLC, for starters. I seriously have no idea how you can be unaware of issues with it; any big notebook forum will have threads about it all the time, with people trying to disable it.
  • mrhumble1 - Monday, April 1, 2013 - link

    Speaking of dishonest, you are not sticking to the facts. You obviously don't own a nice laptop with a 680M inside. I do, and it performs amazingly well. I output games from my laptop to my TV via HDMI and they look spectacular.

    You also obviously don't understand that desktop PCs (and their components) cannot be directly compared to laptops. I also highly doubt that a "dirt cheap" PC can run The Witcher 2 at almost Ultra settings at a playable framerate.
  • tviceman - Monday, April 1, 2013 - link

    You're missing out then if you like to game. I've got a 560M that still performs admirably, running many of today's games with max settings (no AA) at 1600x900 and 60fps. I'm spoiled by fast frame rates and decent graphics settings; I can't imagine playing games like BioShock Infinite on even the upcoming Haswell.
  • xTRICKYxx - Monday, April 1, 2013 - link

    I will never ever buy a laptop without a discrete card. Video cards at the 7770M/650M level or above can play any game at 1920x1080 on high if there is a good enough CPU as well. Mobile graphics are starting to become powerful enough.

    Look at Intel CPUs. My i7-2600K at home is slightly slower than my i7-3720QM clock for clock.
  • jsilverstreak - Monday, April 1, 2013 - link

    680M SLI is about 10% faster than a 680,
    and the 780M's leaked benchmark was on par with a 660 Ti.
  • jsilverstreak - Monday, April 1, 2013 - link

    Oops, it's actually the 680 that the 780M is on par with,
    although it's leaked.
  • nerd1 - Saturday, April 13, 2013 - link

    You obviously don't game, do you?
  • Mr. Bub - Wednesday, April 17, 2013 - link

    Sure, maybe most mainstream users who facebook and email all day on laptops don't need anything more than integrated graphics, but guys like me (engineering students) who actually do stuff on computers will still rely on discrete GPUs. I can't properly run any of my CAD software without a discrete GPU.
  • nerdstaz - Monday, April 22, 2013 - link

    http://www.notebookcheck.net/NVIDIA-GeForce-GTX-68...

    Your comparison is a bit off base <3
