Ever since its arrival in the ultra-mobile space, NVIDIA hasn't really flexed its GPU muscle. The Tegra GPUs we've seen thus far have been okay at best, and in serious need of improvement at worst. NVIDIA often blamed an immature OEM ecosystem unwilling to pay for the sort of large-die SoCs necessary to bring a high-performance GPU to market. Thankfully, that's all changing. Earlier this year NVIDIA laid out its mobile SoC roadmap through 2015, including the 2014 release of Project Logan - the first NVIDIA ultra-mobile SoC to feature a Kepler GPU. Yesterday, in a private event at SIGGRAPH, NVIDIA demonstrated functional Logan silicon for the very first time.

NVIDIA got Logan silicon back from the fabs around 3 weeks ago, making it almost certain that we're dealing with some form of 28nm silicon here and not early 20nm samples.

NVIDIA isn't talking about CPU cores, but it's safe to assume that Logan will be another 4+1 arrangement of cores - likely still based on ARM's Cortex A15 IP (but perhaps a newer revision of the core). On the GPU front, NVIDIA confirmed our earlier speculation that Logan includes a single Kepler SMX:

One Kepler SMX features 192 CUDA cores. NVIDIA isn't talking about shipping GPU frequencies either, but it did provide this chart to put Logan's GPU capabilities into perspective:

Don't get too excited: we're looking at a comparison of GFLOPS, not game performance, but the peak theoretical ALU-bound performance of mobile Kepler should exceed that of a PlayStation 3 or GeForce 8800 GTX (memory bandwidth is another story, however). If we look closely at NVIDIA's chart and compare mobile Kepler to the iPad 4, we get a better idea of what sort of clock speeds NVIDIA would need to attain this level of performance. Doing some quick Photoshop estimation, it looks like NVIDIA is claiming mobile Kepler has somewhere around 5.2x the FP power of the PowerVR SGX 554MP4 in the iPad 4 (76.8 GFLOPS). That works out to right around 400 GFLOPS. With a 192-core implementation of Kepler capable of 2 FLOPS per core per cycle, you get 384 FLOPS per cycle. To hit 400 GFLOPS you'd need to clock the mobile Kepler GPU at roughly 1GHz. That's certainly doable from an architectural standpoint (although we've never seen it done on any low-power 28nm process), but it's probably a bit too high for something like a smartphone.
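The clock-speed estimate above is easy to sanity check. Here's a minimal sketch of the arithmetic - keep in mind the 5.2x multiplier is our own eyeball estimate from NVIDIA's chart, and the 2-FLOPS-per-core figure assumes each CUDA core retires one fused multiply-add (FMA) per cycle:

```python
# Back-of-the-envelope check of the mobile Kepler GFLOPS numbers.
# Assumption: one FMA (= 2 floating point operations) per CUDA core per cycle.
CUDA_CORES = 192
FLOPS_PER_CORE_PER_CYCLE = 2

def peak_gflops(clock_ghz):
    """Peak theoretical single-precision GFLOPS at a given core clock."""
    return CUDA_CORES * FLOPS_PER_CORE_PER_CYCLE * clock_ghz

IPAD4_GFLOPS = 76.8  # PowerVR SGX 554MP4 in the iPad 4

# Clock needed to hit ~5.2x the iPad 4's FP throughput (~400 GFLOPS):
target = 5.2 * IPAD4_GFLOPS
clock_needed = target / (CUDA_CORES * FLOPS_PER_CORE_PER_CYCLE)
print(f"target: {target:.0f} GFLOPS -> clock needed: {clock_needed:.2f} GHz")

# Even at half that clock, throughput stays in PlayStation 3 territory:
print(f"at {clock_needed / 2:.2f} GHz: {peak_gflops(clock_needed / 2):.0f} GFLOPS")
```

Running the numbers this way gives roughly a 1.04GHz clock for the ~400 GFLOPS figure, and about 200 GFLOPS at half that frequency.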

NVIDIA didn't want to talk frequencies, but it did tell me that we might see something this fast in some sort of a tablet. I suspect that most implementations will be clocked significantly lower. Even at half the frequency, though, we're still talking about roughly PlayStation 3 levels of FP power out of a mobile SoC. We know nothing of Logan's memory subsystem, which obviously plays a major role in real-world gaming performance, but there's no getting around the fact that Logan's Kepler implementation means serious business. For years we've lamented NVIDIA's mobile GPUs; Logan looks like it's finally going to change that.

API Support and Live Demos

Unlike previous Tegra GPUs, Kepler is a fully unified architecture and is OpenGL ES 3.0, OpenGL 4.4 and DirectX 11 compliant. The API compliance alone is a huge step forward for NVIDIA. It's also a big one for game developers looking to move more seriously into mobile. Epic's Tim Sweeney even did a blog post for NVIDIA talking about Logan's implementation of Kepler and how it brings feature parity between PCs, next-gen consoles and mobile platforms. NVIDIA responded in kind by running some Unreal Engine 4 demos on Android on a Logan test platform. That's really the big story behind all of this. With Logan, NVIDIA will bring its mobile GPUs up to feature parity with what it's shipping in the PC market. Game developers looking to port games between console, PC, tablet and smartphone should have an easier job of it when all platforms support the same APIs. Logan will take NVIDIA from being very behind in API support (with no OpenGL ES 3.0 support) to the head of the class.

NVIDIA took its Ira demo, originally run on a Titan at GTC 2013, and got it up and running on a Logan development board. Ira did need some work to make the transition to mobile: the skin shaders were simplified, smaller textures were used, and the rendering resolution was dropped to 1080p. NVIDIA claims this demo was done in a 2 - 3W power envelope.

The next demo, called Island, was originally shown on a Fermi desktop part. Running on Logan/mobile Kepler, it shows OpenGL 4.3 and hardware tessellation working.

The development board does feature a large heatspreader, but that's not too unusual for early silicon just out of bring-up. Logan's package size should be comparable to Tegra 4, although the die size will clearly be larger. The dev board is running Android and is connected to a 10.1-inch 1920 x 1200 touchscreen.

Power Consumption & Final Words

141 Comments

  • Yojimbo - Saturday, July 27, 2013 - link

    I have to disagree. I am not sure about your analysis that anything that comes out later is "by definition behind." Rogue is not promising 400 GFLOPS, is it? And it's expected to be out in Q4 2013 isn't it? What we know for sure is that Logan leapfrogs Rogue in terms of API compliance. Suppose Logan devices come out one year later than Rogue devices, and Logan performs upwards of 50% better than Rogue. Would that not be considered being "caught up?" And I never even said that with Logan, Nvidia would be "caught up." I said that Nvidia has a history of catching up. So my claim is that the advantage PowerVR holds over Nvidia immediately after Rogue and Logan, taking into account adjustments made for when the product is first made available to the market, will be less than the advantage that PowerVR has held over Nvidia for the duration of the previous generation, with similar adjustments made. I further claim that there is a good chance that eventually (0, 1, 2 generations?) that gap will be negative. My argument here has nothing to do with Logan being better than Rogue. It is a refutation of kukurachee's dismissal of Nvidia's claims for Logan's performance that he based on the GPUs that Nvidia had on their previous generations SOCs, and the amount of hype/interest they tried to generate for these SOCs.
  • michael2k - Sunday, July 28, 2013 - link

    Yes, actually, Rogue is promising 400 GFLOPS. It promises OpenGL ES 3*/2/1.1, OpenGL 3.x/4.x, and full WHQL-compliant DirectX9.3/10, with certain family members extending their capabilities to DirectX11.1 functionality.

    For Logan to be upwards of 50% better than Rogue it would have to be a 600 GFLOP chip since PowerVR 6 is expected to hit 400 GFLOP. What would not be caught up is if Logan was a 400 GFLOP chip released next year. You see, Rogue is intended to hit 1 TF, possibly in full powered tablet form factors, so for Logan to truly best Rogue it would need to hit 1.5TF in a tablet form factor.

    I don't believe Logan is specced that high.
  • Yojimbo - Sunday, July 28, 2013 - link

    The information I found on Rogue listed 250 GFLOPs. Where did you find 400 GFLOPs?
  • michael2k - Monday, July 29, 2013 - link

    http://www.imgtec.com/powervr/sgx_series6.asp

    PowerVR Series6 GPUs can deliver 20x or more of the performance of current generation GPU cores targeting comparable markets. => iPad currently is about 70GF, so a Series 6 implementation would be 1.4 TF

    PowerVR Series6 GPU cores are designed to offer computing performance exceeding 100GFLOPS (gigaFLOPS) and reaching the TFLOPS (teraFLOPS) range => 100 GF is the bottom of the expected range

    My point holds though that the Series 6 is designed to go over 1TF in performance, which is more than enough to match NVIDIA's Logan for the foreseeable future.
  • TheJian - Monday, August 05, 2013 - link

    You do realize IMG.L exists in phones because they couldn't cut it vs. AMD/NV in discrete gpus right? They have been relegated to cheapo devices by AMD/NV and are broke compared to NV. Apple apparently doesn't give them much profits. They had to borrow money just to buy 100mil mips cpus (borrowed 20mil), meanwhile NV buys companies like icera for 330mil cash, July29th bought Portland Group (compiler teams just got better at NV), with I'm assuming CASH again as they have 3.75B in the bank:
    http://blogs.nvidia.com/blog/2013/07/29/portland/
    I'm guessing this hurts OpenCL some also, while boosting NV in HPC even more. PGI owns 10% of compilers, while Intel owns ~25%. So NV just gained a huge leg up here with this deal.

    Back to socs, none of the competing socs have had to be GOOD at GAMES until now and not even really yet as we're just getting unreal 3 games coming and announced. How do you think this plays out vs. a team with 20yrs of gaming development work on drivers & with the devs? Every game dev knows AMD/NV hardware inside out for games. That can't be said of any SOC maker. All of those devs have created games on hardware that is about to come to socs next year (no new work needed, they already know kepler). We are leaving the era of minecraft & other crap and entering unreal 3/unreal 4 on socs. Good luck to the competition. If AMD can survive long enough to get their soc out they may be a huge player eventually also, but currently I'd bet on NV doing some major damage with T5/T6 etc (if not T4/T4i). T4 is just to hold people off until the real war starts next year as games are just gearing up on android, and T4i will finally get them into phones in greater numbers (adding more devices in the wild based on a very good gpu set).

    That being said, T4 is already roughly equal to all others. Only S800 looks to beat its gpu (I don't count apple, sales dropping and not housing anything but apple hardware anyway). I don't see rogue making any speeches about perf, just features and I don't see them running unreal 4 demos :) I don't hear anything from Mali either. S800 appears to have a good gpu (330) but it remains to be seen if games will fully optimize for it. I see no tegrazone type optimizations so far on Adreno. There is no Adrenozone (whatever, you get the point). All soc makers are about to be seriously upset by gaming becoming #1 on them. Before they needed a good 2D gui etc, not much more to run stupid stuff like minecraft or tetris type junk. We'll see how they all fare in stuff like Hawken etc types. I'd bet on NV/AMD with NV obviously being in the driver seat here, with cash no debt etc helping them and already on T5 by the time AMD ships A1 or whatever they call it.

    SoC vendors better get their gaming chops up to snuff in a hurry or the NV GPU train will run them over in the next 3yrs. NV gpus will be optimized for again and again on desktop and that tech will creep into socs a year or two later over and over. All games made for PCs will start to creep to android on the very same hardware they were made for already a few years before on desktops. Devs will start to make advanced games on HTML5 (HTML6 etc eventually), OpenGL, WebGL, OpenCL, Java etc to ease portability, making DirectX less needed. At that point Intel/MS are both in trouble (we're already heading there). If you aim a game at DirectX you have a much harder time porting to everywhere else. That same game made on OpenGL ports easily (same with HTML5 etc) to all devices.

    In a few short years we'll be looking at a 500w normal PC type box with Denver or whatever in it (Cortex A57 I'd guess) with an NV discrete card for graphics. With another 3yrs of games under the android belts it should make for a pretty good home PC with no WINTEL at all in it. Google just needs to get an office package for home out that does most of what office does by that time and there is no need for windows/office right (something aimed more at home users)? They need to court Adobe now to port it to android before Denver or Boulders launches (and anyone else aiming at desktops) or come up with their own content creation suite I guess. A lot of content comes from Adobe's suite. You need this on android with an office package also to convert many home users and work pc's. I see horrible stock prices for Intel/MS in less than 5yrs if google courts Adobe and packages a nice office product for home users. They can give away the OS/Office until MS/Intel bleed to death. All they need is games, adobe, NV/AMD gpus (choose either or both) in a tower with any soc vendor. They will make money on ads not the hardware or software MS/Intel need to make profits on. Margins will tank for these two (witness MS's drop on RT already, and Intel's profits tanking), and android will come to the front as a decent alternative to WINTEL. It’s kind of funny Google/(insert soc vendor name here) are about to turn MS into netscape. They gave IE away until netscape bled to death. The same is about to happen to them…ROFL. At the very least Wintel won’t be the same in under 5yrs. While Intel now has android, it would be unwise of Google to help Intel push them around (like samsung does to some degree). Much better to make a soc vender be the next Intel but much weaker. Google can push a soc vender around, not a samsung or Intel (not as easily anyway).

    I’ll remind you that PowerVR has already been killed once by NV (and ATI at the time). That's why they are in phones not PC's :) Don't forget that. I don't see this playing out different this time either. Phones/tablets etc are accidentally moving into NV/AMD territory (gaming) now and with huge volumes making it a no-brainer to compete here for these two. It's like the entire market is coming to AMD/NV's doorstep like never before. Like I said, good luck to Qcom/Imagination/Arm/Samsung trying to get good at gaming overnight. There is no Qcom "gaming evolved" or "The way it's meant to be played" games yet. NV/AMD have had 20yrs of doing it, and devs have the same amount of time in experience with their hardware (which with die shrinks will just slide into phones/tablets shortly with less cores, but it’s the same tech!). If Qcom/Samsung (or even Apple) don't buy AMD soon they are stupid. It is the best defensive move they can make vs the NV gaming juggernaut coming and if that happens I'll likely sell my NV stock shortly after...LOL. Apple should do this to keep android on NV from becoming the dominant gaming platform. MS could buy them also, but I don't think Intel can get away with it quite yet (though they should be able to, since ARM is about to become their top cpu competitor). When an ARM soc (or whatever) gets into a 500w PC like box Intel can make that move to buy AMD for gpus. Until then I don't think they can without the FTC saying "umm, what?". I'm also not sure Samsung can legally buy AMD (security reasons for a non American company having that tech?).

    Anyway, it's not just about perf, it's also about optimizations (such as NV buying PGI for even MORE cuda prowess for dev tools). IE - Cuda just got a boost. More prep work for Denver/Boulder is afoot :) Just one more piece of the puzzle so to speak (the software stack). For google, they don't care who wins the soc race, as long as it pushes Android/Chrome as the Wintel replacement in the future for the dominant gaming platform and eventually as a PC replacement also for at least some work (assuming they court Adobe and maybe a few others for content creation on android/chrome or whatever). I’m guessing Denver/Boulder (and their competition) are pipelined to run at 3.5-4ghz in 70-100w envelopes. It would make no sense to compete with Intel’s Haswell/Broadwell etc in 8w. A desktop chip needs to operate at ~70w etc just like AMD/Intel to make the race even. You have a PSU, heatsink/fans and a HUGE box in a PC, it would be stupid not to crank power up to use them all. A Cortex-A57 chip at 4ghz should be very interesting with say, a Volta discrete card in the box :) A free OS/Office suite on it, with say $100 A57 4ghz chip should be easy to sell. A soc chip for notebooks, and a stripped gpu-less version for desktops with discrete cards sounds great. I like the future.

    My 2c. Heck call it a buck....whatever...LOL.
  • ancientarcher - Wednesday, August 07, 2013 - link

    $10.0 even!!
    A very good explanation of your theory. It's not in the mainstream yet, and Intel (and AnandTech) keep on pushing how great its latest and greatest chip is that you can get for only $500. This monopoly is going to crash and burn. MS, as you said, tried to go the ARM way (the system just won't support monopoly profits for two monopolies; maybe one, but not two) but didn't execute well.

    Denver won't necessarily be ported to the PC form factor, but who knows. It will surely change the landscape in the smartphone/tablet segments. I also think IMG, Mali and Adreno can learn to compete in games. Yes, they would not have had 20 years, but they do have a lot of momentum and the sheer number of devices running these platforms will push developers to code for them.
  • Mondozai - Saturday, September 07, 2013 - link

    Who says they are measuring it against the iPad 4 (which is 80 gigaflops btw)?

    Their wording is so vague that they can choose a midrange model and say that it's more representative of the GPU performance of most GPUs (at the time that statement was made), which would be correct.

    You're making some pretty wild guesses.
  • MySchizoBuddy - Saturday, August 10, 2013 - link

    Mainboard chipsets?
    They exited that market, so you only have one example of Nvidia operating like a locomotive.
  • beck2050 - Monday, August 12, 2013 - link

    Agreed. It looks like Logan will finally put Nvidia on the map in the mobile world big time, and after that it should get very interesting.
  • CharonPDX - Wednesday, July 24, 2013 - link

    Even if the power consumption is double what they claim, and the performance half - it's *STILL* massively impressive.

    The big question is: Do Intel and PowerVR have similar tricks up their sleeve?
