CPU Performance

LG's G2 features a quad-core Snapdragon 800 SoC (MSM8974). For a quick refresher, Snapdragon 800 features four Krait 400 cores running at up to 2.3GHz, courtesy of TSMC's 28nm HPM process. The 2.3GHz max clock speed comes at a surprisingly low voltage thanks to the low-power HPM process. Gone are the days of 1.4V to hit near-2GHz frequencies, it seems; instead, the 8974 will hit 2.3GHz at around 1V. Krait 400 improves L2 access latencies over Krait 300 (the core at the heart of Snapdragon 600 and S4 Pro) and is optimized for higher frequency operation, but is otherwise architecturally similar. Make no mistake, MSM8974 is the new high end, pushing Snapdragon 600 and S4 Pro parts further down into the midrange. There are other SoC-level enhancements as well, including a new version of the Hexagon DSP and obviously Adreno 330 vs. Adreno 320 (which I'll get to later). We already ran through a performance preview of Snapdragon 800/MSM8974 using Qualcomm's 8974 Tablet Mobile Development Platform (MDP/T), but today we get to do the same with the G2.
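
To put those voltage numbers in perspective, here's a quick back-of-the-envelope estimate. This is my own simplification, not a Qualcomm figure: it assumes comparable switched capacitance across process generations and ignores leakage. To first order, dynamic power scales as:

    P_{dyn} \propto C \cdot V^2 \cdot f
    \frac{P_{2.3\,\mathrm{GHz},\,1.0\,\mathrm{V}}}{P_{2.0\,\mathrm{GHz},\,1.4\,\mathrm{V}}} \approx \frac{1.0^2 \times 2.3}{1.4^2 \times 2.0} \approx 0.59

Under those assumptions, a core running 2.3GHz at around 1V would dissipate roughly 40% less switching power than one needing 1.4V to reach ~2GHz, despite the 15% higher clock, which is exactly the sort of win the 28nm HPM process is meant to deliver.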


LG was pretty eager to get us a G2 sample as early as possible; unfortunately, that comes at the expense of software maturity. LG made it very clear to us that the international G2 sample (LG-D802) we received has nowhere near final software, and as a result may not deliver performance indicative of what we'll see when the device ships later this month. This puts us in an interesting situation, as we want to see how close shipping Snapdragon 800 devices come to the Snapdragon MDP/T we tested back in June. Software maturity aside, there's no skirting the fact that the G2 simply has a smaller chassis, and likely lower thermal limits, than the tablet-sized MSM8974 MDP/T we tested previously.

 

The most interesting comparison points here will be LG's Optimus G Pro, which ships with a Snapdragon 600 (4 x Krait 300 running at 1.7GHz), the Exynos 5 Octa based Galaxy S 4 (SHV-E300S), and the MDP/T itself. As always, we'll start with a look at CPU performance.

The state of CPU performance testing under Android is unfortunately still quite broken. We're using a mix of browser-based tests along with Java and native code benchmarks (AndEBench).

SunSpider Javascript Benchmark 1.0 - Stock Browser

SunSpider has quickly become an exercise in browser optimization rather than platform performance. Qualcomm's browser optimizations are clearly good for showing off Snapdragon 800's potential; however, the G2 doesn't appear to have the same optimizations in place (yet). Performance isn't bad, but it's merely on par with Snapdragon 600 and ARM's Cortex A15.

Mozilla Kraken Benchmark - 1.1

Kraken is an interesting test as it has (thus far) remained less of a browser optimization target. Kraken is also a larger, longer-running benchmark, which provides results I tend to be a little happier with. The G2 once again falls short of Qualcomm's MDP/T, but given its early software I'm not too surprised. Performance is roughly on par with the Exynos 5 Octa, and slightly behind the very highly clocked Krait 300 cores in the nearly stock Moto X.

Google Octane Benchmark v1

Octane is the first benchmark where we see the Snapdragon 800 flex its potential. Here the G2 not only ties the Snapdragon 800 MDP/T, but it also roughly equals the performance of the Cortex A15 based Exynos 5 Octa. Ultimately that's the comparison that Qualcomm will be most interested in winning. If Snapdragon 800 can deliver better performance (or at least perf per watt) than the Cortex A15, it'll be a definite win for Qualcomm.

Browsermark 2.0

If Octane had the S800 in the proverbial passing lane, Browsermark 2.0 shows the G2 in the clear lead. Here LG even manages to outperform Qualcomm's own reference design by 16%. I suspect this has more to do with browser optimizations than anything else, though, as the S600-based Optimus G Pro also does extremely well.

AndEBench - Java

AndEBench - Native

AndEBench provides us with a very low-level look at SoC performance. I'm not a huge fan of these types of tests, especially ones that aggregate a bunch of microbenchmarks and attempt to present a single performance number. AndEBench is unique (and useful) in that it presents performance for both native code and Dalvik-interpreted execution. The G2's native performance here is quite good, but it's actually equalled by the Galaxy S 4 GPe and not far ahead of the Optimus G Pro. I suspect we're once again seeing the limits of early software rather than the full extent of Snapdragon 800's performance in a retail device. Dalvik performance is a bit worse; the relatively high ranking of the Google Play Edition devices points to software optimization being the culprit here.
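
To make the native vs. Dalvik distinction concrete, here's a minimal sketch of the idea, not AndEBench's actual code: the class name, the "nativebench" library, and the toy workload are all my own illustrative assumptions. The same loop is run once as ordinary Java (compiled to DEX and executed by the Dalvik VM) and once as a C implementation reached through JNI.

    // Hypothetical illustration only; AndEBench's real workloads are more involved.
    public class DalvikVsNativeSketch {
        static {
            // Assumed companion library built from C via the NDK; it would need to
            // export Java_DalvikVsNativeSketch_sumNative for JNI to resolve it.
            System.loadLibrary("nativebench");
        }

        // Managed path: compiled to DEX bytecode and executed by the Dalvik VM.
        static long sumManaged(int iterations) {
            long acc = 0;
            for (int i = 0; i < iterations; i++) {
                acc += (long) i * 31;
            }
            return acc;
        }

        // Native path: the same loop implemented in C and called through JNI.
        static native long sumNative(int iterations);

        public static void main(String[] args) {
            final int n = 10_000_000;
            long t0 = System.nanoTime();
            long managed = sumManaged(n);
            long t1 = System.nanoTime();
            long nativeResult = sumNative(n);
            long t2 = System.nanoTime();
            System.out.println("managed: " + managed + " in " + (t1 - t0) / 1e6 + " ms");
            System.out.println("native:  " + nativeResult + " in " + (t2 - t1) / 1e6 + " ms");
        }
    }

The gap between those two timings is essentially what separates AndEBench's Java and native scores, and it helps explain why an early or unoptimized runtime can drag the Dalvik number down even when the silicon itself is fast.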

Vellamo Benchmark - 2.0

Vellamo Benchmark - 2.0

Both Vellamo tests put the G2 on par with Qualcomm's Snapdragon 800 MDP/T.

 
Comments

  • Krysto - Sunday, September 8, 2013 - link

    Cortex A9 was great efficiency-wise, with better perf/Watt than what Qualcomm had available at the time (the S3's Scorpion), but Nvidia still blew it with Tegra 3. So no, that's not the only reason. Nvidia could do certain things to increase efficiency and performance/Watt, like moving to a smaller node, or keeping GPU clock speeds low but adding more GPU cores, and so on. But they aren't doing any of that.
  • UpSpin - Sunday, September 8, 2013 - link

    You mean they could and should have released more iterations of Tegra 3, adding more and more GPU cores to improve at least the graphics performance, rather than waiting for A15 and Tegra 4.

    I've never designed an SoC myself :-D so I don't know how hard it is, but I've done lots of PCBs, which is practically the same except on a much larger scale :-D If you add some parts you have to increase the die size, thus move other parts on the die around, reroute everything, etc. So it's still a lot of work. The main bottleneck of Tegra 3 is memory bandwidth, so adding more GPU cores without addressing the memory bandwidth most probably would not have made any sense.

    They probably expected to ship Tegra 4 SoCs sooner, so they saw no need to release a thoroughly improved Tegra 3 and focused on Tegra 4 instead.

    And if you compare Tegra 4 to Tegra 3, they did exactly what you wanted: moving to a smaller node, increasing the number of GPU cores, moving to A15 while keeping the power-efficient companion core, increasing bandwidth, ...
  • ESC2000 - Sunday, September 8, 2013 - link

    I wonder whether it is more expensive to license ARM's A9, A15, etc. (I thought they were doing an A12 as well?) or to develop your own core like Qualcomm does. Obviously QCOM isn't starting from scratch every time, but R&D adds up fast.

    This isn't a perfect analogy at all but it makes me think of the difference between being a pharmaceutical company that develops your own products and one that makes generic versions of products someone else has already developed once the patent expires. Of course now in the US many companies that technically make their own products from scratch really just take a compound already invented and tweak it a little bit (isolate the one useful isomer, make the chiral version, etc), knowing that it is likely their modified version will be safe and effective just as the existing drug hopefully is. They still get their patent, which they can extend through various manipulations like testing in new populations right before the patent expires, but the R&D costs are much lower. Consumers therefore get many similar versions of drugs that rely on one mechanism of action (see all the SSRIs) and few other choices if that mechanism does not work for them. Not sure how I got off into that but it is something I care about and now maybe some Anandtech readers will know haha.
  • krumme - Sunday, September 8, 2013 - link

    Great story mate :), I like it.
  • balraj - Saturday, September 7, 2013 - link

    My first comment on Anandtech.
    The review was cool... I'm impressed by the G2's battery life and camera...
    Wish Anandtech could have a UI section.
    Also, can you folks confirm if LG will support the G2 with at least 2 years of software updates?
    That's gonna be the deciding factor in choosing between the G2 or Nexus 5 for most of us!
  • Impulses - Saturday, September 7, 2013 - link

    Absolutely nobody can guarantee that. Even if an LG exec came out and said so, there's no guarantee they wouldn't change their mind or that a carrier wouldn't delay/block an update... If updates are that important to you, then get a Nexus, end of story.
  • adityasingh - Saturday, September 7, 2013 - link

    @Brian could you verify whether the LG G2 uses Snapdragon 800 MSM8974 or MSM8974AB?

    The "AB" version clocks the CPU at 2.3GHz, while the standard version tops out at 2.2GHz. However, you noted in your review that the GPU is clocked at 450MHz. If I recall correctly, the "AB" version runs the GPU at 550MHz, while the standard is 450MHz.

    So in this case the CPU points to one bin, but the GPU points to another. Can you please confirm?
    Nice "Mini Review" otherwise. I'm looking forward to the full review soon. Please include a throttling analysis like the one from the Moto X; it would be nice to see how long the clocks stay at 2.3GHz :)
  • Krysto - Sunday, September 8, 2013 - link

    He did mention it's the former, not the latter.
  • neoraiden - Saturday, September 7, 2013 - link

    Brian, could you comment on how the Lumia 1020 compares to a cheap ($150-200) camera? I was impressed by the difference in colour in the video comparison, even if the OIS wasn't the best.

    I currently have a Note 2, but the camera quality in low-light conditions is just too bad, and the inability to move apps to my memory card has been annoying. I have an upgrade coming up in January, I think, but I might try to change phones before then. I was wondering whether you could comment on whether the Lumia 1020 is worth the jump from Android due to picture quality, or will an HTC One or Nexus 5 (if similar to the G2) suffice? I was considering the Note 3 as I like everything else about it, but it still doesn't have OIS. Or would the Note 3 plus a cheap compact be better, even given the inconvenience of having to carry a camera?

    The main day-to-day use of my phone is news apps, Internet, and email, some of it threaded (which I hear is a problem for Windows Phone).
  • abrahavt - Sunday, September 8, 2013 - link

    I would wait to see what camera the Nexus 5 has. An alternative is to get the Sony QX100; you would get great pictures irrespective of the phone.
