Late last month, Intel dropped by my office with a power engineer for a rare demonstration of its competitive position versus NVIDIA's Tegra 3 when it came to power consumption. Like most companies in the mobile space, Intel doesn't just rely on device-level power testing to determine battery life. To ensure that the CPU, GPU, memory controller and even NAND are all as power efficient as possible, these companies measure power consumption directly on a tablet or smartphone motherboard.

The process would be a piece of cake if you had measurement points already prepared on the board, but in most cases Intel (and its competitors) has to take apart a retail device and hunt for a way to measure CPU or GPU power. I described how it's done in the original article:

Measuring power at the battery gives you an idea of total platform power consumption including display, SoC, memory, network stack and everything else on the motherboard. This approach is useful for understanding how long a device will last on a single charge, but if you're a component vendor you typically care a little more about the specific power consumption of your competitors' components.

What follows is a good mixture of art and science. Intel's power engineers will take apart a competing device and probe whatever looks to be a power delivery or filtering circuit while running various workloads on the device itself. By correlating the type of workload to spikes in voltage in these circuits, you can figure out what components on a smartphone or tablet motherboard are likely responsible for delivering power to individual blocks of an SoC. Despite the high level of integration in modern mobile SoCs, the major players on the chip (e.g. CPU and GPU) tend to operate on their own independent voltage planes.
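To give a feel for that correlation step, here's a minimal Python sketch of the idea: log a voltage trace from each candidate filter circuit while toggling a known workload (say, a GPU loop) on and off, then see which rail swings the most. The rail names, sample values and workload window below are all invented for illustration; the real process involves far noisier data and a lot of trial and error.

```python
# Sketch: mapping probed filter circuits to SoC blocks by correlating
# activity with a known workload. All data below is invented.

def mean(xs):
    return sum(xs) / len(xs)

def workload_delta(trace, window):
    """Average level while the workload runs minus the average while idle.

    trace  -- list of (timestamp_s, volts) samples from one filter circuit
    window -- (start_s, end_s) interval when the workload was running
    """
    start, end = window
    on  = [v for t, v in trace if start <= t <= end]
    off = [v for t, v in trace if t < start or t > end]
    return mean(on) - mean(off)

# The rail with the largest swing during a GPU-only workload is most
# likely the GPU's voltage plane.
rails = {
    "rail_A": [(0.0, 0.02), (1.0, 0.02), (2.0, 0.09), (3.0, 0.08), (4.0, 0.02)],
    "rail_B": [(0.0, 0.03), (1.0, 0.03), (2.0, 0.03), (3.0, 0.04), (4.0, 0.03)],
}
gpu_window = (1.5, 3.5)
print(max(rails, key=lambda r: workload_delta(rails[r], gpu_window)))  # rail_A
```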


A basic LC filter

What usually happens is you'll find a standard LC filter (inductor + capacitor) supplying power to a block on the SoC. Once the right LC filter has been identified, all you need to do is lift the inductor, insert a very small resistor (2 - 20 mΩ) in series and measure the voltage drop across it. With the voltage drop and resistance known, Ohm's law gives you the current, and multiplying that by the rail voltage gives you power. Using good external instruments (NI USB-6289) you can plot power over time and get a good idea of the power consumption of individual IP blocks within an SoC.


Basic LC filter modified with an inline resistor
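To put numbers on that: with a 10 mΩ shunt and a 5 mV drop across it, Ohm's law gives I = 0.005 V / 0.010 Ω = 0.5 A, and on a 1.0 V rail that works out to 0.5 W. The Python sketch below applies the same arithmetic to a logged trace; the shunt value, rail voltage and samples are assumptions for illustration, not Intel's actual figures.

```python
# Sketch: per-rail power from an inline shunt resistor.
# The constants and samples below are invented for illustration.

R_SHUNT = 0.010  # 10 mOhm shunt, within the 2 - 20 mOhm range above
V_RAIL = 1.0     # assumed (and simplified as constant) rail voltage

def rail_power(v_drop, r_shunt=R_SHUNT, v_rail=V_RAIL):
    """I = V_drop / R (Ohm's law), then P = V_rail * I."""
    current = v_drop / r_shunt
    return v_rail * current

# Voltage drops (in volts) sampled by a DAQ as a workload kicks in
drops = [0.0048, 0.0051, 0.0123, 0.0119, 0.0050]
print(["%.2f W" % rail_power(v) for v in drops])  # ~0.5 W idle, ~1.2 W loaded
```

In practice the rail voltage would be sampled alongside the drop rather than assumed constant, since DVFS moves it around with the workload.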

The previous article focused on an admittedly not too interesting comparison: Intel's Atom Z2760 (Clover Trail) versus NVIDIA's Tegra 3. After much pleading, Intel returned with two more tablets: a Dell XPS 10 using Qualcomm's APQ8060A SoC (dual-core 28nm Krait) and a Nexus 10 using Samsung's Exynos 5 Dual (dual-core 32nm Cortex A15). What was a walk in the park for Atom all of a sudden became much more challenging. Both of these SoCs are built on very modern, low power manufacturing processes, and Intel no longer has a performance advantage compared to Exynos 5.

Just like last time, I made sure all displays were calibrated to our usual 200 nits setting and that the software and configurations were as close to equal as possible. Both tablets were purchased at retail by Intel, but I verified their performance against our own samples/data and noticed no meaningful deviation. Since I don't have a Dell XPS 10 of my own, I compared performance to the Samsung ATIV Tab and confirmed that things were at least performing as they should.

We'll start with the Qualcomm based Dell XPS 10...

Comments

  • powerarmour - Friday, January 4, 2013

    So yes, finally confirming what anyone with half a brain knows: competitive ARM SoCs use less power.
  • apinkel - Friday, January 4, 2013

    I'm assuming you are kidding.

    Atom is roughly equivalent to (dual core) Krait in power draw but has better performance.

    The A15 is faster than either Krait or Atom, but its power draw is too much to make it usable in a smartphone (which is, I'm assuming, why Qualcomm had to redesign the A15 architecture for Krait to make it fit into the smartphone power envelope).

    The battle I still want to see is quad-core Krait versus Atom.
  • ImSpartacus - Friday, January 4, 2013

    Let me make sure I have this straight. Did Qualcomm redesign A15 to create Krait?
  • djgandy - Friday, January 4, 2013

    No. Qualcomm creates its own designs from scratch. It has an instruction set licence from ARM, so its CPUs are ARM "clones" only in the sense of being ISA-compatible.
  • apinkel - Friday, January 4, 2013

    Sorry, yeah, I could have worded that better.

    But in any case the comment now has me wondering if I'm off base in my understanding of how Qualcomm does what it does...

    I've been under the impression that Qualcomm took the ARM design and tweaked it for their needs (instead of just licensing the instruction set, or licensing the full chip design top to bottom). Yeah/Nay?
  • fabarati - Friday, January 4, 2013

    Nay.

    They do what AMD does: license the instruction set and create their own CPUs that are compatible with the ARM ISA (in Krait's case, ARMv7). That's also what Apple did with its Swift cores.

    NVIDIA tweaked the Cortex A9 in Tegra 2, but it was still a Cortex A9. Ditto for Samsung's Hummingbird and the Cortex A8.
  • designerfx - Friday, January 4, 2013

    Do I need to remind you that Tegra 3 has its fifth companion core disabled on the RT? Using an actual Android device with Tegra 3 would show better results.
  • madmilk - Friday, January 4, 2013

    The disabled 5th core doesn't matter in loaded situations. During idle, screen power dominates, so it still doesn't really matter. About all you'll get is more standby time, and Atom seems to be doing fine there.
  • designerfx - Friday, January 4, 2013

    The companion core enables a number of significant things - in other words, it matters a great deal, including in high-load situations.

    That has nothing to do with the Atom. You get more than standby time.
  • designerfx - Friday, January 4, 2013

    Also, during idle the screen is off, usually after whatever timeout the settings specify - which is easy to see in the idle measurements. What the heck are you even talking about?
