Modifying a Krait Platform: More Complicated

Modifying the Dell XPS 10 is a little more difficult than modifying Acer's W510 or Microsoft's Surface RT. In both of those products there was only a single inductor in the path from the battery to the CPU block of the SoC. The XPS 10, however, uses a dual-core Qualcomm solution. Ever since Qualcomm started doing multi-core designs, it has opted to use independent frequency and voltage planes for each core. While all of the A9s in Tegra 3 and both of the Atom cores in the Z2760 run at the same frequency/voltage, each Krait core in the APQ8060A can run at its own voltage and frequency. As a result, two separate power delivery circuits are needed to feed the CPU cores. I've highlighted the two inductors Intel lifted in orange:

Each inductor was lifted and wired with a 20 mΩ resistor in series. The voltage drop across the 20 mΩ resistor was measured and used to calculate CPU core power consumption in real time. Unless otherwise stated, the graphs here represent the total power drawn by both CPU cores.
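The arithmetic behind the shunt-resistor method is simple Ohm's law. A minimal sketch, using purely illustrative voltages (the rail voltage and drop values below are hypothetical, not measurements from this article):

```python
# Hypothetical sketch of the series shunt-resistor method described above.
# All input values are illustrative, not measured data from the article.

SHUNT_OHMS = 0.020  # the 20 mOhm sense resistor wired in series with each inductor

def rail_power(v_drop_volts, v_rail_volts, r_shunt_ohms=SHUNT_OHMS):
    """Current through the shunt (Ohm's law: I = V/R) multiplied by the
    rail voltage gives the power delivered to the core on that rail."""
    current = v_drop_volts / r_shunt_ohms
    return v_rail_volts * current

# Example: a 5 mV drop across the shunt on a hypothetical 1.05 V rail
core0 = rail_power(0.005, 1.05)  # 0.25 A -> ~0.26 W
core1 = rail_power(0.002, 1.05)  # 0.10 A -> ~0.11 W
total = core0 + core1            # the graphs here sum both CPU cores
```

Sampling that drop continuously is what makes the real-time power curves in the graphs possible.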

Unfortunately, that's not all that's necessary to accurately measure Qualcomm CPU power. If you remember back to our original Krait architecture article, you'll know that Qualcomm puts its L2 cache on a separate voltage and frequency plane. While the CPU cores in this case can run at up to 1.5GHz, the L2 cache tops out at 1.3GHz. I remembered this little fact late in the testing process, and we haven't yet found the power delivery circuit responsible for Krait's L2 cache. As a result, the CPU-specific numbers for Qualcomm exclude any power consumed by the L2 cache. The total platform power numbers do include it, however, as they are measured at the battery.

The larger inductor in yellow feeds the GPU and it's instrumented using another 20 mΩ resistor.

Visualizing Krait's Multiple Power/Frequency Domains

Qualcomm remains adamant about its asynchronous clocking with multiple voltage planes. The graph below shows power draw broken down by each core while running SunSpider:

SunSpider is a great benchmark to showcase exactly why Qualcomm has each core running on its own power/frequency plane. For a mixed workload like this, the second core isn't totally idle/power gated but it isn't exactly super active either. If both cores were tied to the same voltage/frequency, the second core would have higher leakage current than in this case. The counter argument would be that if you ran the second core at its max frequency as well it would be able to complete its task quicker and go to sleep, drawing little to no power. The second approach would require a very fast microcontroller to switch between v/f modes and it's unclear which of the two would offer better power savings. It's just nice to be able to visualize exactly why Qualcomm does what it does here.
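The tradeoff between the two strategies comes down to a back-of-the-envelope energy comparison: run the lightly loaded core slowly at low voltage for the whole interval, or race to idle at the maximum voltage/frequency point and then sleep. A sketch with entirely hypothetical power and throughput numbers (none of these figures come from our measurements):

```python
# Back-of-the-envelope comparison of the two strategies discussed above.
# All power/speed figures are hypothetical placeholders, not measured data.

def energy_joules(power_watts, seconds):
    return power_watts * seconds

work = 1.0  # normalized units of work for the lightly loaded second core

# Strategy A: keep the second core at a low voltage/frequency point throughout
low_power, low_speed = 0.15, 0.5           # watts, work/sec (hypothetical)
interval = work / low_speed                # time to finish at the low v/f point
e_low = energy_joules(low_power, interval)

# Strategy B ("race to idle"): run at max v/f, finish early, then sleep
high_power, high_speed = 0.60, 1.5         # higher voltage costs superlinear power
idle_power = 0.01                          # near power-gated while asleep
t_active = work / high_speed
t_idle = interval - t_active               # remainder of the interval spent asleep
e_high = energy_joules(high_power, t_active) + energy_joules(idle_power, t_idle)
```

With these made-up numbers the low v/f strategy wins, but flip the power/performance ratios and race-to-idle can come out ahead, which is exactly why the answer depends on the specific architecture and its mode-switching latency.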

On the other end of the spectrum however is a benchmark like Kraken, where both cores are fairly active and the workload is balanced across both cores:

 

Here there's no real benefit to having two independent voltage/frequency planes, both cores would be served fine by running at the same voltage and frequency. Qualcomm would argue that the Kraken case is rare (single threaded performance still dominates most user experience), and the power savings in situations like SunSpider are what make asynchronous clocking worth it. This is a much bigger philosophical debate that would require far more than a couple of graphs to support and it's not one that I want to get into here. I suspect that given its current power management architecture, Qualcomm likely picked the best solution possible for delivering the best possible power consumption. It's more effort to manage multiple power/frequency domains, effort that I doubt Qualcomm would put in without seeing some benefit over the alternative. That being said, what works best for a Qualcomm SoC isn't necessarily what's best for a different architecture.


140 Comments


  • powerarmour - Friday, January 04, 2013 - link

    "Intel doesn't want to create a chip that cuts into it's very profitable mainstream CPU market."

    Indeed, they've left Cedar Trail to fester and die by totally withdrawing driver support:

    http://communities.intel.com/message/175069#175069

    Quite a lot of desktop Atom hardware is still on the market, and they are trying their best to kill it off.
  • djgandy - Friday, January 04, 2013 - link

    All that says to me is that they don't care about Win7, i.e. non-tablets.
  • Krysto - Friday, January 04, 2013 - link

    Cortex A15 coupled with Cortex A7 will use half the power on average. Also, I told you before that the Mali T604 is more efficient than the PowerVR in the latest iPads, and that's how Apple managed to use a more powerful GPU even though it's less efficient. They sacrificed energy efficiency for performance, because they can use a very large battery in the iPad.

    I saw you're trying hard to "prove something" about Intel lately, and I'm not sure why. Is Intel your biggest "client" when they pay you for reviews here? Is that why you're trying so hard to make them look good?

    You're also always making unebelivable claims about what Intel chips will do in the future. Even if they get Haswell to 8W (is that for CPU only? The whole SoC? Is it peak TDP? Will it still need fans?), you do realize a Haswell chip costs as much as the whole BOM of an iPhone 5 right? Haswell chips will never arrive in smartphones, or in tablets that are competitive on price.
  • Tetracycloide - Friday, January 04, 2013 - link

    You're always making "unebelivable" claims about what corruption does here. Do you have anything to back up your allegations to a normal person who would view any excitement about future possibilities as some kind of damning evidence that the writer must be on the take? It's like you think everyone that doesn't share your opinion of Intel is paid to have that opinion or something.
  • trivik12 - Friday, January 04, 2013 - link

    Haswell ULV is an SoC, so the platform TDP would be < 8W. Like it or not, Intel has the best process technology, and ultimately they will produce a platform that is faster with a lower TDP.

    That being said, ARM will dominate the smartphone market and even the majority of low-end laptops. I see Intel existing only in mid-to-higher-end smartphones plus tablets > $500.

    I am personally waiting for a Broadwell-based tablet, which should hopefully cut power even more on the 14nm process.
  • djgandy - Friday, January 04, 2013 - link

    You'd hope two brand new technologies would be better than two 3-4 year old ones, wouldn't you? Clearly you are blinded by your love for ARM in the same way many here are blinded by love for Nvidia and actually consider Tegra 3 a competitive SoC.

    I don't think many people would be astonished to find that the T604, an architecture only released a few months back, is more efficient than PowerVR Series 5, dating back to 2008.

    Why are people so shocked to find that Intel can make a low power chip? It's not some kind of magic; it's a business goal. Power is a trade-off, just like performance. On desktop systems, spending more power is seen as worthwhile for a 40-50% performance gain.
  • mrdude - Friday, January 04, 2013 - link

    He's spot on about the pricing issue, though. Intel isn't going to start selling Haswell SoCs for $30, and if they do then they'll quickly go out of business. It's a completely different business model that they're trying to compete with. The Tegra 3 costs $15-$25 (and way closer to that $15 to date) while Intel charges $70+ for their CPU+GPU, and that's before you get to the chipset, WiFi and the rest. A low-TDP Haswell chip might offer great performance and fit in the same form factor (tablets), but if the tablet ends up costing $800+ and isn't Apple, well... nobody cares.

    It's not just a matter of performance but performance-per-dollar and design wins. Intel can't afford to drop prices to competitive levels on their Core products unless they can supplement it with very high volume. For very high volume you need to sell a lot of competitive SoCs that can do it all at a very reasonable price. The Tegra 3 was a big success not because it was an amazing performer, but because it offered decent performance for a very low price tag. Can Intel afford to do that with their cash cow business slipping? Remember that x86 is seeing drops in sales and PCs aren't exactly doing very well right now. Intel has already had to drop their margins, and they've let fabs run idle and sent home engineers at their 14nm fab in Ireland, all while processor prices haven't decreased even a tiny bit. Those aren't signs of a company that's willing to compete on price.
  • Homeles - Friday, January 04, 2013 - link

    I'm more than willing to pay for the performance premium.
  • mrdude - Friday, January 04, 2013 - link

    While you may be willing to fork over that much cash, most people won't. If you don't believe me, check out the recent sales figures of Win8 devices. The Win8 tablets (excluding Surface RT) don't even make up 1% of all Win8 products sold. That's not poor, that's absolutely horrible. On the other side the cheap Android tablets and smartphones have been gaining significant market share and outselling even the iPhone and iPads. Price matters. A lot. Furthermore, device makers/OEMs are more likely to go with the cheaper SoC if the experience is roughly equal. Remember that a majority of tablet and smartphone buyers don't browse Anandtech for benchmarks but buy based on things like display quality or whether it's got a nice look (or brand name, in the case of Apple). If an OEM can take that $60 saved and put it towards a better display, a larger battery or more NAND then that means a lot more in differentiating yourself from the competition than being 10-15% faster in X benchmark.

    People forget that these are SoCs and not CPUs. They also forget that these aren't DIY computers but tablets. Think about how much people complain when they see a $900 Ultrabook with a crappy 1366x768 TN display but those same people don't utter a word about how Intel's ULVs cost the same as their 35W parts. If the Intel chip was cheaper you'd probably have a better display or a cheaper price tag. This same notion extends to tablets and smartphones.

    Qualcomm is in a place where they can offer something everybody wants; their LTE is second to none. What does Intel have to offer to warrant Intel prices? Currently Intel's chipsets cost as much as an entire Tegra 3 SoC. x86 PC/server and ARM SoCs are in a completely different universe when it comes to pricing, and unless you've got something special (see Qualcomm or Apple) or you're making and selling the device (Samsung), then you're going to have a very rough time of it.
  • jeffkro - Saturday, January 05, 2013 - link

    I paid $15 to "upgrade" my laptop and have since gone back to win 7. A lot of people simply don't want win 8 at any cost.
