Power Consumption

As always I ran the Xbox One through a series of power consumption tests. I’ve described the tests below:

Off - Console is completely off; standby mode is disabled.
Standby - Console is asleep, can be woken up by voice commands (if supported). Background updating is allowed in this mode.
Idle - Ethernet connected, no disc in drive, system idling at dashboard.
Load (BF4) - Ethernet connected, Battlefield 4 disc in drive, running Battlefield 4, stationary in test scene.
Load (BD Playback) - Ethernet connected, Blu-ray disc in drive, average power across Inception test scene.
CPU Load - SunSpider - Ethernet connected, no disc in drive, running SunSpider 1.0.2 in web browser.
CPU Load - Kraken - Ethernet connected, no disc in drive, running Kraken 1.1 in web browser.
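These are wall measurements taken over a window of time rather than single instantaneous readings (the Blu-ray figure, for example, is an average across the test scene). For the curious, here's a minimal sketch of that kind of measurement loop in Python; the meter read function is a hypothetical stand-in, not the actual equipment or code used for this review.

```python
# Minimal sketch: poll a logging power meter at a fixed interval and
# average the samples over the test window. The read function is a
# hypothetical stand-in for whatever your meter actually exposes.
import statistics
import time

def average_power(read_watts, duration_s=120.0, interval_s=1.0):
    """Poll the meter once per interval and return the mean draw in watts."""
    samples = []
    deadline = time.monotonic() + duration_s
    while time.monotonic() < deadline:
        samples.append(read_watts())
        time.sleep(interval_s)
    return statistics.mean(samples)

if __name__ == "__main__":
    # Dummy reader for illustration; a real one would query the meter over
    # serial/USB while the console sits in the state under test (dashboard
    # for Idle, the stationary BF4 scene for Load, etc.).
    fake_reader = lambda: 69.7
    print(f"{average_power(fake_reader, duration_s=5):.1f}W")
```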

Power Consumption Comparison

| Total System Power | Off | Standby | Idle | Load (BF4) | Load (BD Playback) |
|---|---|---|---|---|---|
| Microsoft Xbox 360 Slim | 0.6W | - | 70.4W | 90.4W (RDR) | - |
| Microsoft Xbox One | 0.22W | 15.3W | 69.7W | 119.0W | 79.9W |
| Sony PlayStation 4 | 0.45W | 8.59W | 88.9W | 139.8W | 98.0W |

When I first saw the PS4’s idle numbers I was shocked. 80 watts is what our IVB-E GPU testbed idles at, and that’s with a massive 6-core CPU and a Titan GPU. Similarly, my Haswell + Titan testbed idles below that as well. The Xbox One’s numbers are a little better at just under 70W, but that’s still 50 - 80% higher than I was otherwise expecting.

Standby power is also surprisingly high for the Xbox One. Granted, in this mode you can turn on the entire console by saying “Xbox On,” but always-on voice recognition is something Motorola deployed on the Moto X within a far smaller power budget.

The only good news on the power front is what happens when the console is completely off. I’m happy to report that I measured just 0.22W (Xbox One) and 0.45W (PS4) of draw while off, far less than previous Xbox 360s.

Power under load is pretty much as expected. In general the Xbox One appears to draw ~120W under max load, which isn’t much at all. What actually surprises me is how small the delta between idle power and loaded GPU power is (~50W), which leaves me wondering how much power gating Microsoft is doing of unused CPU cores and/or GPU resources. The same goes for Sony on the PS4. It’s entirely possible that AMD hasn’t offered the same hooks into power management that you’d see on a PC equipped with an APU.
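To make those deltas concrete, the arithmetic from the table above works out as follows:

```python
# Idle-to-load deltas computed from the measurements above (watts).
measurements = {
    "Xbox One":      {"idle": 69.7, "load_bf4": 119.0},
    "PlayStation 4": {"idle": 88.9, "load_bf4": 139.8},
}

for console, m in measurements.items():
    delta = m["load_bf4"] - m["idle"]
    share = m["idle"] / m["load_bf4"]
    print(f"{console}: +{delta:.1f}W from idle to load; "
          f"idle is {share:.0%} of peak draw")

# Xbox One: +49.3W from idle to load; idle is 59% of peak draw
# PlayStation 4: +50.9W from idle to load; idle is 64% of peak draw
```

In other words, both consoles burn well over half of their peak gaming power just sitting at the dashboard, which is what makes the power gating question interesting.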

Blu-ray playback power consumption is more reasonable on the Xbox One than on the PS4. In both cases though the numbers are much higher than I’d like them to be.

I threw in some browser-based CPU benchmarks and power numbers as well. Both the Xbox One and PS4 ship with integrated web browsers. Neither experience is particularly well optimized for performance, but the PS4 definitely has the edge, at least in JavaScript performance.

Power Consumption Comparison (lower is better)

| Console | SunSpider 1.0.2 (Performance) | SunSpider 1.0.2 (Power) | Kraken 1.1 (Performance) | Kraken 1.1 (Power) |
|---|---|---|---|---|
| Microsoft Xbox One | 2360.9 ms | 72.4W | 111892.5 ms | 72.9W |
| Sony PlayStation 4 | 1027.4 ms | 114.7W | 22768.7 ms | 114.5W |

Power consumption while running these CPU workloads is interesting. The marginal increase in system power consumption while running both tests on the Xbox One indicates one of two things: either we’re only taxing 1 - 2 cores here, or Microsoft isn’t power gating unused CPU cores (possibly both). I suspect it’s the former, since IE on the Xbox One technically falls under the Windows kernel’s jurisdiction and I don’t believe it has more than 1 - 2 cores allocated for its needs.

The PS4 on the other hand shows a far bigger increase in power consumption during these workloads. For one, we’re talking about higher levels of performance, but it’s also possible that Sony is allowing apps access to more CPU cores.
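Another way to read these results is in energy rather than power terms: multiply the marginal draw (benchmark power minus idle power) by the runtime. A quick back-of-the-envelope sketch, treating the dashboard idle figures from the first table as the baseline (an approximation, since a browser sitting open isn’t quite the same as the dashboard):

```python
# Energy consumed above idle for a single Kraken 1.1 run, using the
# dashboard idle figures from the first table as an approximate baseline.
runs = {
    "Xbox One":      {"idle_w": 69.7, "bench_w": 72.9,  "runtime_ms": 111892.5},
    "PlayStation 4": {"idle_w": 88.9, "bench_w": 114.5, "runtime_ms": 22768.7},
}

for console, r in runs.items():
    marginal_w = r["bench_w"] - r["idle_w"]           # extra draw during the run
    energy_j = marginal_w * r["runtime_ms"] / 1000.0  # W x s = joules
    print(f"{console}: +{marginal_w:.1f}W over idle, ~{energy_j:.0f}J per run")

# Xbox One: +3.2W over idle, ~358J per run
# PlayStation 4: +25.6W over idle, ~583J per run
```

Interestingly, even though the PS4 finishes the run almost 5x quicker, it still consumes more marginal energy doing so, which is consistent with the idea that Sony is throwing more CPU resources at the browser.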

There’s definitely room for improvement in driving down power consumption on both next-generation platforms. I don’t know that there’s huge motivation to do so outside of me complaining about it though. I would like to see idle power drop below 50W, standby power shouldn’t be anywhere near this high on either platform, and the same goes for power consumption while playing back a Blu-ray movie.

Comments

  • elerick - Wednesday, November 20, 2013

    Thanks for the power consumption measurements. Could the Xbox One standby power be higher due to power on/off voice commands?
  • althaz - Wednesday, November 20, 2013

    I suspect this is almost certainly the case. I wonder if it drops down below 10W if you turn off the Kinect (which I would never do myself)?

    I also hope Sony updates their software - the Xbox stays in the 16-18W range when downloading updates, whereas the PS4 jumps up to 70W when in standby and downloading an update (and still takes 30 seconds to start up!).
  • mutantsushi - Saturday, November 23, 2013

    It seems that the PS4’s extremely high standby/download power draw is due to the ARM co-processor not being up to the task. It was supposed to handle basic I/O tasks and other needed features, but apparently it wasn’t quite spec’d sufficiently, forcing Sony to keep the main CPU powered on to handle them. The rumor is that they will "soon" release a new revision with a more powerful ARM core that is up to the task, which should allow powering down the x86 CPU completely, as per the original plan (either that, or reworking the "standby" software functions so that the existing ARM core can handle them would also do the trick).

    I believe MS is now also rumored to "soon" release a revision of the Xbone, although what that might entail is unknown. An SKU without the Kinect could allow them to drop the price $100 to better compete with PS4.

    Incidentally, the Xbone seems to be running HOTTER than the PS4, so MS’ design certainly can’t be called more efficient at cooling - it’s more that they have more open space that isn’t being used as efficiently as in the PS4’s design. The temp differential is also probably down to MS’ last-minute decision to give the APU a 5% clockspeed bump.

    I'm looking forward to the 'in depth' article covering each. As far as performance is applicable in actual use scenarios, i.e. games, I'm interested to get the real low-down... The vast similarity in most aspects really narrows the number of factors to consider, so the actual differentiating factors really should be able to comprehensively addressed in their implications.

    Like Anand says, I don’t think memory throughput is a gross differentiator per se, or at least we could say that Xbone’s ESRAM can be equivalent under certain plausible scenarios, even if it is less flexible than GDDR5 and thus restricts possible development paths. For cross-platform titles at least, that isn’t really a factor IMHO.

    The ROP difference is probably the major factor for any delta in frame buffer resolution, but PS4's +50% compute unit advantage still remains as a factor in future exploitability... And if one wants to look at future exploitability then addressing GPU and PS4's features applicable to that is certainly necessary. I have seen discussion of GPGPU approaches which essentially can more efficiently achieve 'traditional graphics' tasks than a conventional pure GPU approach, so this is directly applicable to 'pure graphics' itself, as well as the other applications of GPGPU - game logic/controls like collisions, physics, audio raycasting, etc.

    When assessing both platforms' implications for future developments, I just don’t see anything on Xbone’s side that presents much advantage re: unique architectural advantages that isn’t portable to PS4 without serious degradation, while the reverse very much is the case. While cross-platform games of course will not truly leverage architectural advantages which allow for 'game changing' differences, PS4’s CU, ROP, and GPGPU queue advantages should pretty consistently allow for 'turning up the quality dial' on any cross-platform title... And to the extent that those exploitation techniques become widely used, we could in fact see some 'standardized' design approaches which exploit e.g. the GPGPU techniques in ways easily 'bolted on' to a generic cross-platform design... Again that’s not going to change the ultimate game experience, but it is another vector to increase the qualitative experience. Certainly even in the first release games there are differences in occlusion techniques, and this is almost certainly without significant exploitation of GPGPU.
  • mutantsushi - Saturday, November 23, 2013

    If Xbone’s resolution is considered satisfactory, I do wonder what the PS4 could achieve at the same resolution, utilizing the extra CU and GPGPU capacity for an actually unique difference rather than just a similar experience at higher resolution (i.e. 1080p vs. 900p). If 900p upscaled is considered fine, what can be done if that extra horsepower is allocated elsewhere instead of increasing the pixel count?
  • errorr - Wednesday, November 20, 2013

    That is still ridiculous considering the Moto X and some other future phones can do the same thing at orders of magnitude less power draw.
  • Teknobug - Wednesday, November 20, 2013

    I love my Moto X, X8 8-core processor means each core has its own job, 1 core for always listening and 1 core for active notifications. Very easy on the battery, which is why it is one of the best battery life phones right now.
  • uditrana - Thursday, November 21, 2013

    Do you even know what you are talking about?
  • blzd - Thursday, November 21, 2013

    Moto X is dual core. X8 is a co-processor, not eight cores.
  • errzone - Monday, November 25, 2013

    That’s not entirely true either. The Moto X uses the Qualcomm MSM8960. The SoC is a dual-core processor with an Adreno 320 GPU, which has 4 cores. Adding the 2 co-processors equals 8; hence Motorola’s marketing speak of X8.
  • Roy2001 - Wednesday, December 11, 2013

    Kinect?
