The CPU

TI was one of ARM's earliest partners on the Cortex A15, and its silicon only came back from the fab at the beginning of this year. Even if Apple were similarly instrumental in the definition of the Cortex A15 architecture, it would be Q3 at the earliest before it could have working silicon available in volume. With no A15 design ready, and presumably no desire to jump into the custom ARM CPU market quite yet, Apple once again turned to the Cortex A9 for the A5X.

Apple confirmed that there are only two Cortex A9 cores on the A5X, but it neglected to mention operating frequency. I suspect the lack of any talk about CPU clocks indicates that they haven't changed; we could still be looking at a 1GHz max operating frequency.

Although we've speculated that Apple moved to a 32nm process with the A5X, it is entirely possible that we're still dealing with mature 45nm silicon here. That would explain the relatively conservative GPU clocks, although the additional GPU cores would balloon die size to 150 - 160mm^2 (roughly twice the size of Tegra 3). If the A5X is 32nm, then assuming a relatively conservative 80% area scaling factor, Apple would be able to maintain a die size of around 125mm^2, similar to the previous generation A5.
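
To put the die size arithmetic in one place, here's a quick back-of-the-envelope sketch; the 45nm estimate and the 80% scaling factor are the assumptions above, not measured figures:

```python
# Back-of-the-envelope A5X die size estimate. The 150-160mm^2 figure for a
# hypothetical 45nm A5X and the 0.8 area scaling factor for a move to 32nm
# are the assumptions from the text above, not measured numbers.
estimated_45nm_mm2 = (150.0, 160.0)
area_scaling_32nm = 0.8  # conservative area scaling going from 45nm to 32nm

low, high = (a * area_scaling_32nm for a in estimated_45nm_mm2)
print(f"Estimated 32nm A5X die size: {low:.0f} - {high:.0f} mm^2")
# -> roughly 120 - 128 mm^2, in the same ballpark as the 45nm A5
```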

A quad-core CPU design does make some sense in a tablet, but only one that is either running heavily threaded workloads or being subjected to fairly intense multitasking. As we found in our iPhone 4S review, many iOS apps are still not very well threaded and have a difficult time utilizing two cores, much less four. On the multitasking front, Apple has enabled task switching, but there's still no way to run two applications side by side. The most CPU intensive workloads on iOS still require that the app be active in the foreground for user interaction. Apps can do work in the background, but that work isn't particularly constant or common, and again, it isn't pegging multiple cores. Apple built a very efficient, low overhead platform with iOS - it had to, given the hardware specs of the original iPhone - and an expected result of that design is low CPU utilization for most tasks. This is not to say that CPU performance isn't important under iOS, just that it's hard to find apps that regularly require more than a single core, and harder still to find those that can benefit from more than two.
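
To put rough numbers on why extra cores go underused, here's a quick Amdahl's law sketch. The parallel fractions are hypothetical stand-ins for lightly threaded apps, not measurements of any actual iOS workload:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the fraction of
# a workload that can run in parallel and n is the number of cores. The
# values of p below are hypothetical stand-ins for lightly threaded apps.
def speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

for p in (0.3, 0.5):
    for cores in (2, 4):
        print(f"p = {p:.0%}, {cores} cores -> {speedup(p, cores):.2f}x")
# Even with half the work parallelizable (p = 50%), four cores only deliver
# 1.60x vs. 1.33x on two; at p = 30% the gap shrinks to 1.29x vs. 1.18x.
```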

I will say, though, that Apple could add more cores without a significant impact on power consumption if it were willing to spend the die area. Remember that idle cores can be fully power gated, effectively reducing their power consumption to zero. Apple could also adopt a fairly conservative CPU governor and only wake the third and fourth cores when absolutely necessary (similar to what we see happening with Tegra 3 on Android).
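
As a rough illustration of what such a conservative governor might look like, here's a minimal hotplug-style sketch. The thresholds and core counts are made up for the example; this is not Apple's or NVIDIA's actual policy:

```python
# Minimal sketch of a conservative core-hotplug policy: keep the extra cores
# power gated and bring them online only under sustained load. This is only
# an illustration of the general idea; the thresholds and core counts are
# invented for the example, not Apple's or NVIDIA's actual policy.
ONLINE_THRESHOLD = 0.85   # average utilization needed to wake another core
OFFLINE_THRESHOLD = 0.40  # below this, an extra core gets gated again
MIN_CORES, MAX_CORES = 2, 4

def next_core_count(online_cores: int, avg_utilization: float) -> int:
    """Decide how many cores should be online for the next interval."""
    if avg_utilization > ONLINE_THRESHOLD and online_cores < MAX_CORES:
        return online_cores + 1   # sustained load: wake a gated core
    if avg_utilization < OFFLINE_THRESHOLD and online_cores > MIN_CORES:
        return online_cores - 1   # mostly idle: gate a core (near-zero power)
    return online_cores           # otherwise leave things alone

print(next_core_count(online_cores=2, avg_utilization=0.95))  # -> 3
print(next_core_count(online_cores=4, avg_utilization=0.20))  # -> 3
```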

What about the Next iPhone?

Apple has traditionally reused the iPad's SoC in the iPhone that follows later in the same year, so it makes sense to assume that we'll see a smartphone version of the A5X (at lower clocks) later this year. The A6? That will probably debut next year with the 4th generation iPad.

Memory Capacity

Apple wouldn't let us run any third party applications on the new iPad so we couldn't confirm the actual memory capacity of the new model. On stage at the event, Epic mentioned that the new iPad has more memory and a higher output resolution than the Xbox 360 or PlayStation 3. The Xbox 360 has 512MB of memory, and Apple's A5/A5X has a dual-channel LPDDR2 memory controller. Each channel needs to be populated evenly in order to maintain peak bandwidth, which greatly narrows the options for memory capacity on the new iPad. 768MB would imply 512MB on one channel and 256MB on the other, delivering peak performance for apps and data in the first 512MB but lower performance for the upper 256MB. Given the low cost of DRAM these days, I think it's safe to assume that Apple simply went with two 512MB DRAM devices in a PoP configuration on the A5X for a total of 1GB of LPDDR2 memory in the new iPad.
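
Here's a small sketch of the channel-population arithmetic behind that reasoning; the per-channel device sizes are illustrative options, not confirmed parts:

```python
# Illustrative look at how a dual-channel LPDDR2 controller constrains the
# capacity options; the per-channel device sizes are hypothetical examples.
def interleaved_split(ch0_mb: int, ch1_mb: int):
    """Return (full-bandwidth MB, reduced-bandwidth MB) for a two-channel config."""
    dual = 2 * min(ch0_mb, ch1_mb)   # evenly populated region: peak bandwidth
    single = abs(ch0_mb - ch1_mb)    # leftover on one channel: reduced bandwidth
    return dual, single

print(interleaved_split(512, 256))  # 768MB total -> (512, 256): uneven, slower tail
print(interleaved_split(512, 512))  # 1GB total   -> (1024, 0): fully interleaved
```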

4G LTE Support

Brian did an excellent analysis on the LTE baseband in the new iPad here. Apple appears to be using Qualcomm's MDM9600, a 40nm design, rather than the 28nm MDM9615. In hindsight, our speculation that the new iPad would use a 28nm LTE baseband was likely short sighted. Apple had to be in mass production on the new iPad somewhere in the January/February timeframe, and although 28nm silicon is shipping to customers today, that was likely too aggressive a schedule to make work for an early-March launch.

Apple iPad Pricing
           16GB   32GB   64GB
WiFi       $499   $599   $699
WiFi + 4G  $629   $729   $829

Apple offers carrier-specific iPad 4G models on AT&T and Verizon, although both versions can roam on 3G networks around the world. Apparently the iPad 4G isn't SIM-locked, so you'll be able to toss in a SIM from other carriers with compatible networks. LTE data plans are available from AT&T and Verizon with no long-term contract:

iPad LTE Plan Pricing (Monthly)
          $14.99   $20    $30    $50
AT&T      250MB    -      3GB    5GB
Verizon   -        1GB    2GB    5GB

The Name

Apple surprised many by referring to the 3rd generation iPad simply as "the new iPad". The naming seems awkward today, but it's clearly a step toward how Apple brands many of its other product lines. The MacBook Air, MacBook Pro and iPod all receive the same simple treatment; newer models are differentiated by a quietly spoken year or generation marker.

I still remember, several years ago, when PC OEMs were intrigued by the idea of selling desktops based on model year rather than specs. Apple has effectively attained that holy grail here.

Comments

  • c4v3man - Friday, March 9, 2012 - link

    Why not spend another $5-10 on components and make a $600 32GB Transformer the base model? That way you still maintain most of the profit margin you want to have, while also being competitive cost-wise. I can appreciate that you are using some components that may be considered better than the new iPad's, but you are also using some that can be considered worse. Past experience shows that tablets priced higher than Apple's fail in the marketplace since people can't accept a reality where Apple isn't the "premium offering".
  • Lucian Armasu - Friday, March 9, 2012 - link

    By the way, Anand, is there any way to test graphics performance at devices' native resolutions anymore? I think you should bring that back and show the fixed resolution vs. native resolution tests side by side, because I actually think the iPad 3 suffered a very significant performance drop due to the new resolution, just like the iPhone 4 was always the device at the bottom of the graphics tests because of its retina display.

    So I'm aware that the chip itself should be faster when comparing everything at the same 720p resolution. But that doesn't really mean much for the regular user, does it? What matters is real world performance, and that means it matters how fast the iPad is at its *own* resolution, not a theoretical lower resolution that has nothing to do with it.
  • WaltFrench - Friday, March 9, 2012 - link

    Let's think this through a bit. Users don't run GLBench or that sort of stuff; they run games that a developer has tweaked for a platform. Subject to budget — I haven't had the pleasure myself, but hear tell that it requires good, solid engineering and lots of it — the developer puts out the best mix of resolution & speed that will please the customers the most. (Who would do otherwise?)

    Obviously, if you don't have the resolution, you go for fps. So it's conceivable that a 720p device could show better speed. But that'd only be true if the dev was pushing so hard on the iPad's 4X of pixels that he sacrificed play speed. If it came to that, he'd pull back on AA or other detail/texture quality efforts. Right? Wouldn't you?

    So what I think it comes down to is how hard a given game dev will work on a particular platform's capabilities. Here, fragmentation and total sales come into play, big time. Anand might be able to give you a theoretical tradeoff that a dev faces, but it might be quite the challenge to translate that into how well gamers would like a given device for stuff they can actually play.
  • medi01 - Saturday, March 10, 2012 - link

    In other words, did Apple's marketing department forbid you from doing native resolution benchmarks?
  • doobydoo - Monday, March 12, 2012 - link

    A native resolution test is a flawed test.

    As I've explained in reply to your other comments, performance has to take resolution into account. I.e., 100 fps at 10 x 10 is clearly worse than 60 fps at 2000 x 1000.

    It's very telling that you make this suggestion now that Apple has come out with the highest resolution device. It's not something you requested previously, when Android tablets had higher resolutions.

    The iPhone 4 was never at the bottom of any sensible benchmarks because of its retina display. The tests, as always, were done at the same resolution, as they always should be. The iPhone 4 was low down in the benchmarks because it had a slow GPU.
  • rashomon_too - Friday, March 9, 2012 - link

    If displaysearchblog.com is correct (http://www.displaysearchblog.com/2012/03/ipad-3-cl...), most of the extra power consumption is for the display. Because of the lower aperture ratio at the higher pixel density, more backlighting is needed, requiring perhaps twice as many LEDs.
  • jacobdrj - Friday, March 9, 2012 - link

    I am no coder, but even with my gaming rig I have a 27" 1900x1200 display (that I admittedly paid too much for), and I flanked it with two inexpensive 1080p displays, rotated vertically into portrait mode for Eyefinity and web browsing.
  • IHateMyJob2004 - Friday, March 9, 2012 - link

    Save your money.

    Buy a Playbook
  • tipoo - Friday, March 9, 2012 - link

    Save your money. Buy a toaster.

    Wait no, I forgot the part where they do different things :P
  • KoolAidMan1 - Monday, March 12, 2012 - link

    Not to mention that toasters are actually useful
