The CPU

TI was one of the earliest of ARM's partners on the Cortex A15, and its silicon only came back from the fab at the beginning of this year. Even if Apple had been similarly early in adopting the Cortex A15 architecture, it would be Q3 at the earliest before it could have working silicon available in volume. With no A15 design ready, and presumably no desire to jump into the custom-designed ARM CPU market just yet, Apple once again turned to the Cortex A9 for the A5X.

Apple confirmed that there are only two Cortex A9 cores in the A5X, but it neglected to mention operating frequency. I suspect the silence on CPU clocks indicates that they haven't changed; we could still be looking at a 1GHz maximum operating frequency.

Although we've speculated that Apple moved to a 32nm process with the A5X, it's entirely possible that we're still dealing with mature 45nm silicon here. That would explain the relatively conservative GPU clocks, although the additional GPU cores would balloon the die to 150 - 160mm^2 (roughly twice the size of Tegra 3). If the A5X is 32nm, then assuming a relatively conservative 80% scaling factor, Apple would be able to maintain a die size of around 125mm^2, similar to the previous generation A5.
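
As a quick sanity check on that math (both the 45nm die size estimate and the 80% scaling factor are the speculative figures from above, not measured values), the back-of-the-envelope scaling works out like this:

```python
# Back-of-the-envelope die size scaling using the speculative figures above:
# a 150 - 160 mm^2 A5X at 45nm and a conservative 0.8x area scaling to 32nm.
die_45nm_mm2 = (150, 160)      # speculated 45nm die size range
area_scaling = 0.80            # assumed conservative 45nm -> 32nm scaling factor

for area in die_45nm_mm2:
    print(f"{area} mm^2 at 45nm -> ~{area * area_scaling:.0f} mm^2 at 32nm")

# Prints ~120 and ~128 mm^2, i.e. roughly the ~125 mm^2 figure quoted above.
```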

A quad-core CPU does make some sense in a tablet, but only one that's either running heavily threaded workloads or subjected to fairly intense multitasking. As we found in our iPhone 4S review, many iOS apps are still not very well threaded and have a difficult time utilizing two cores, much less four. On the multitasking front, Apple has enabled task switching but there's still no way to run two applications side by side. The most CPU intensive workloads on iOS still require that the app be active in the foreground, interacting with the user. Apps can do work in the background, but it's neither constant nor common, and again, those apps aren't pegging multiple cores. Apple built a very efficient, low overhead platform with iOS - it had to, given the hardware specs of the original iPhone. A result of that low overhead, efficient design is expectedly low CPU utilization for most tasks. This isn't to say that CPU performance doesn't matter under iOS, just that it's hard to find apps that regularly require more than a single core, and harder still to find ones that benefit from more than two.
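
To put that diminishing-returns argument in rough numbers, here's a quick Amdahl's law sketch; the parallel fractions below are purely hypothetical, chosen to stand in for a poorly threaded app versus a well threaded one, not measurements of any real iOS software:

```python
# Illustrative Amdahl's law speedups; the parallel fractions are assumptions
# chosen for the example, not measurements of any real iOS app.

def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Ideal speedup when only part of a workload can use additional cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

for label, p in [("poorly threaded app", 0.30), ("well threaded app", 0.90)]:
    for cores in (2, 4):
        print(f"{label}: {cores} cores -> {amdahl_speedup(p, cores):.2f}x")

# A workload that is only ~30% parallel gains ~1.18x from two cores and just
# ~1.29x from four - the second pair of cores buys almost nothing.
```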

I will say, though, that if Apple wanted to spend the die area, it could easily add more cores without a significant impact on power consumption. Remember that idle cores can be fully power gated, reducing their power consumption while idle to effectively zero. Apple could also use a fairly conservative CPU governor and only wake up the third and fourth cores when absolutely necessary (similar to what we see happening with Tegra 3 on Android).
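
For illustration, a conservative core-management policy could look something like the sketch below. This is a hypothetical example of the general approach, not Apple's scheduler or NVIDIA's actual Tegra 3 implementation, and the thresholds are made up:

```python
# Hypothetical sketch of a conservative core-hotplug policy: extra cores stay
# power gated (near-zero idle power) until sustained load demands them.
# The core count and the 0.85 threshold are illustrative assumptions only.

def cores_to_enable(load_samples: list[float], max_cores: int = 4) -> int:
    """Decide how many cores to keep online given recent load (1.0 = one busy core)."""
    sustained_load = sum(load_samples) / len(load_samples)
    online = 1
    # Only wake another core once the currently online cores are nearly saturated.
    while online < max_cores and sustained_load > 0.85 * online:
        online += 1
    return online

print(cores_to_enable([0.4, 0.5, 0.6]))   # light load -> 1 core online
print(cores_to_enable([1.7, 1.9, 2.0]))   # heavily threaded load -> 3 cores online
```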

What About the Next iPhone?

Apple has traditionally used the iPad's SoC in the iPhone that follows later in the same year. It makes sense to assume we'll see a smartphone version of the A5X SoC (at lower clocks) later this year. The A6? That will probably debut next year with the 4th generation iPad.

Memory Capacity

Apple wouldn't let us run any third party applications on the new iPad so we couldn't confirm the actual memory capacity of the new model. On stage at the event, Epic mentioned that the new iPad has more memory and a higher output resolution than the Xbox 360 or PlayStation 3. The Xbox 360 has 512MB of memory, and Apple's A5/A5X has a dual-channel LPDDR2 memory controller. Each channel needs to be populated evenly in order to maintain peak bandwidth, which greatly narrows the options for memory capacity on the new iPad. 768MB would imply 512MB on one channel and 256MB on the other, delivering peak performance for apps and data in the first 512MB but lower performance for the upper 256MB. Given the low cost of DRAM these days, I think it's safe to assume that Apple simply went with two 512MB DRAM devices in a PoP configuration on the A5X for a total of 1GB of LPDDR2 memory in the new iPad.
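
To spell out the channel math (the dual-channel controller is the known quantity here; the candidate per-channel DRAM device sizes are my assumptions for illustration):

```python
# Dual-channel LPDDR2: peak bandwidth requires both channels to be populated
# evenly. The candidate per-channel DRAM device sizes below are assumptions.
from itertools import combinations_with_replacement

device_sizes_mb = [256, 512]

for a, b in combinations_with_replacement(device_sizes_mb, 2):
    note = "full bandwidth" if a == b else "upper region falls back to one channel"
    print(f"{a}MB + {b}MB = {a + b}MB ({note})")

# Symmetric options land on 512MB or 1GB; the asymmetric 768MB split leaves its
# top 256MB on a single channel, which is the argument for 1GB above.
```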

4G LTE Support

Brian did an excellent analysis of the LTE baseband in the new iPad here. Apple appears to have used Qualcomm's MDM9600, a 45nm design, rather than the 28nm MDM9615. In hindsight, speculating that the new iPad would use a 28nm LTE baseband was likely short sighted. Apple had to be in the mass production phase for the new iPad somewhere in the January/February timeframe, and although 28nm silicon is shipping to customers today, that schedule was likely too aggressive to work for an early-March launch.

Apple iPad Pricing
            16GB    32GB    64GB
WiFi        $499    $599    $699
WiFi + 4G   $629    $729    $829

Apple offers carrier-specific iPad 4G models on AT&T and Verizon, although both versions can roam on 3G networks around the world. The iPad 4G apparently isn't SIM locked, so you'll be able to toss in a SIM from other carriers with compatible networks. LTE data plans are available from AT&T and Verizon with no long-term contract:

iPad LTE Plan Pricing (Monthly)
            $14.99    $20     $30     $50
AT&T        250MB     -       3GB     5GB
Verizon     -         1GB     2GB     5GB


The Name

Apple surprised many by referring to the 3rd generation iPad simply as "the new iPad". The naming seems awkward today, but it's clearly a step towards what Apple does across many of its product lines. The MacBook Air, MacBook Pro and iPod all receive the same simple branding treatment; newer models are differentiated by a quietly spoken year or generation marker.

I still remember, several years ago, PC OEMs being intrigued by the idea of selling desktops by model year rather than by spec sheet. Apple has effectively attained that holy grail here.

Comments
  • medi01 - Saturday, March 10, 2012 - link

    Don't iZombie much, please.

    I keep my phone and tablet at the same distance, I guess I "hold it wrong way" in Hypnosteve's books.

    The point of "retina" was that the density was so high that pixels would be indistinguishable to the human eye at some magical distance (distance matters a lot here).

    Indeed by playing with distance one could reduce resolution yet claim "it's "retina"". But then one could apply that "retina" buzzword to many pieces of older hardware.

    Off-screen benchmarks show no practical results to the customers and are only deceiving. Nobody uses CPU/GPU on their own, it's used only with particular resolution screen and decoupling them is just a way to deceive.
  • doobydoo - Monday, March 12, 2012 - link

    How far you personally hold your tablet away is irrelevant. 'Retina' term isn't about you. It's about a typical user, with typical vision, holding the tablet at a typical distance, being unable to distinguish pixels.

    Typical users DO hold tablets further away, so it's perfectly logical.

    By 'Playing with the distance' you could indeed claim anything is retina - but that would make your claim incorrect because people don't hold the device at that distance, on average. The consensus amongst scientists and tech experts is that people DO hold tablets at the distance required to make this display retina.

    Off screen benchmarks eliminate both resolution and v-sync as factors (v-sync on screen benchmarks are the only reason the iPad 2 was slower in any GPU benchmarks - it limits FPS). As a result, you are given an accurate comparison of GPU performance. 'Practical Results' that you describe is a very difficult metric to calculate. While you would seemingly advocate a raw FPS metric, that fails to take into account resolution.

    For example, is 100 FPS at 10 x 10 resolution better than 60 FPS at 2000 x 1000? Of course not.

    Whichever way you look at it, the new iPad has a GPU which is up to 4x faster than the fastest Android tablet. It also has the best resolution. Any games designed to run on that high resolution will be tested to make sure they run at a playable FPS so the 'real world' performance will be both higher resolution and just as fast as any Android tablet.

    You seem to be completely bitter and unable to admit Apple has the technological lead right now.
  • seanleeforever - Monday, March 12, 2012 - link

    i didn't realize my 2 year old 1080p 65 inch TV was 'retina' display.
  • Michiel - Friday, March 9, 2012 - link

    Envy eats you alive. Go see a shrink !
  • medi01 - Saturday, March 10, 2012 - link

    Oh, sorry, I've forgotten it's a status thing.
    People paying 20-50 Euros less for a Samsung Galaxy obviously cannot afford these über-revolutionary devices, hence they could only envy.
  • ripshank - Sunday, March 11, 2012 - link

    medi01: So sad. Your remarks only show your insecurity to the world.

    Relax, breathe and just let others enjoy their gadgets of choice rather than resorting to name calling and mockery. Realize these are friggin gadgets, not politics or religion. But from your comments, it's like Apple killed your family, took away your job and stole your wife.

    What is wrong with the world today when people get so worked up over an object?
  • medi01 - Sunday, March 11, 2012 - link

    Ad hominem, eh?

    There is nothing wrong with objecting to lies.

    Reviewers "forgetting iPhone in the pocket" on comparison photos where it would look pale, including nVidia's cherry picked card vs AMD's stock on marketing department's request and "off-screen benchmarks" all over the place are not simply bad, it stinks.
  • stsk - Monday, March 12, 2012 - link

    Seriously. Seek help.
  • doobydoo - Monday, March 12, 2012 - link

    1 - There is something wrong with objecting to lies INCORRECTLY. That's your own failing.

    2 - Ad hominem? I'll never understand why you Americans try to use that phrase all the time, as well as 'Straw man' - it not only makes you sound pretentious, trying to sound more intelligent than you are, it's also hypocritical:

    'Don't iZombie much, please.'

    Just say 'insults' - jeez.

    3 - Off-screen benchmarks are used by impartial review sites, as I explained above, because that is the only way to properly compare GPU performance. On-screen benchmarks have different resolutions and are limited by v-sync.

    4 - Claims of conspiracies on photos is just ridiculous.
  • Greg512 - Monday, March 12, 2012 - link

    "you Americans"

    Way to be a pretentious hypocrite.
