Display

The iPhone 5s, like the iPhone 5c, retains the same 4-inch Retina Display that was first introduced with the iPhone 5. The 4-inch 16:9 LCD features a 1136 x 640 resolution, which now sits at the low end among flagship smartphones. It was clear from the get-go that a larger display wouldn’t be in the cards for the iPhone 5s. Apple has stuck to its two-generation design cadence since the iPhone 3G/3GS days, and it showed no indication of breaking that trend now, especially with concerns about the mobile upgrade cycle slowing. Recouping investment costs on platform and industrial design is a very important part of making the business work.
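
For reference, that resolution on a 4-inch diagonal works out to Apple's familiar 326 PPI Retina figure. A quick back-of-the-envelope check (a minimal sketch in Python, using only the numbers quoted above):

```python
import math

# Display specs quoted above: 1136 x 640 pixels across a 4-inch diagonal
width_px, height_px = 1136, 640
diagonal_in = 4.0

# Pixel density = diagonal resolution in pixels / diagonal size in inches
diagonal_px = math.sqrt(width_px ** 2 + height_px ** 2)
ppi = diagonal_px / diagonal_in

print(f"{ppi:.0f} PPI")  # ~326 PPI, Apple's quoted Retina density for this panel
```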


Apple is quick to point out that iOS 7 does attempt to make better use of display real estate, but I can’t shake the feeling of being too cramped on the 5s. I’m not advocating that Apple go the route of some of the insanely large displays, but after using the Moto X for the past month I believe there’s a good optimization point somewhere around 4.6 - 4.7”. I firmly believe that Apple will embrace a larger display and branch the iPhone once more, but that time is just not now.

The 5s’ display remains excellent and well calibrated from the factory. In an unusual turn of events, my iPhone 5c sample came with an even better calibrated display than my 5s sample. It's a tradeoff - the 5c panel I had could go way brighter than the 5s panel, but its black levels were also higher. The contrast ratio ended up being very similar between the devices as a result. I've covered the panel lottery in relation to the MacBook Air, but it's good to remember that the same sort of multi-source components exist in mobile as well.
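
The tradeoff is easy to see in the math: contrast ratio is simply peak white luminance divided by black level, so a panel that goes brighter but also lets more light through at black can land at essentially the same ratio. A minimal sketch with hypothetical luminance values (not our measured numbers):

```python
# Contrast ratio = peak white luminance / black level (both in cd/m^2, i.e. nits)
def contrast_ratio(white_nits: float, black_nits: float) -> float:
    return white_nits / black_nits

# Hypothetical panels for illustration only, not the measured 5s/5c results
panel_a = contrast_ratio(white_nits=500.0, black_nits=0.48)  # dimmer peak, deeper blacks
panel_b = contrast_ratio(white_nits=580.0, black_nits=0.56)  # brighter peak, higher blacks

print(round(panel_a), round(panel_b))  # ~1042 vs ~1036 - effectively the same contrast ratio
```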

[Charts: Brightness (White), Brightness (Black), Contrast Ratio, CalMAN Display Performance - White Point Average, CalMAN Display Performance - Grayscale Average dE 2000]

Color accuracy is still excellent right out of the box. The only device that did better than the 5s in our color accuracy tests was my iPhone 5c sample. Grayscale accuracy wasn't as good on my 5s sample, however.

[Charts: CalMAN Display Performance - Saturations Average dE 2000 (saturation sweep), CalMAN Display Performance - Gretag Macbeth Average dE 2000 (GMB ColorChecker)]

Cellular

When early PCB shots of the 5s leaked, I remember Brian counting solder pads on the board to figure out whether Apple had moved to a new Qualcomm baseband solution. Unfortunately his count came out the same as the existing MDM9x15 based designs, which is indeed what ended up shipping. It’s unclear whether MDM9x25 was ready in time to be integrated into the iPhone 5s design, or if there was some other reason Apple chose against implementing it here. Regardless of the why, the result is effectively the same cellular capabilities as the iPhone 5.

Apple tells us that the wireless stack in the 5c and 5s is all new, but the lack of LTE-Advanced features like carrier aggregation and a Category 4 150Mbps downlink makes it likely that we’re looking at an MDM9x15 derivative at best. LTE-A support isn’t an issue at launch; however, as Brian mentioned on our mobile show, it’s quickly going to become a much needed feature both for making efficient use of spectrum and for delivering data in the most power efficient way.

The first part is relatively easy to understand. Carrier aggregation gives mobile network operators the ability to combine spectrum across non-contiguous frequency bands to service an area. The resulting increase in usable spectrum can be used to improve performance and/or support more customers on LTE in areas with limited present day LTE spectrum.
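
As a rough illustration of the performance side, here is a minimal sketch (hypothetical band and bandwidth figures, not any specific operator's deployment) of how two 10MHz component carriers in different bands aggregate into the kind of 20MHz pipe a Category 4 device needs to hit 150Mbps:

```python
# Carrier aggregation: combine component carriers from non-contiguous bands into one
# logical downlink pipe. Band assignments and bandwidths below are hypothetical.
component_carriers_mhz = {
    "Band 4 (AWS)": 10,      # 10MHz carrier in one band
    "Band 17 (700MHz)": 10,  # 10MHz carrier in a second, non-contiguous band
}

aggregate_mhz = sum(component_carriers_mhz.values())

# LTE peaks at roughly 7.5 bits/s/Hz with 2x2 MIMO and 64QAM, which is where the
# familiar ~75Mbps per 10MHz carrier (and 150Mbps per 20MHz) figures come from.
peak_bits_per_s_per_hz = 7.5
peak_mbps = aggregate_mhz * peak_bits_per_s_per_hz

print(f"{aggregate_mhz} MHz aggregate -> ~{peak_mbps:.0f} Mbps peak downlink")
```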

The second part, improving power efficiency, has to do with the same principles of race to sleep that we’ve talked about for years. The faster your network connection, the quicker your modem can transact data and fall back into a lower power sleep state.
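
A minimal sketch of that reasoning, with made-up power and throughput numbers (real modem figures vary widely):

```python
# Race to sleep: a faster link finishes the same transfer sooner, so the modem spends
# more of the interval in its low power idle state. All numbers below are made up
# purely for illustration; they are not measured modem power figures.
def radio_energy_joules(payload_mbit, throughput_mbps, active_w, idle_w, window_s):
    active_s = payload_mbit / throughput_mbps  # time spent actively transferring
    idle_s = max(window_s - active_s, 0.0)     # remainder of the window spent idle
    return active_w * active_s + idle_w * idle_s

payload_mbit = 80.0  # an 80 Mbit (10 MB) transfer
window_s = 10.0      # account energy over a fixed 10 second window

slow = radio_energy_joules(payload_mbit, throughput_mbps=20, active_w=1.2, idle_w=0.02, window_s=window_s)
fast = radio_energy_joules(payload_mbit, throughput_mbps=100, active_w=1.5, idle_w=0.02, window_s=window_s)

print(f"slower link: {slow:.2f} J, faster link: {fast:.2f} J")
# The faster link wins here despite a higher active power draw, because it races to sleep sooner.
```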

The 5s’ omission of LTE-A likely doesn’t have immediate implications, but those who hold onto their devices for a long time will have to deal with the fact that they’re buying at the tail end of a transition to a new group of technologies.

In practice I didn’t notice substantial speed differences between the iPhone 5s, 5c and the original iPhone 5. My testing period was a bit too brief to adequately characterize the device but I didn’t have any complaints. The 5s retains the same antenna configuration as the iPhone 5, complete with receive diversity. As Brian discovered after the launch, the Verizon iPhone 5s doesn’t introduce another transmit chain - so simultaneous voice and LTE still aren’t possible on that device.

Apple is proud of its support for up to 13 LTE bands on some SKUs. Despite the increase in the number of bands supported per SKU, there are still quite a few iPhone 5s and 5c variants shipping worldwide:

Apple iPhone 5S and 5C Banding

5S A1533 (GSM) / 5C A1532 (GSM): GSM/EDGE 850, 900, 1800, 1900 MHz; WCDMA 850, 900, 1700/2100, 1900, 2100 MHz; FDD-LTE bands 1, 2, 3, 4, 5, 8, 13, 17, 19, 20, 25; TDD-LTE N/A; CDMA 1x/EVDO N/A

5S A1533 (CDMA) / 5C A1532 (CDMA): GSM/EDGE 850, 900, 1800, 1900 MHz; WCDMA 850, 900, 1700/2100, 1900, 2100 MHz; FDD-LTE bands 1, 2, 3, 4, 5, 8, 13, 17, 19, 20, 25; TDD-LTE N/A; CDMA 1x/EVDO 800, 1700/2100, 1900, 2100 MHz

5S A1453 / 5C A1456: GSM/EDGE 850, 900, 1800, 1900 MHz; WCDMA 850, 900, 1700/2100, 1900, 2100 MHz; FDD-LTE bands 1, 2, 3, 4, 5, 8, 13, 17, 18, 19, 20, 25, 26; TDD-LTE N/A; CDMA 1x/EVDO 800, 1700/2100, 1900, 2100 MHz

5S A1457 / 5C A1507: GSM/EDGE 850, 900, 1800, 1900 MHz; WCDMA 850, 900, 1900, 2100 MHz; FDD-LTE bands 1, 2, 3, 5, 7, 8, 20; TDD-LTE N/A; CDMA 1x/EVDO N/A

5S A1530 / 5C A1529: GSM/EDGE 850, 900, 1800, 1900 MHz; WCDMA 850, 900, 1900, 2100 MHz; FDD-LTE bands 1, 2, 3, 5, 7, 8, 20; TDD-LTE bands 38, 39, 40; CDMA 1x/EVDO N/A

Apple iPhone 5S/5C FCC IDs and Models

BCG-E2642A: A1453 (5S), A1533 (5S)
BCG-E2644A: A1456 (5C), A1532 (5C)
BCG-E2643A: A1530 (5S)
BCG-E2643B: A1457 (5S)
BCG-E2694A: A1529 (5C)
BCG-E2694B: A1507 (5C)

WiFi

WiFi connectivity also remains unchanged on the iPhone 5s. Dual band (2.4/5GHz) 802.11n (up to 150Mbps) is the best you’ll get out of the 5s. We expected Apple to move to 802.11ac like some of the other flagship devices we’ve seen in the Android camp, but it looks like you’ll have to wait another year for that.
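
For context, that 150Mbps ceiling is just the single-spatial-stream 802.11n PHY rate at MCS 7 on a 40MHz channel with the short guard interval. A minimal sketch of where the number comes from:

```python
# 802.11n PHY rate: single spatial stream, MCS 7, 40MHz channel, short guard interval
data_subcarriers = 108      # usable data subcarriers in a 40MHz HT channel
bits_per_subcarrier = 6     # 64QAM carries 6 coded bits per subcarrier
coding_rate = 5 / 6         # MCS 7 convolutional coding rate
symbol_time_s = 3.6e-6      # 3.2us symbol + 0.4us short guard interval

phy_rate_bps = data_subcarriers * bits_per_subcarrier * coding_rate / symbol_time_s
print(f"{phy_rate_bps / 1e6:.0f} Mbps")  # 150 Mbps - the best case link rate on the 5s
```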

I don’t believe you’re missing out much due to the lack of 802.11ac support today, but over the life of the iPhone 5s I do expect greater deployment of 802.11ac networks (which can bring either performance or power benefits to a mobile platform).

[Chart: WiFi Performance - iPerf]

WiFi performance seems pretty comparable to the iPhone 5. The HTC One and Moto X pull ahead here as they both have 802.11ac support.

Comments

  • Wilco1 - Wednesday, September 18, 2013 - link

    If all you can do is name calling then you clearly haven't got a clue or any evidence to prove your point. Either come up with real evidence or leave the debate to the experts. Do you even understand what IPC means?

    For example in your link a low clocked Jaguar is keeping up with a much higher clocked Bay Trail (yes it boosts to 2.4GHz during the benchmark run), so the obvious conclusion is that Jaguar has far higher IPC than Bay Trail. For example Jaguar has 28% higher IPC than BT in the 7-zip test. Just like I said.

    Now show me a single benchmark where BT gets better IPC than Jaguar. Put up or shut up.
  • zeo - Wednesday, September 18, 2013 - link

    The point that BT Beats Jaguar, especially at performance per watt, clearly proved the point given!

    And insisting as you are on your original assessment is a characteristic of acting like a Troll... So you're not going to convince anyone by simply insisting on being right... especially when we can point to Anandtech pointing out multiple benchmarks in this article that showed the Kabini performing lower than both BT and the A7!

    So either learn to read what these reviews actually post or accept getting labeled a Troll... either way, you're not winning this argument!
  • Wilco1 - Wednesday, September 18, 2013 - link

    No, Bob's claim was that Bay Trail was faster clock for clock than Jaguar, when the link he gave to prove it clearly showed that is false. BT may well beat Jaguar on perf/watt, but that's not at all what we were discussing.

    So next time try to understand what people are discussing before jumping in and calling people a Troll. And yes I stand by my characterization of various microarchitectures, precisely because it's based on actual benchmark results.
  • Bob Todd - Wednesday, September 18, 2013 - link

    IPC as a comparison point made a lot of sense when we were arguing about which 130 watt desktop processor had the better architecture. It seems largely irrelevant for mobile where we care about performance per watt. Your argument is continually that the ARM/AMD designs are 'faster' based on Geekbench. If Jaguar has a 28% higher IPC than Bay Trail, do you honestly think it matters if Bay Trail is still the faster chip @ 1/3 (or less) of the power requirements? If someone came up with a crazy design that needed 5x the clocks to have a 2x performance advantage of their competitor, but did so with half the power budget, they'd still be racking up design wins (assuming parity for all other aspects like price). That's a two way street. If ARM designs a desktop/server focused chip that needs higher clocks than Intel to reach performance parity or be faster than Haswell, but does so with significantly less power it's still a huge win for them.
  • Wilco1 - Wednesday, September 18, 2013 - link

    IPC matters as you can compare different microarchitectures and make predictions on performance at different clock speeds. I'm sure you know many CPUs come in a confusing variation of clockspeeds (and even different base/turbo frequencies for Intel parts), but the underlying microarchitecture always remains the same. You can't make claims like "Bay Trail is faster than Jaguar" when such a claim would only be valid at very specific frequencies. However we can say that Jaguar has better IPC than BT and that will remain true irrespective of the frequency. So that is the purpose of the list of microarchitectures I posted.

    I was originally talking about the performance of Apple A7 and Bay Trail in Geekbench. You may not like Geekbench, but it represents close to actual CPU performance (not rubbish JavaScript, tuned benchmarks, cheating - remember AnTuTu? - or unfair compiler tricks).

    Now you're right that besides absolute performance, perf/W is also important. Unfortunately there is almost no detailed info on power consumption, let alone the energy to do a certain task, for various CPUs. While TDP (in the rare cases it is known!) can give some indication, different feature sets, methodologies, "dial-a-TDP" and turbo features make them hard to compare. What we can say in general is that high-frequency designs tend to be less efficient and use more power than lower frequency, higher IPC designs. In that sense I would not be surprised if the A7 also shows very good perf/Watt. How it compares with BT is not clear until BT phones appear.
  • Bob Todd - Wednesday, September 18, 2013 - link

    Your point about benchmarks is actually what surprises me the most nowadays. The biggest thing every in-depth review of a new ARM design brings to light is how freaking piss poor the state of mobile benchmarking is from a software standpoint. I didn't expect magic by the time we got to A9 designs, but it's a little ridiculous that we're still in a state of infancy for mobile benchmarking tools over half a decade after the market really started heating up.
  • Bob Todd - Wednesday, September 18, 2013 - link

    And by "ARM design" I mean both their cores or others building to their ISA.
  • Wilco1 - Thursday, September 19, 2013 - link

    Yes, mobile benchmarking is an absolute disgrace. And that's why I'm always pointing out how screwed up Anand's benchmarking is - I'm hoping he'll understand one day. How anyone can conclude anything from JS benchmarks is a total mystery to me. Anand might as well just show AnTuTu results and be done with it, that may actually be more accurate!

    Mobile benchmarks like EEMBC, CoreMark etc are far worse than the benchmarks they try to replace (eg. Dhrystone). And SPEC is useless as well. Ignoring the fact that it is really a server benchmark, the main issue is that it ended up being a compiler trick contest rather than a fair CPU benchmark. Of course Geekbench isn't perfect either, but at the moment it's the best and fairest CPU bench: because it uses precompiled binaries you can't use compiler tricks to pretend your CPU is faster.
  • akdj - Thursday, September 19, 2013 - link

    SO.....what is it the 'crew' is supposed to 'do'? NOT provide ANY benchmarks? Anand and team are utilizing the benchmarks available right now. They're not building the software to bench these devices...they're reviewing them...with the tools available, currently, NOW---on the market. If you're so interested in better mobile benchmarking (still in its infancy---it's only really been 5 years since we've had multiple devices to even test), why not pursue and build your own benchmarking software? Seems like it may be a lucrative project. Sounds like you know a bit about CPU/GPU and SoC architecture---put something together. Sunspider is ubiquitous, used on any and all platforms from desktops to laptops---tablets to phones, people 'get it'. As well, GeekBench is re-inventing their benchmarking software---and the Google Octane tests are fairly new...and many of the folks using these devices ARE interested in how fast their browser populates, how quick a single core is---speed of apps opening and launching, opening a PDF, FPS playing games, et al.
    Again---if you're not 'happy' with how Anand is reviewing gear (the best on the web IMHO), open your own site---build your own tools, and lets see how things turn out for ya!
    Give credit where credit is due....I'd much rather see the way Anand is approaching reviews in the mobile sector than a 1500 word essay without benchmarking results because current "mobile benchmarking is an absolute disgrace"
    YMMV as always
    J

    PS---Thanks for the review guys....again, GREAT Job!
  • Bob Todd - Thursday, September 19, 2013 - link

    Umm...I think you missed my point. I love the reviews here. That doesn't change the fact that mobile benchmarking software sucks compared to what we have available on the desktop. That isn't a slam against this site or any of the reviewers, and I fully expect them to use the (relatively crappy) software tools that are available. And they've even gone above and beyond and written some tools themselves to test specific performance aspects. I'm just surprised that with mobile being the fastest growing market, nobody has really stepped up to the plate to offer a good holistic benchmarking suite to measure cpu/gpu/memory/io performance across at least iOS/Android. And no, I don't expect anyone at Anandtech to write or pay someone to write such a tool.
