A New Way To Navigate Between Apps

iOS 9 has a couple of new features that Apple collectively calls Deep Linking. Users are undoubtedly familiar with the scenario where they receive a link to a YouTube video, a tweet, or something else relating to a service that has a native iOS app. Unfortunately, clicking these links will usually bring them to a web page in mobile Safari instead of the appropriate application, which isn't a very good experience. Android tackled this issue long ago with Intents, but even with the improvements to inter-app communication in iOS 8 there wasn't a way for developers to easily and seamlessly implement such a feature. With iOS 9 they finally can.

Deep Linking builds on the same foundation as Handoff from the web, which was introduced in iOS 8, and the way developers implement it is fairly straightforward. On their web server they host a file called apple-app-site-association, which contains a small amount of JSON indicating which sections or subdomains of the website should be directed to an on-device application rather than Safari. To ensure security, the association file needs to be signed with an SSL certificate. Once the developer has done this there's nothing more to do, and iOS will take care of opening the appropriate application on the device when a link is clicked.
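To make this concrete, here's a rough sketch of the app-side half of the feature in Swift. The appID, paths, and routing below are hypothetical, the JSON in the comment is only an outline of the association file's format, and the exact delegate signature has shifted slightly across SDK versions.

```swift
import UIKit

// A minimal sketch, assuming the server hosts an apple-app-site-association file
// roughly along these lines (TEAMID.com.example.app and the paths are hypothetical):
//   {"applinks": {"apps": [], "details": [{"appID": "TEAMID.com.example.app",
//                                          "paths": ["/videos/*"]}]}}
// and that the app declares the matching Associated Domains entitlement.
class AppDelegate: UIResponder, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        // Links handled by the system arrive as a web-browsing user activity
        // carrying the original URL, rather than bouncing the user out to Safari.
        guard userActivity.activityType == NSUserActivityTypeBrowsingWeb,
              let url = userActivity.webpageURL else {
            return false
        }
        // Route to the appropriate screen based on the URL path (app-specific).
        print("Opened via deep link: \(url.path)")
        return true
    }
}
```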

The aspect of the improved navigation between applications that is most relevant to users right now is the back link that appears in the status bar when you move from one app to another. In the past, opening a link from within an application would either rip you out of that application and take you to Safari, or open the link in a mini web browser of sorts inside the application. The new back link is designed to allow users to quickly return to the application they were working in from the application that a link took them to.

Right now, the primary use for this is returning to an app from Safari after a link takes you there. Once developers implement deep linking this feature will become even more significant, because clicking a Twitter link in the Google Hangouts application will simply slide the Twitter app in on top, and when you're done you can click the back link to return to Hangouts.

It's important not to mistake this new feature for a back button in the same sense as the one in Android or Windows Phone. iOS still has the exact same architecture for app navigation, with buttons to go back and forward located within each application rather than a system-wide button. The new back link is more of a passage to return to your task after a momentary detour into another application.

As for the UX, I think this is basically the only way Apple could have implemented it. I'm not a fan of the fact that it displaces the signal strength indicator, but putting it anywhere else would have intruded on the open application, which would cause more serious problems. I initially wondered if it would have been better to just use the swipe-from-the-left gesture to go back, but that wouldn't be obvious and risks conflating the back link with the back buttons inside apps. It looks like having the link occupy part of the status bar is going to be the solution until someone imaginative comes up with something better.

Reducing Input Lag

iOS is fairly good at minimizing input lag, but there has always been a certain amount of delay that you can't get around no matter how quick your application is. Part of this is related to how frequently the capacitive touch sensor scans for input, which is usually at the same 60Hz frequency as the display refresh, putting the time between scans at about 16.7ms. Devices like the iPad Air 2 and the upcoming iPad Pro scan at 120Hz for finger input, which drops this time to about 8.3ms. On the software side you have the steps and processes of the iOS touch and graphics pipeline, and this is where Apple has made improvements to reduce touch latency on all devices in iOS 9.

The above image is a visual representation of the iOS touch and graphics pipeline. As you can see, with a typical application you're looking at about four frames of latency. Apple includes the display frame in its definition of latency, which I disagree with because you'll see the results of your input at the beginning of that frame, but for the purposes of this discussion I'll use Apple's definition.

In this theoretical case, it just so happens that the time needed to update the state of the application is exactly one frame long. One would think that decreasing the time the app spends updating state would reduce input latency by allowing Core Animation to begin translating the application's views into commands for the GPU to render sooner. Unfortunately, that has not been the case on iOS in the past. No matter how well optimized an app was, Core Animation would only begin doing work at the start of the next display frame. This was because an application can update state several times during a single frame, and those updates are batched together to be rendered on screen at once.

In iOS 9, Apple has removed the restriction that Core Animation only begins working at the start of the next frame. As a result, an optimized application can take care of updating state, having Core Animation issue GPU commands, and drawing the next frame all within the span of a single display frame. This cuts touch latency down to three frames from the previous four, while applications with complicated views that need more than a single frame for Core Animation and GPU rendering can drop from five frames of latency to four.
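As a rough sanity check on those frame counts, the arithmetic works out as follows under Apple's frame-based definition of latency and a 60Hz display; the numbers here are my own back-of-the-envelope figures, not Apple's.

```swift
// Back-of-the-envelope math for the frame counts above, assuming a 60Hz display.
let displayRefreshHz = 60.0
let frameMs = 1000.0 / displayRefreshHz      // ≈ 16.7ms per display frame

let iOS8BestCaseMs = 4 * frameMs             // ≈ 66.7ms: touch scan, app update, render, display
let iOS9BestCaseMs = 3 * frameMs             // ≈ 50.0ms when everything fits into one frame

// Devices with a 120Hz digitizer also halve the wait between touch scans.
let touchScanMs120Hz = 1000.0 / 120.0        // ≈ 8.3ms between scans
print(iOS8BestCaseMs, iOS9BestCaseMs, touchScanMs120Hz)
```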

Apple has also made improvements at the level of the actual touch input. As I mentioned earlier, the iPad Air 2 and iPad Pro scan for finger input at 120Hz, which can shave off an additional half frame of latency by allowing applications to begin doing work midway through a frame. In addition to reducing latency, the extra touch input information can be used by developers to improve the accuracy of how applications respond to user input. For example, a drawing application can more accurately draw a line that matches exactly where the user swiped their finger, as it now has twice as many points to sample. Apple calls these additional touches coalesced touches, and they do require developers to add a bit of code to take advantage of them. However, being able to begin app updates in the middle of a frame is something that happens automatically on devices with 120Hz input.
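Adopting coalesced touches really does only take a few lines. Below is a minimal sketch for a hypothetical drawing view, using UIEvent's coalescedTouches(for:) to pull out the intermediate samples delivered with each touch event; the stroke-handling helper is a placeholder.

```swift
import UIKit

// A minimal sketch of reading coalesced touches in a hypothetical drawing view.
// coalescedTouches(for:) returns the intermediate samples (including the extra
// 120Hz samples on hardware that scans that fast) delivered with this event.
class CanvasView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }
        let samples = event.coalescedTouches(for: touch) ?? [touch]
        for sample in samples {
            // Append every sample to the stroke instead of just the latest position,
            // so the drawn line follows the finger more closely.
            addPointToCurrentStroke(sample.location(in: self))
        }
    }

    // Hypothetical helper; stroke storage and drawing are omitted in this sketch.
    private func addPointToCurrentStroke(_ point: CGPoint) {}
}
```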

The last feature Apple is implementing in iOS 9 to reduce input latency is predictive touch. Put simply, this is information developers can access to get some idea of where the user's finger is moving on the display. It allows them to update views in advance using the estimated information; once actual information about the user's movement has been received, they can undo the predicted changes, apply the actual changes, and then apply new predictions based on the user's latest movements. Because Apple provides predicted touches for one frame into the future, this technique can reduce apparent input latency by another frame. Combined with the improvements to the input pipeline, this drops latency as low as two frames on most devices, and on the iPad Air 2 the effective touch latency can now be as low as 1.5 frames, an enormous improvement over the four frames of latency that iOS 8 imposed as a bare minimum.
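Touch prediction is exposed alongside the coalesced touches. This sketch shows the general undo-then-reapply pattern described above, with hypothetical helpers standing in for the app's actual drawing code.

```swift
import UIKit

// A minimal sketch of touch prediction layered on top of coalesced-touch handling.
// predictedTouches(for:) returns the system's estimate of where the touch is
// headed; predicted points are drawn provisionally and replaced once the real
// samples arrive with the next event.
class PredictiveCanvasView: UIView {
    private var predictedPoints: [CGPoint] = []

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first, let event = event else { return }

        // 1. Throw away last frame's predictions before committing real data.
        removeProvisionalPoints(predictedPoints)
        predictedPoints.removeAll()

        // 2. Commit the actual (coalesced) samples.
        for sample in event.coalescedTouches(for: touch) ?? [touch] {
            commitPoint(sample.location(in: self))
        }

        // 3. Draw the new predictions provisionally.
        for predicted in event.predictedTouches(for: touch) ?? [] {
            let point = predicted.location(in: self)
            predictedPoints.append(point)
            drawProvisionalPoint(point)
        }
    }

    // Hypothetical helpers; real storage and drawing code is app-specific.
    private func commitPoint(_ point: CGPoint) {}
    private func drawProvisionalPoint(_ point: CGPoint) {}
    private func removeProvisionalPoints(_ points: [CGPoint]) {}
}
```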

App Thinning

Apple gets a lot of criticism for shipping only 16GB of NAND in the starting tier of most iOS devices. I think this criticism is warranted, but it is true that Apple is putting effort into making those devices more comfortable to use in iOS 9. The most obvious improvements are to their cloud services, which allow users to store data like photos in iCloud while keeping downscaled 3.15MP copies on their devices. Unfortunately, iCloud still only offers 5GB of storage for free, which really needs to be increased significantly, but they have increased the storage of their $0.99 monthly tier from 20GB to 50GB, dropped the price for 200GB from $3.99 to $2.99 per month, and reduced the 1TB tier to $9.99, while eliminating the 500GB option that previously existed at that price.

In addition to iCloud, iOS 9 comes with a number of optimizations to reduce the space taken up by applications. Apple collectively refers to these improvements as app thinning, which has three main aspects.

The first aspect of app thinning in iOS 9 is called app slicing. This refers to something that honestly should have been implemented five years ago: only including the assets that a device actually needs, rather than shipping a package that includes assets for every device. For example, current applications come with 1x, 2x, and 3x scale bitmaps in order to work across every iOS device. Anyone who owned an iPad 2 may have noticed that app sizes inflated significantly after the release of the iPad 3, and that's because updated apps had to store all of the 2x resolution bitmaps even though they were completely irrelevant to that device. With the introduction of ARMv8 devices this problem has gotten even more significant, as app packages now include both 32-bit and 64-bit binaries. GPU shaders for different generations of PowerVR GPUs also contribute to package bloat.

App slicing means that applications will only include the assets they require to work on the device they are downloaded onto. The analogy Apple is using is that an application only needs a single slice of the assets a developer has made. What's great about app slicing is that any application already using the default Xcode asset catalogs will automatically be compatible, and the App Store will handle everything else. Developers using custom data formats will need to make an asset catalog and opt into app slicing.
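"Automatically compatible" really does mean no code changes here: assets in a catalog are looked up the same way as before, and the store simply stops shipping the irrelevant slices. A trivial sketch, with a hypothetical asset name:

```swift
import UIKit

// "HeroBanner" is a hypothetical asset catalog entry with 1x/2x/3x variants.
// The lookup is unchanged; the App Store just delivers only the scale variants
// this device needs, so the other slices never take up space on it.
let banner = UIImage(named: "HeroBanner")
print(banner?.scale ?? 0)   // e.g. 3.0 on an iPhone 6s Plus
```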

Apple's statistics show a significant reduction in app size when app thinning is used. Savings are typically in the range of 20-40%, although this will obviously depend on the application. I'm not trying to criticize what is genuinely a good feature, but it's hard to believe it's only being implemented now. It's likely that the introduction of the iPhone 6 Plus increased the need for it: apps now have to include 3x assets, whereas carrying 1x assets alongside the 2x ones wasn't a huge deal back when most devices were retina and the 1x assets weren't very large.

The next aspect of app thinning is on-demand resources, a feature that's fairly self-explanatory. Developers can specify files to be downloaded only when they're needed, and removed from local storage when they're no longer in use. This can be useful for games with many levels, or for tutorial levels that most users will never end up using.
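The API behind this is NSBundleResourceRequest, which downloads resources marked with developer-assigned tags and lets the system purge them later. A minimal sketch, assuming some assets have been tagged "level-5" in Xcode:

```swift
import Foundation

// Request the assets tagged "level-5" (a hypothetical tag) only when the player
// actually reaches that level.
let levelRequest = NSBundleResourceRequest(tags: ["level-5"])
levelRequest.beginAccessingResources { error in
    if let error = error {
        print("Could not fetch level 5 assets: \(error)")
        return
    }
    // The tagged resources are now available through the normal bundle APIs.
    // ... load and play level 5 ...
    // When finished, tell the system the resources may be purged if space is needed.
    levelRequest.endAccessingResources()
}
```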

The third and final aspect of app thinning is bitcode. Bitcode is an intermediate representation of an application's binary, which allows the App Store to recompile and optimize an application for a specific device before it gets sent to the user. This means the application will always be compiled with the latest compiler optimizations and can take advantage of optimizations specific to future processors. While I've mainly focused on how app thinning trims the space applications occupy, bitcode is aimed more at thinning apps in the sense of keeping them as fast as they can be.

Much like the situation with RAM on iOS devices, my favored solution for users having issues with storage is to simply give them more storage. With the iPhone 6s and 6s Plus still starting at 16GB, it looks like the physical storage situation on iOS will remain unchanged for some time. At the very least, Apple is making an effort to come up with software improvements that free up space on devices. This still isn't an optimal fix, and it would be even better to implement these changes while also giving users more storage, but any improvement is certainly welcome. Implementing these changes in ways that don't require much hassle on the part of developers is something they will appreciate as well.

Comments

  • Speedfriend - Thursday, September 17, 2015 - link

    @melgross I have recently seen numerous tablets being used by businesses (restaurants, delivery companies) that were clearly no-name Android tablets designed for those specific tasks. Why would a corporation that needs a tablet for a single task buy a $500 iPad when it can get a $200 Android?

    iPad is now caught in the middle between cheap single-task Androids and multi-task Windows 2-in-1s. Our CEO is obsessed with Apple products but we have gone Windows tablets and it looks like we are going to go full Surface range soon (3s and Pros). Why? Because an iPad is too limited even as something you just take to meetings with you.
  • FunBunny2 - Wednesday, September 16, 2015 - link

    -- a respected history of hardware-design and innovation..

    really, really now? Apple has always bought their silicon; 99.44% of it is off-the-shelf. Yeah, I know, the fanbois brag that the Ax chips are somehow blessed by Apple. Fact is: Apple only tweaked around the edges, using industry standard silicon design tools, a bit of cache added here and there. Just look at the BoM from any of the usual teardown sites. You'll see the fact: it's always other people's parts.
  • osxandwindows - Wednesday, September 16, 2015 - link

    So why is Apple not using 8-core chips, huh?
  • Intervenator - Wednesday, September 16, 2015 - link

    I hope that post was sarcastic, because otherwise it would really be funny.
  • damianrobertjones - Thursday, September 17, 2015 - link

    ...Because that would be too far a jump. Apple wants to MILK its customers for everything they can, hence the small updates. Someone like Nokia committed a mortal sin when they released a 41MP (36+5MP) camera phone while others are still messing with 20MP.

    All about the cash.

    P.S. Android NEEDS more cores as it runs like a bag of crap.
  • calden - Monday, September 28, 2015 - link

    Actually Android runs just as smooth as iOS. The problem is skinned, custom versions of Android, i.e. TouchWiz. When I replaced TouchWiz with CM 12.1 on my Note 4, the system took up only 580MB, whereas TouchWiz took up more than 1.5GB before a single app was even installed. I also installed the launcher SmartLauncher 3, and the whole experience is lightning fast, even when running multiple apps in the background, something iOS still can't do.

    I think it is ridiculous that a modern OS in 2015 cannot do something as simple as stream a movie to your TV and still allow you to use the device; iOS simply pauses, and in some cases even disconnects the stream, if you want to do something as simple as look up an actor's name on IMDB. With my Android tablet I can not only stream a film to my TV but play a game like Modern Combat 5 at the same time. As a programmer I need to run a terminal app to stay connected to my firm's server during trading hours because I have monitoring tools. iOS has terminal apps as well, but I can't run them in the background the entire day without iOS terminating their connections. Again, I find this absolutely ridiculous, as who wants to stare at a terminal the entire day, especially when I need access to my tablet or phone for other tasks.

    Apple adding Pro behind the iPad doesn't automatically make it a pro device. iOS still has one of the worst document and file management systems on the market today. My Nokia 9500 from 2004 is light years better than what iOS provides; apps should never be allowed to manage their own files. Default apps: I still can't change the default apps in iOS, why? I have no use for Apple's included apps, and if I had the choice I would immediately delete them from the system. As such I need the ability to select my own browser, email client, messenger, media player, etc. as the default applications. I find this tactic of not being able to select my own default apps in iOS highly anti-competitive. The EU went after Microsoft for including Internet Explorer in XP, even though the user had the option of choosing another browser as their default. Why hasn't Apple been scrutinized for this?
  • calden - Monday, September 28, 2015 - link

    I'm aware of those few audio and GPS apps that can run in the background in iOS, but that is a far cry from allowing any app the user needs to run in the background. No, this has nothing to do with battery life, and if it does then Apple really needs to rewrite iOS. My BlackBerry Passport, running three apps in the background, easily lasts the entire day on a single charge; actually it lasts a day and a half with moderate to heavy use. Android has the ability to select how many apps are allowed to run in the background, and you can even set it to 1, so if people feel like their apps are eating up their battery they can control the number of running apps. Apple could easily implement such a feature, but they don't, which means they have all the control; they dictate how users use their own devices.

    iOS is a wolf in sheep's clothing: it looks pretty and inviting, but once you start to do real work you hit a brick wall 20 stories high. How many times have you iOS users logged into iCloud on your device? I had to do it over 25 times to cover every app. Why do I need to log in even twice? Once should be enough. On Android, once I set up my Google account that was it; from that point on every app that could communicate with Google Drive would automatically be set up. This is because the apps talk to the system at the lowest level, while iOS relies on spaghetti APIs, a spiderweb of tunnels trying to pass info to each other. The Share To function in iOS works only if the app developer has created a share profile. Why can't the system just dynamically build these Share To lists, like Android 5.1.2, Sailfish 2.0, Windows Mobile 10, and BlackBerry OS 10.3.2 do, by looking for every compatible app that is installed and then listing it? No, instead iOS uses this half-assed API system.

    What about multi-user support? It will never happen in iOS because of the way it handles files. To support multiple users in iOS, each user would have to reinstall each app all over again to keep users distinct. They could embed the user's info in the file's metadata so the app can distinguish each user, but that is hacky at best, and how would these modified files behave when used on other systems? iOS is definitely not a pro system, and anyone thinking differently is either lying to themselves to protect their beloved Apple brand, isn't a professional themselves and so doesn't really understand the meaning, or is working around these limitations, fighting the system at every point to get their work done, which falls in line with point one: they're lying to themselves.

    I'm not saying that iOS devices don't have their uses; they do. They make great consumer devices for media consumption, social media, gaming, drawing and other artistic apps, music and music creation, etc. However, as a productivity tool these devices are highly limited and can't compete with the likes of a Surface Pro 3 or even a Surface 3. Even an Android tablet would be a better option. With my Nexus 9 I can log into LDAP and gain access to all of my allowed NAS storage and mount it as a local asset, set file extensions to open certain apps, etc. Trying to do this in iOS is like trying to put a round peg into a square hole: you can do it with a bit of force, but you're going against its designed purpose. Apple needs to completely rewrite iOS and bring over many functions found in OS X before I would ever consider using another mobile device from Apple.
  • mikhapop - Monday, September 28, 2015 - link

    You really nailed it. I am a web developer and I often fail to explain to my friends how iOS is very limiting for even the basic stuff (my basic stuff). Android is far better as far as the OS goes. Now I am using a Surface Pro 3 and have never looked back; it's very good in meetings, and it is now my main machine for 98% of my work.
  • blackcrayon - Wednesday, September 16, 2015 - link

    Sounds like you know close to nothing about the Ax chips. They are custom Apple designs, and they also optimize their OS for them. I bet you thought Intrinsity and PA Semi were just marketing facades that didn't actually do anything before Apple acquired them years ago, right?
  • KoolAidMan1 - Wednesday, September 16, 2015 - link

    Apple spent billions acquiring semiconductor companies and is one of the few companies, along with Qualcomm, that has a license to make ARM chips. Anand himself highlighted this while showing that Apple's custom designs matched or exceeded Intel's Bay Trail.

    You really think their custom designs are something to be dismissed just because of the name on the package? The fanboy is strong in your posts.
