Paranoid Android’s Halo, Apple’s Assistive Touch, and the future of smartphone UI

When Google showed off Android 3.0 Honeycomb in February 2011, to me it signaled the beginning of the end for hardware buttons. Practically every mobile device sold before that demo had included at least one button on the face of the device. Apple has the iconic home key, most Android phones used to have four dedicated capacitive keys, and if you look back far enough, Palm OS and Windows Mobile both had similar controls.

Earlier this year I escaped Finland’s brutal winter and spent 2.5 months exploring Southeast Asia: Thailand, Taiwan, Malaysia, the Philippines, Singapore. My passport now has two full pages of stamps. During my travels I paid close attention to the devices people were using and how they were using them. Regardless of the city I was in, it felt like half the people living there used iPhones.

That didn’t shock me.

What shocked me was seeing how people used their iPhones on the subway, on the bus, on the train, what have you. They never pressed their home keys, not once. Instead they pressed an on-screen circle that was always hovering over whatever app they happened to be in. Later I’d learn that this feature is called “Assistive Touch”.

When Facebook unveiled Chat Heads to the world last month, to me it seemed like Apple’s Assistive Touch, except with the ability to pull up a chat window. And this morning I saw a four minute video on The Verge demoing “Halo”, which is basically a clone of Chat Heads for Android, except that it notifies you of everything, not just new Facebook messages.

Now let’s talk about the future. Stock Android, pretty as you might think it is, has a tragic flaw. The bottom of your screen is “wasted” showing buttons for back, home, and multitasking. I wouldn’t be surprised if a newer version of Android did away with those buttons and moved to an Apple “Assistive Touch” style UI.

I also wouldn’t be surprised to see the entire mobile industry move towards this UI model as well.

Think about it. Pulling down a notification shade was terribly easy back in 2010 when the 3.7 inch Nexus One was around. But in 2013, with the 5 inch Samsung Galaxy S4 and the 5.5 inch LG Optimus G Pro, hitting the top of the screen is a gesture that requires far too much effort.

It probably isn’t going to happen this year, but I have a strong feeling that in 2014 we’ll see Apple ditch the home button and Google introduce a new Android interaction model, both of which are focused around a floating orb that once pressed allows you to interact with everything on your device.

I’m excited, because I love large screens, but at the same time I hate the inefficient use of screen real estate. If we can move those UI controls to a little ball that can be placed wherever an individual prefers it to be, then we’ll push smartphone UI that much further forward.


  • lalala

Here in Malaysia, Assistive Touch is also popular on Androids.

  • Thanks for sharing how an accessibility feature has become mainstream. It is often the case that features inspired by disabilities have value for the wider market: high contrast UIs with white text on black backgrounds, for instance.

    Your notion of bringing controls to the current position of the finger, rather than requiring the finger to go to a set control zone is also interesting. However, it is worth considering a couple of things: the current model, where you have a set place for generic controls like home and back, as well as contextual menus which appear when you touch and hold, makes an implicit separation in the user’s mind. This can aid usability and gives more flexibility in ensuring you can have several generic and contextual controls without the risk of overcrowding if they were cluttered together.

It may seem overly simplistic to assign a single function to its own hardware home button, but one of the reasons iOS devices remain more intuitive to the majority is the low cognitive load in knowing there is always one button which does a single thing: taking you back to a familiar place, where you have your bearings and know how to continue your journey. Power users will value the option to customise, and should have it, but out-of-the-box usability for the majority is best served by this single-function, home-based navigation model.

    • Stefan Constantinescu

      The iPhone introduced a new interaction model over half a decade ago, if a new one is born then people will quickly adapt. I’m a firm believer in that. Pull to refresh is something everyone cloned seemingly overnight, and I wouldn’t be surprised to see the same thing happen to Chat Heads.

      I agree that the iPhone is drop dead easy to use, but I think you underestimate people’s ability to learn new gestures.

  • The question – for me – is *where* do you put that control? Either you do like iOS and have to move it every time it obscures a UI component, or it has a fixed position.

    When I was using the BB Q10 I was really impressed by the fact that you could swipe from the bezel. Although, in BB’s case, it only worked 75% of the time – the rest of the time it either didn’t register or was picked up by the app as a vertical scroll.

    Having flipped back and forth between devices with and without buttons, I prefer having a single physical button. I’d like it to work more like that iOS example – where multiple options spring from it. But, of course, with screen size being king, I can’t see obvious buttons staying around for too long.

    Perhaps using the proximity / camera sensor? If your finger is over the lens, pop up the controls?

    • Stefan Constantinescu

      I like your line of thinking. A button can stay, but you change the definition of a button, which in your case is a finger over a camera sensor. I’m thinking something more like a touch enabled strip on the bezel of the device. It’s not as old school as a jog dial, but it’s modern enough that it just might work.

I take it you’ve seen the Google Glass video where the demo is all about gestures you perform on a touch sensor located on your temple. It makes sense to copy that idea.

      All I know is that I want change, and having mechanical buttons just seems off these days. Plus, I can’t figure out how Apple could put a 4.3 or 4.5 inch screen in a device that stays small without simply getting rid of the home button.

  • Olivier

In Asia, most people have Assistive Touch and a portable battery charger because the two main problems with the iPhone are:
– the home button becomes unresponsive very quickly
– the battery drains too fast when you are on LINE all the time
(or at least that’s the reason my friends gave me)

I also used Assistive Touch when my iPhone 4 had its home button broken, but as soon as I got my iPhone 5, I stopped using it. I can’t prove it, but Assistive Touch seems to be used only as a replacement for the home button; no one has told me they used it for anything different.

Another interesting and different usage from Asia is how people are using Line/WeChat. It is really popular to send short voice messages instead of typing. I haven’t seen that in Europe or America yet, and I doubt it will ever come (but I am quite bad at predicting the future).

    • Olivier

For me the future should be like:
1 – MeeGo: no freaking home (or any other) physical button, or buttons that are always there (like back). It should be a gesture (a swipe!) that performs the action everywhere.

2 – the Facebook messaging/alert system, or “true multitasking”. Meaning that my screen can show me more than one thing at a time and I can do multiple things without having to jump in and out between apps.

That makes me think of something really shitty about Windows 8. Their new Skype covers the whole screen, or at least a quarter of it, which makes the app unusable on a laptop (unless you want to give up the multitasking functionality).
By the way, for me the iOS multitasking model is not true multitasking, as you always have the feeling of going in and out (unlike MeeGo again…)