Apple gave us iBeacon with the launch of iOS7, opening up a wave of innovation around proximity-based experiences that shows no signs of slowing down.
But iOS8 is orders of magnitude more significant. At its Worldwide Developers Conference, Apple didn’t just tweak a few iBeacon settings or add a few new classes to play with. Instead, they advanced a paradigm for computing that will change the Apple user experience forever.
iOS8 is a Paradigm Change, and It’s Built Around iBeacon
Now…I generally dislike the phrase ‘paradigm change’. But in this case I use it without reservation.
Because Apple has sketched out their forward vision for computing and how we relate to our machines, a vision that is in some ways deeply surprising but in others was foreshadowed by everything we’ve learned from playing around with beacons.
Their vision will keep them on a philosophical, strategic and tactical collision course with Google, Amazon, Samsung and (to a lesser but still significant degree) the Facebooks and Pinterests of the world.
That battle will make yesterday’s fights for search, online eyeballs or mobile market share look like mild little warm-up exercises.
Because the new battle isn’t to see who can sell the most phones or who will own the biggest slice of ad dollars.
The new battle is to see who wins in the digitization of physical reality. And it’s a battle built around beacons.
iBeacon: The Gateway Drug to the Internet of Everything
When Apple opened up support for Bluetooth LE proximity detection in iOS7, the implications for retail and ‘destination venues’ were readily apparent. Walk down the aisle of a grocery store, and the promise of beacons was that they could deliver a coupon in front of the cereal boxes. Wander through a museum, and beacons could deliver biographies or art commentary in front of a specific painting.
But we proposed that the significance of Bluetooth LE beacons was more profound.
Because as you peeled back the layers of an iBeacon (the Apple-certified version of a BLE beacon) you’d quickly understand that:
- iBeacon represents proximity, not location. And while beacons work well with maps, proximity is a completely different thing. Proximity means that you can locate users not just in place (by geolocating your beacon) but also by how close they are to something that is moving or has moved; by their connection not just to locations but to people and things; and that distances matter: you can push one message out at 80 meters and another at 8 inches.
- iBeacon and BLE beacons aren’t just devices you can stick to a wall. Your phone can be a beacon. You can put a beacon on a dog, or your child, or a piece of luggage. The shop assistant can be a beacon, and their phone can detect nearby ‘customer beacons’.
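That “distances matter” idea can be sketched in a few lines of Swift. In a real app the distance estimate would come from Core Location while ranging beacons (the accuracy property of a ranged CLBeacon); here the ranging is stubbed out with a plain number, and the tier thresholds and message strings are illustrative assumptions, not anything Apple specifies:

```swift
// Illustrative sketch: mapping an estimated beacon distance (in meters,
// as a real app would read it from a ranged CLBeacon's accuracy value)
// to a message tier. Thresholds and messages are made-up examples.
enum ProximityTier {
    case immediate, near, far, unknown
}

func tier(forDistance meters: Double) -> ProximityTier {
    // Core Location reports a negative value when it has no estimate yet.
    guard meters >= 0 else { return .unknown }
    switch meters {
    case ..<0.5: return .immediate  // right at the shelf ("8 inches")
    case ..<10:  return .near       // same aisle
    default:     return .far        // across the store ("80 meters")
    }
}

func message(for tier: ProximityTier) -> String {
    switch tier {
    case .immediate: return "Here's a coupon for the cereal in front of you"
    case .near:      return "Deals in this aisle"
    case .far:       return "Welcome! Open the app for today's offers"
    case .unknown:   return "No proximity estimate yet"
    }
}
```

A shipping app would feed this from Core Location’s ranging delegate callback rather than raw numbers, but the point stands: the same beacon can drive entirely different experiences at different distances.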
The longer you spend thinking about beacons, the more you realize that the old rules don’t apply: UX design needs a new paradigm, one in which the physical world becomes an interface and meaningful connections to moving objects and people suddenly become possible.
Because of this, we called iBeacon the “gateway drug to the Internet of Everything”: wherever you started with beacons, you ended up in a conversation about connected physical spaces that would respond and talk back.
Your devices were once (mostly) blind to the world around them – but with iBeacon they now could see.
The World: Visible
iBeacon wasn’t mentioned on the main stage at the WWDC Keynote. The workshops didn’t reveal some kind of massive change in how beacons themselves function.
And yet Apple went further. Because the undercurrent to what Apple revealed about its vision for the future was imbued with beacons. And for a simple reason:
If beacons allowed our phones and tablets to see the world around them in a new and profound way, Apple has now launched a new philosophy and approach to the purpose of computing: connecting it to the physical world.
- HomeKit gives us new tools for the connected home
- HealthKit gives us new tools for the quantified (and very physical) body
- Continuity gives one device a sense of presence with another, allowing tasks to transition seamlessly between them
- MapKit has been extended to allow us to map internal spaces
- TouchID will let your fingerprint do more than just unlock your phone: developers can now use it to secure transactions
- Home screen widgets and notifications will “rule the interface” and be used to create more granular interfaces to content as we move through our busy lives
- Apps that are specific to a location will be recommended on the home screen of your phone or tablet
Even apps themselves have been busted out of their silos. Extensions will allow apps to share data and functionality with each other; they might also be how one beacon-enabled app gives another access to its beacon networks.
Even Metal, I’d argue, which lets games and other apps tap into the raw processing power of the on-board graphics chip, will, yes, make phones more immersive. But it will also open them up to realistic simulations, more advanced augmented reality, and new and improved versions of Flappy Bird. (OK, that last one we can cross off.)
The biggest announcement out of WWDC for developers was the launch of an entirely new programming language. (My jaw dropped to the floor, and I was at once thrilled and saddened as I glanced over at all the damn iOS7 books I bought in order to teach myself how to code in Objective-C.)
But even Swift carries with it the hints of a language that’s primed and ready for the Internet of Everything – and holds within it the prospect of a more distributed form of code deployment where smaller blocks will power smart watches or thermostats while “deep apps” can be developed more efficiently and with more power under the hood.
User Up, Data Down
Apple, of course, isn’t alone in recognizing that the concept of beacons is one way to understand the emergence of a fully connected world. Google is about to announce its own plans and is expected to build Google Nearby, which will use the presence of beacons along with other technologies to allow deeper detection of the world around you.
It will be a welcome development. With KitKat adoption finally gaining steam, there are hundreds of millions of beacon-enabled Android devices coming online, and Android has been behind the curve in offering robust developer tools compared to Apple’s beacon frameworks.
But enhanced support for beacons and proximity won’t change a fundamental difference between Google and Apple which Benedict Evans so brilliantly points out:
For Google, devices are dumb glass and the intelligence is in the cloud, but for Apple the cloud is just dumb storage and the device is the place for intelligence. And it’s built a whole new set of APIs, CloudKit, to enable this for developers, which it is (for the first time, I believe) dogfooding, building the photos product on it.
There’s a release cycle question in here. A phone that’s refreshed every year or replaced every two can iterate and innovate much faster than a TV, car (or fridge, or, perhaps thermostat) that may be replaced only every five or ten years. So it seems like the place for the intelligence should be in the phone rather than the TV. But the extension of this is that a cloud product can iterate every day. This is the killer advantage of enterprise SaaS over on-premises software – you can improve things all the time. And Apple updates its OS once a year and, so far, the same is true for the cloud products it builds for developers, where Google can update all of its products every week.
I agree. And yet.
The focus on the ‘cloud’ is actually yesterday’s battle. The real question isn’t whether you can iterate faster in the cloud or on the device. The real question is which ‘ecosystem’ has a more direct path to winning the real battle: ownership of the digitization of the physical world.
Nothing else matters. You can argue who “gets” cloud more. But the war for the cloud, the war for online advertising – they’re nothing, they’re parlor games compared to where Apple is taking us with beacons.
The physical world, until now mostly ‘dumb’ and disconnected from our devices, is, for better or worse, waking up, and our devices are responding.
There is no offline. And last week, Apple demonstrated that nearly everything it’s doing to enhance its platform is directed at that fact.
Share Your Thoughts
How will iOS8 change users’ perceptions about what’s possible? What was the most significant feature for the ‘becosystem’ out of WWDC? Drop a line in the comments below.