iOS8: Bigger Than iBeacon


Apple gave us iBeacon with the launch of iOS7. It opened up a wave of innovation around proximity-based experiences that has shown no signs of slowing down.

But iOS8 is orders of magnitude more significant. At its Worldwide Developers Conference, Apple didn’t just tweak a few iBeacon settings or add a few new classes to play with. Instead, they advanced a paradigm for computing that will change the Apple user experience forever.

iOS8 is a Paradigm Change and It’s Built Around iBeacon

Now…I generally dislike the phrase ‘paradigm change’. But in this case I use it without reservation.

Because Apple has sketched out its forward vision for computing and how we relate to our machines, a vision that is in some ways deeply surprising but in others was foreshadowed by everything we’ve learned by playing around with beacons.

Their vision will keep them on a philosophical, strategic and tactical collision course with Google, Amazon, Samsung and (to a lesser but still significant degree) the Facebooks and Pinterests of the world.

That battle will make yesterday’s fights for search, online eyeballs or mobile market share look like mild little warm-up exercises.

Because the new battle isn’t to see who can sell the most phones or who will own the biggest slice of ad dollars.

The new battle is to see who wins in the digitization of physical reality. And it’s a battle built around beacons.

iBeacon: The Gateway Drug to the Internet of Everything

When Apple opened up support for Bluetooth LE proximity detection in iOS7 it had implications for retail or ‘destination venues’ that were readily apparent. Walk down the aisle of a grocery store, and the promise of beacons was that it could deliver a coupon in front of the cereal boxes. Wander through a museum, and beacons could deliver biographies or art commentary in front of a specific painting.

But we proposed that the significance of Bluetooth LE beacons was more profound.

Because as you peeled back the layers of an iBeacon (the Apple-certified version of a BLE beacon) you’d quickly understand that:

  • iBeacon represents proximity, not location. And while beacons work well with maps, proximity is a completely different thing. Proximity means that you can locate a user not just in place (by geolocating your beacon) but also by how close they are to something that is moving or has moved; by their connection not just to locations but to people and things; and because distances matter, you can push one message out at 80 meters and another at 8 inches.
  • iBeacon and BLE beacons aren’t just devices you can stick to a wall. Your phone can be a beacon. You can put a beacon on a dog, or your child, or a piece of luggage. The shop assistant can be a beacon, and their phone can detect nearby ‘customer beacons’.
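As a rough sketch of what distance-aware messaging looks like in code, Core Location lets an app range nearby beacons and branch on how close each one is. The UUID, identifier and messages below are hypothetical placeholders, and the class is a minimal sketch rather than a production pattern:

```swift
import CoreLocation

// A minimal sketch of proximity-based messaging with beacon ranging.
// The UUID, region identifier and messages are hypothetical.
class BeaconRanger: NSObject, CLLocationManagerDelegate {
    let manager = CLLocationManager()
    let region = CLBeaconRegion(
        proximityUUID: UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!,
        identifier: "com.example.cereal-aisle")  // hypothetical identifier

    func start() {
        manager.delegate = self
        manager.requestWhenInUseAuthorization()
        manager.startRangingBeacons(in: region)
    }

    func locationManager(_ manager: CLLocationManager,
                         didRangeBeacons beacons: [CLBeacon],
                         in region: CLBeaconRegion) {
        for beacon in beacons {
            switch beacon.proximity {
            case .far:        // tens of meters: a broad welcome
                print("Welcome to the store")
            case .near:       // a few meters: aisle-level content
                print("Cereal is on sale this week")
            case .immediate:  // inches: item-level content
                print("This box: 2-for-1 today")
            default:          // .unknown: no reliable estimate yet
                break
            }
        }
    }
}
```

The same `CLBeacon` objects also expose `rssi` and an estimated `accuracy` in meters, which is how apps distinguish the 80-meter case from the 8-inch one.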

The longer you spend thinking about beacons, the more you realize that the old rules don’t apply: that you need to think about UX design using a new paradigm: one in which the physical world becomes an interface, and meaningful connections to moving objects and people suddenly become possible.

Because of this, we called iBeacon the “gateway drug to the Internet of Everything”. Because wherever you started with beacons, you ended up in a conversation about connected physical spaces that would respond and talk back.

Your devices were once (mostly) blind to the world around them – but with iBeacon they now could see.

The World: Visible

iBeacon wasn’t mentioned on the main stage at the WWDC Keynote. The workshops didn’t reveal some kind of massive change in how beacons themselves function.

And yet Apple went further. Because the undercurrent to what Apple revealed about its vision for the future was imbued with beacons. And for a simple reason:

Where beacons allowed our phones and tablets to see the world around them in a new and profound way, Apple has now launched a new philosophy and approach for the purpose of computing: to connect it to the physical world.

  • HomeKit gives us new tools for the connected home
  • HealthKit gives us new tools for the quantified (and very physical) body
  • Continuity gives one app a sense of presence with another, allowing them to seamlessly transition tasks from one to another
  • MapKit has been extended to allow us to map internal spaces
  • TouchID will let your fingerprint do more than just unlock your phone; it can now be used by developers to secure transactions
  • Home screen widgets and notifications will “rule the interface” and be used to create more granular interfaces to content as we move through our busy lives
  • Apps that are specific to a location will be recommended on the home screen of your phone or tablet
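To make the TouchID point above concrete, the new LocalAuthentication framework lets an app request a fingerprint check before, say, confirming a purchase. This is a hedged sketch: the reason string, function name and what an app does on success are all placeholder assumptions, not Apple sample code:

```swift
import LocalAuthentication

// A minimal sketch of using TouchID beyond the lock screen.
// authorizePurchase is a hypothetical helper; only the LAContext
// calls come from the LocalAuthentication framework itself.
func authorizePurchase(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check that the device has TouchID hardware and an enrolled finger.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)
        return
    }

    // Prompt the user; the reply closure runs off the main thread.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Confirm your purchase") { success, _ in
        completion(success)  // true only if the fingerprint matched
    }
}
```

Note that the app never sees the fingerprint itself, only a yes/no answer, which is what makes it plausible for securing third-party transactions.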

Even apps themselves have been busted out of their silos. Extensions will allow apps to share data and functionality with each other – and might be how one beacon-enabled app gives access to its beacon networks to another.

Even Metal, which lets games and other apps tap into the raw processing power of the on-board graphics chips, will, I’d argue, make phones more immersive, but it will also open them up to realistic simulations, more advanced augmented reality and new and improved versions of Flappy Bird. (OK, that last one we can cross off.)

The biggest announcement out of WWDC for developers was the launch of an entirely new programming language. (My jaw dropped to the floor and I was both thrilled and saddened as I glanced over at all the damn iOS7 books I bought in order to teach myself how to code in Objective-C.)

But even Swift carries with it the hints of a language that’s primed and ready for the Internet of Everything – and holds within it the prospect of a more distributed form of code deployment where smaller blocks will power smart watches or thermostats while “deep apps” can be developed more efficiently and with more power under the hood.

User Up, Data Down

Apple, of course, isn’t alone in recognizing that the concept of beacons is one way to understand the emergence of a fully connected world. Google is about to announce its own plans and is expected to build Google Nearby, which will use the presence of beacons along with other technologies to allow deeper detection of the world around you.

It will be a welcome development. With KitKat adoption finally gaining steam, there are hundreds of millions of beacon-enabled Android devices coming online, and Android has been behind the curve in offering robust developer tools compared to Apple’s beacon frameworks.

But enhanced support for beacons and proximity won’t change a fundamental difference between Google and Apple which Benedict Evans so brilliantly points out:

For Google, devices are dumb glass and the intelligence is in the cloud, but for Apple the cloud is just dumb storage and the device is the place for intelligence. And it’s built a whole new set of APIs, CloudKit, to enable this for developers, which it is (for the first time, I believe) dogfooding, building the photos product on it.

There’s a release cycle question in here. A phone that’s refreshed every year or replaced every two can iterate and innovate much faster than a TV, car (or fridge, or, perhaps thermostat) that may be replaced only every five or ten years. So it seems like the place for the intelligence should be in the phone rather than the TV. But the extension of this is that a cloud product can iterate every day. This is the killer advantage of enterprise SaaS over on-premises software – you can improve things all the time. And Apple updates its OS once a year and, so far, the same is true for the cloud products it builds for developers, where Google can update all of its products every week.

I agree. And yet.

The focus on the ‘cloud’ is actually yesterday’s battle. The real question isn’t whether you can iterate faster in the cloud or on the device. The real question is which ‘ecosystem’ has a more direct path to winning the real battle: ownership of the digitization of the physical world.

Nothing else matters. You can argue who “gets” cloud more. But the war for the cloud, the war for online advertising – they’re nothing, they’re parlor games compared to where Apple is taking us with beacons.

The physical world, until now mostly ‘dumb’ and disconnected from our devices, is, for better or worse, waking up, and our devices are responding.

There is no offline. And last week, Apple demonstrated that nearly everything it’s doing to enhance its platform is directed at that fact.

Share Your Thoughts

Sign up for our weekly e-mail list for more on iBeacons. Join the conversation on Twitter, or connect with me on LinkedIn.

How will iOS8 change users’ perceptions about what’s possible? What was the most significant feature for the ‘becosystem’ out of WWDC? Drop a line in the comments below.

15 Responses to “iOS8: Bigger Than iBeacon”

  1. Jacob Emmerick

    What you have described fits the definition of an emergent phenomenon, wherein the seven things you list (and there are probably more) together create something much bigger and qualitatively different from the composition of its parts. Exciting stuff!

  2. I’m a bit puzzled why there is so little mention of what Apple *did* announce at WWDC – namely accurate indoor location without iBeacons. Before the acquisition, WiFiSlam was getting 1ish meter precision (with Android). Presumably they will do at least that well on iOS with more mature algorithms, motion sensors, etc. Yes, it requires a mapping process, but that’s likely way less work than installing, configuring and managing beacons at scale.

    So if you have accurate indoor positioning then proximity comes for free. Even the inclusion of iBeacons in the indoor location presentation at WWDC felt really forced to me. Sure there are some use cases for beacons, but *maybe* just a tiny fraction of what we thought was going to happen with them.

    Would love to hear other perspectives on this. I’m happy to be wrong.

    • Battery Life (Consumer > Vendors), Privacy (Randomised MAC?), Cost (Low Cost of Beacons + SaaS Pricing Model) & User Experience (For the people who are going to deploy the beacons)

      • Cost of piggybacking on existing infrastructure is basically zero, doesn’t require configuring/managing dozens/hundreds of devices in a venue, batteries don’t die, security issues are the same with randomized MAC addresses.

        And now you know where you are +/- 1 meter… as opposed to beacons where you are 4 meters, er 10, er 2, er, oh, my beacons stopped ranging so now I have to restart my phone. Would Apple let this problem persist for, what, 3 months now if they really cared about iBeacons? Something’s fishy.

        • You’re right that the existing infrastructure is already in place to support accurate indoor location, but these systems are not readily open for integration with iOS. By putting up an iBeacon specification, Apple now has control over the technology, down to the minor details.

          Accuracy of the beacons is in the centimeter range, not meters. An example is available in this video; check it out from 3:20 – 4:45 for the accuracy.

          • By the commercial release of iOS 8, Core Location will have support for indoor positioning using existing infrastructure. If you haven’t watched “Taking Core Location Indoors”, it’s worth an hour.

            As far as iBeacon accuracy goes… the variability of signal strength based on physical environment, device orientation, battery strength, code buried in Core Bluetooth/Core Location, star alignment, etc. is pretty well documented. What you didn’t see in the Scoble video is that if you look at signal strength, it bounces all around. Maybe it doesn’t matter to bounce between 1cm and 4cm, but there aren’t that many good use cases for immediate distances. In the real world, though, bouncing between 1m and 4m is huge in a retail environment.

            I know I sound like a hater, but I’m not… I’m in beta with a product that uses iBeacons. But remember when iBeacons were announced and we thought that we could triangulate a few beacons and find out exactly where we were indoors? And what we got was a fundamentally flawed technology for this with dozens of immature product implementations and, a year later, almost no real commercial implementations. So we became iBeacon apologists and started talking about proximity instead of location. I don’t think iBeacons completely go away, but I do think 90% of what we thought they’d give us will be provided better by indoor positioning.

    • Jim Bonner

      I don’t agree about indoor localization being “more important” than iBeacons. That assumes that “location” is the only important use case. It’s just not true. Proximity is just as important as location.

      Sure, most iBeacon projects are lame, because people are jumping in too quickly and not thinking about it right. Too many retail coupons and not enough else. But that doesn’t mean Proximity is unimportant.

      For my own project, Proximity and Indoor Location work together synergistically. You must, I repeat MUST, have them both to make it work.

      • Certainly proximity is important, but if you get that for free with positioning, then why bother with an extra layer of cumbersome hardware, software and physical processes? What’s the point of sticking beacons everywhere if you know the lat/long of everything? I think it’s completely analogous to GPS. Triangulating from a handful of satellites is massively preferable to sticking beacons every 20 feet along roadways.

        • Larry Lee

          The “free” positioning that you mention comes at a cost of battery life for users. And in the IoT landscape, many devices (Google Glass, Pebble) are not equipped with GPS/WiFi chipsets to tap into the existing infrastructure to offer ‘context’ to users.

          I might be biased, but it seems to me that Apple’s tried to ensure all cases are covered and it’s up to developers to decide what’s most suitable for whatever they are building.

          Rather than forcing technology into a solution, consider if it’s necessary.

          • I certainly agree with your last statement… the real world is messy; we don’t know what the killer apps are; and the more stuff Apple et al. throw out there, the better for everyone.

            Having said that, indoor positioning is essentially free when users are engaged or where a device needs regular data connectivity (and thus would benefit from wifi’s efficiency vs. 3G/4G). I guess in a very sparse environment where you only occasionally have relevant content that wouldn’t be true. But I prefer to think of this stuff as an always-on digital overlay to the real world vs. something more precious to be rationed out. As far as BLE wearables, they all talk to your phone anyway so I don’t think they change the basic argument.

            I’m sure indoor positioning isn’t going to live up to the hype that will get generated. And beacons can provide a bit of extra ground truth to improve indoor positioning accuracy. But I’m just excited about an alternative scenario that is probably 10X easier to scale than beacons.

  3. Have you seen the “” Getting Started guide? Nothing special, just a start… but an official start by Apple.

    Bye…and thanks for your nice work with this website!

    Luciano – Italy

  4. This just popped on my feed today, so I know I am late, but still feel compelled to reply.
    I experimented with iBeacons for a time; the nagging notifications I got drove me nuts, so I shut the whole thing down.
    A few spoiled it for the many.
    Unless Apple gets some sort of control over how devs are abusing this, future articles will talk about the rejection-to-acceptance ratio of people who have this service turned off.
    I love how you are looking at the big picture and are getting excited, but talk to your friends and end users who are not geeks or devs and find out how excited they are about it…

