Bluetooth Beacons: Love, Intimacy and Seduction on the Internet of Things

Love Your Beacon

Bluetooth LE and iBeacon technology opens up a new level of intimacy with your user. The seemingly subtle difference between location and proximity hides a more profound idea: that we can now move past where someone is and begin to craft experiences based on how close they are to something.
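
For the technically inclined, here’s a rough sketch of that difference using Apple’s CoreLocation framework (the UUID, coordinates and identifiers below are placeholders, not anything you should ship): a geofenced region answers “where is the user?”, while ranging a beacon answers “how close are they to this particular thing?”

  import CoreLocation

  class ProximityVersusLocation: NSObject, CLLocationManagerDelegate {
      let manager = CLLocationManager()
      // Placeholder UUID: in practice, the UUID your own beacons broadcast.
      let beaconUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")!

      func start() {
          manager.delegate = self
          manager.requestAlwaysAuthorization()

          // Location: a geofence around the store tells us *where* the user is.
          let storeCenter = CLLocationCoordinate2D(latitude: 43.6532, longitude: -79.3832) // placeholder coordinates
          let geofence = CLCircularRegion(center: storeCenter, radius: 100, identifier: "store-geofence")
          manager.startMonitoring(for: geofence)

          // Proximity: ranging beacons inside the store tells us *how close*
          // the user is to a specific shelf or display.
          let aisle = CLBeaconRegion(proximityUUID: beaconUUID, identifier: "tv-aisle")
          manager.startMonitoring(for: aisle)
          manager.startRangingBeacons(in: aisle)
      }

      func locationManager(_ manager: CLLocationManager,
                           didRangeBeacons beacons: [CLBeacon],
                           in region: CLBeaconRegion) {
          // CLProximity is the "how close" signal; this sketch just takes the
          // first beacon reported rather than picking the nearest by accuracy.
          guard let beacon = beacons.first else { return }
          switch beacon.proximity {
          case .immediate: print("Right beside the display")
          case .near:      print("A few steps away")
          default:         print("In range, but not close")
          }
      }
  }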

Proximity technology is to location as watching a video on a tablet is to watching it on a TV screen: at first glance, both might seem to deliver much the same thing.

But even if we ignore the different affordances, such as gestures and interfaces, there’s a different level of emotion and touch when you’re holding content in your lap versus watching it at a distance on a large-screen TV.

I’ve learned a hard lesson, however: when you try to explain this by describing the technology, it’s hard for someone new to the concept of iBeacons and proximity tools to grasp its significance.

I’ll spend 10 minutes explaining the difference between NFC (proximity) and GPS (location) and how they create different user experiences.

But most people are about as unexcited as they are about knowing how much RAM they have in their computer. All they really want to know is whether it’s “really, really fast”.

Beyond Industrial and Digital Paradigms

At its ultimate level, IoT is the final merging of technology with self.

At first glance, the idea of sensors in the world around us sounds a lot like automation.

Sensors can adjust the temperature in our living room, detect how much traffic there is ahead, water the lawn, or track the shipment of widgets from China without any human intervention. Tasks that used to need us won’t need us anymore. The machines will get smarter and we won’t even really notice – life will just be slightly easier.

(Of course, the other view of automation might equally come true: that tomorrow’s Internet of Things will become yesterday’s remote control, nearly unusable because we end up with way too many buttons in our lives that need to be pressed for stuff to happen and we can’t remember where we put the manual.)

I think we’re preconditioned to view machines through either an industrial-era lens, in which machines are enablers of productivity and scale, or through a web-era lens, in which machines primarily produce data and we make knowledge of that data because it is interconnected with interfaces and people. Social media isn’t truly social: it’s simply a graph of data that’s given the affordance of an interface and a “like” button to connect that data to other people.

Neither of these lenses is invalid and both will produce profound technologies that will change our lives.

The quantified self movement is an example of extending these lenses into the Internet of Things: it views our physical bodies as data sources for the interfaces that underpin the web. Our smart watch measures our pulse and distance run, and our dashboards help us to create meaningful connections and knowledge from that data stream.

But I can’t help thinking that just as the difference between location and proximity might seem so subtle that it’s meaningless, the difference between yesterday’s web-era lens and tomorrow’s IoT-driven lens might also seem subtle while hiding a more profound shift in how we think about technology.

Beyond Context

Perhaps the closest language we have for this shift is embodied by the ideas of contextual and ubiquitous computing: the insight that the machines will become increasingly invisible.

A Bluetooth beacon doesn’t ask that you do anything: it talks to your ‘machine’ (your phone) and stuff happens without you needing to take action. It’s one of the reasons why retailers get so excited about its potential: aside from anything else, it can combat showrooming by delivering content before you have a chance to jump online and compare prices.

If you’re in front of the TV aisle at an electronics store, the retailer can create a dialog with you without asking you to open an app or do much more than just stand there. Because of this, the retailer might be able to create enough context and relevance that their customer won’t pop open their browser to shop on Amazon instead.
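
Under the hood, that ‘dialog’ can be as simple as background region monitoring plus a local notification. Here’s one hedged sketch, assuming iOS CoreLocation and UserNotifications (the copy, UUID and identifiers are placeholders): the phone is woken when it enters the beacon’s region and the message arrives before the customer has done anything at all.

  import CoreLocation
  import UserNotifications

  class AisleGreeter: NSObject, CLLocationManagerDelegate {
      let manager = CLLocationManager()

      func start() {
          manager.delegate = self
          manager.requestAlwaysAuthorization()
          UNUserNotificationCenter.current().requestAuthorization(options: [.alert]) { _, _ in }

          // Monitoring (as opposed to ranging) works while the app is backgrounded:
          // iOS wakes the app briefly when the phone crosses into the beacon region.
          let uuid = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")! // placeholder
          let tvAisle = CLBeaconRegion(proximityUUID: uuid, major: 1, identifier: "tv-aisle")
          manager.startMonitoring(for: tvAisle)
      }

      func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
          guard region.identifier == "tv-aisle" else { return }

          // The customer hasn't opened anything; we start the conversation anyway.
          let content = UNMutableNotificationContent()
          content.title = "Still thinking about that TV?"
          content.body = "Here's what it looks like in a living room, plus our price-match promise."
          let request = UNNotificationRequest(identifier: "tv-aisle-greeting",
                                              content: content,
                                              trigger: nil)
          UNUserNotificationCenter.current().add(request)
      }
  }

Monitoring is deliberately coarse (region in, region out), which is what lets iOS deliver it while your app is in the background; the finer-grained ranging only runs once the app has been woken up, which is why the hand-off between the two matters.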

Ubicomp and contextual computing conceive of a world in which interfaces become either calm or invisible. Theirs is the ultimate dream: to reduce the number of ‘clicks’ from start to finish to almost zero…removing user agency and driving right to the point.

This sounds like marketing and UX nirvana: bypass all the pesky mistakes and tangents a user can make and get right to the transaction.

But removing user agency feels like it contradicts one of the true paradigm-shifting possibilities of IoT.

In the pre-IoT era, the interfaces that lie in between mostly abstracted the user from the layer at which production-level value is created (the data layer). In the post-IoT era, the user is finally back in connection with the ‘thing’ that is the source of all the value: namely, reality.

There Is No Offline

We have become very intimate with our machines. Go to a coffee shop and you’ll mostly see people staring at laptops and cell phones. The screens abstract the Internet’s value layer into interfaces that let us view and interact with data and connections in order to extract our own value: whether knowledge, social currency, expressions of identity, forms of play or something else.

While IoT is both useful and a source of innovation and value when viewed through an industrial or web-era lens, its more profound implication might instead be that it removes the abstraction layer: not by making the machines or the interfaces invisible, but by relocating the value layer to the physical world.

IoT lets me track my pulse, my stride and my breathing when I go for a run. The data it creates can be abstracted through another screen. The technology itself can be calm, ubiquitous and invisible except when we go to make knowledge of that data through the abstraction of a dashboard or interface layer.

But viewed in reverse, the presence of IoT and nearly invisible computing does something else: it turns my physical body into the layer where the value is created within this larger ecosystem of connections and abstractions. I become conscious of that value, I become connected to its expression, and I end up with a heightened awareness of my body as the “layer” where “stuff” is happening.

Pre-IoT, unless you were a data scientist or networking engineer you were always one step removed from the value layer by the abstractions of interface.

Now the real world is the new digital infrastructure, we’re all being given permission to peek in and see how it works, and we’re all beacons on the Internet of Things.

From Seduction to Intimacy on the Internet of Things

The difference between location and proximity is the difference between flirting and seduction on the one hand and intimacy and love on the other.

If you’re designing a Bluetooth LE experience, you’ll still need to woo. You’ll still need to flirt across the room and get your customer to let you buy them a drink.

In the seduction phase you’re using mobile or web ads, signs in your shop window, flyers you hand out at the door: you’re trying to get your customer to pay attention with the ultimate goal being that they download your app and let you take them out on a date. You’re using GPS and geofencing and location-based advertising to invite them into your store.

It’s when they walk in the door and your customer gets passed over to Bluetooth LE that you can really start to get intimate. You can buy them flowers, you can compliment them on how nice they look, you can tell them stories and laugh about things in front of the candy display.
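
In code, that shift from seduction to intimacy is mostly a matter of ranging: once the app is awake, the beacon’s reported proximity (immediate, near, far) tells you whether your customer is standing right at the candy display or still a few steps away, and you change what you say accordingly. A minimal sketch, with the messages and the show helper standing in for however your app actually surfaces content:

  import CoreLocation

  class IntimacyRanger: NSObject, CLLocationManagerDelegate {
      // Placeholder for however your app actually surfaces content:
      // a notification, an in-app card, a change of screen.
      func show(_ message: String) { print(message) }

      // The same beacon, three different conversations, depending on how
      // close the customer is standing.
      func locationManager(_ manager: CLLocationManager,
                           didRangeBeacons beacons: [CLBeacon],
                           in region: CLBeaconRegion) {
          guard let beacon = beacons.first else { return }
          switch beacon.proximity {
          case .immediate:
              // Right at the display: the compliment, the story, the flowers.
              show("Those truffles in front of you are made two blocks from here. Want a sample?")
          case .near:
              // A few steps away: a gentler nudge.
              show("The candy display is just ahead on your left.")
          default:
              // In the store but not close yet: stay quiet, or stay general.
              break
          }
      }
  }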

We tell marketers that Bluetooth LE isn’t the answer: it’s part of a larger user experience. Just like it’s not usually advisable to, um, pay to be intimate with your customer…it also isn’t advisable to think of Bluetooth LE as a short cut to intimacy. You still need to woo them, seduce them, and treat them real well – and no amount of ubiquitous or contextual computing is going to change that.

At the very granular level of your application architecture, you’re going to need to think about (there’s a sketch of how these pieces hand off to one another right after this list):

  • Online and offline tools to help prompt an app download
  • Email and other ‘reminder’ campaigns to bring users into the vicinity of your location
  • GPS-driven geofencing and geoaware technologies to help locate and do initial app ‘wake-up’
  • Alternate ways to push notifications to address different user scenarios
  • Bluetooth LE beacon detection for initial beacon ranging
  • Beacon ‘pairing’ and other technologies for close proximity messaging and app content changes
  • Back-end systems for content management
  • Integration with CRM, social media and APIs
  • Back-end analytics
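
To tie a few of those pieces together, here’s one hedged sketch of the hand-off described above: a GPS geofence does the initial wake-up, beacon monitoring takes over near the door, ranging drives the close-proximity content, and everything reports back. The fetchContent and logAnalytics helpers are placeholders for whatever CMS and analytics back end you actually run.

  import CoreLocation

  class BeaconJourney: NSObject, CLLocationManagerDelegate {
      let manager = CLLocationManager()
      let beaconUUID = UUID(uuidString: "E2C56DB5-DFFB-48D2-B060-D0F5A71096E0")! // placeholder

      // Placeholders for your content management and analytics back ends.
      func fetchContent(for beacon: CLBeacon, completion: (String) -> Void) { completion("Hello, aisle five") }
      func logAnalytics(event: String) { print("analytics:", event) }

      func start() {
          manager.delegate = self
          manager.requestAlwaysAuthorization()

          // 1. Seduction: a wide GPS geofence around the store does the initial wake-up.
          let storeCenter = CLLocationCoordinate2D(latitude: 43.6532, longitude: -79.3832) // placeholder
          let geofence = CLCircularRegion(center: storeCenter, radius: 150, identifier: "store-geofence")
          manager.startMonitoring(for: geofence)
      }

      func locationManager(_ manager: CLLocationManager, didEnterRegion region: CLRegion) {
          if region.identifier == "store-geofence" {
              // 2. Hand-off: once they're in the neighbourhood, start listening for beacons.
              let inStore = CLBeaconRegion(proximityUUID: beaconUUID, identifier: "in-store")
              manager.startMonitoring(for: inStore)
              logAnalytics(event: "entered geofence")
          } else if let beaconRegion = region as? CLBeaconRegion {
              // 3. Intimacy: inside the store, start ranging for close-proximity content.
              manager.startRangingBeacons(in: beaconRegion)
              logAnalytics(event: "entered beacon region \(beaconRegion.identifier)")
          }
      }

      func locationManager(_ manager: CLLocationManager,
                           didRangeBeacons beacons: [CLBeacon],
                           in region: CLBeaconRegion) {
          guard let nearest = beacons.first, nearest.proximity == .immediate else { return }
          // 4. Content and measurement: ask the CMS what to say, then report back.
          fetchContent(for: nearest) { message in
              print(message) // in a real app, deliver via a notification or in-app UI
          }
          logAnalytics(event: "immediate proximity at major \(nearest.major), minor \(nearest.minor)")
      }
  }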

Where The Eyes Take You

But here’s the thing. When you think about what’s truly happening to your user as you shift them through seduction to intimacy, ask yourself this: where do their eyes go?

After all, you’re moving your customer from a web/mobile lens into an IoT lens, and you’d expect there to be some differences. They have your app, you’ve geolocated them in the neighborhood of your store through GPS and geofencing…are they looking at you yet? Or are they looking at their screen? Since they’re not at your store yet, their only interaction with you is still abstracted to their screen.

Now, they get passed off to your Bluetooth beacons. You deliver them some content – and NOW where do they look?

If your customer is still looking at their screen then you’ve done something wrong.

Because one of the profound affordances of IoT isn’t that it makes what’s on our screens more relevant.

It’s that it makes us aware that the real world is the new value layer, the new platform, the new interface…it’s the place where all the new “stuff” facilitated by IoT happens.

A Renaissance of the Real

Because of IoT, the real world will now create more data than we’ll easily know what to do with, and it will create nodes on the web-based network of screens and interfaces and abstractions. But it will also democratize and reconnect us to the physical in new and profound ways.

For now, it might be enough to use IoT and beacons to make content more relevant and contextual, and to soldier on with our paradigms of screens and abstractions.

But if you want to dream, don’t dream of screens. Dream instead of how your beacons and your apps can help your users to look up again, to touch and see and feel, to reach out and run their fingers along the hem of a blouse or the mottled skin of an organic orange.

Reality is now connected. We can put a bunch of screens in front of it and look at it through polished glass. Or we can recognize that IoT can make technology more human, more visceral, more emotional and can treat us all not just as sources of data but as beacons on the path to a more tangible, intimate and physical future.

Be the Beacon!
