Intention Attention: Relationships, Voices and Tokens


PART ONE: MOTHER ALEXA

When you were very small – before you could even crawl – you could only bawl any time you had a need. Hungry, cold, tired – every disturbance led to the same outcome.

As you grew older you could reach for the things you wanted, and soon you learned to ask for them. You’d say a few words, paired with the ‘magic word’ – “please?” – and all sorts of things would happen, at your bidding. Maybe not every single thing, but enough of them, consistently enough, to meet your needs.

You can’t always get what you want.

We have mixed memories of this period of our lives, because while we ourselves are helpless, help is always at hand. Just say the word – well, scream it – and mommy or daddy come a-running, ready to do whatever it takes.

It’s nice to be waited on hand-and-foot, even if, at the time, we don’t understand how nice it is – simply because we’ve never known another way.

We get tossed out of that Eden eventually. In response to our earnest requests, we’re told, “Do it yourself,” or “Later,” or – most frighteningly – “No.”

That first denial leaves a mark, winding a spring that propels us into the world. We grow determined to meet our own needs. We dream of a world where we only ever hear “Yes!” to our wants and needs.

Our fantasies always overpromise what reality can deliver. That chasm between our dreams and our capacities – like a spark gap – hums with potential, and we use that energy to bend reality toward our desires. Our dissatisfaction powers our passage through the world.

It’s good to be a bit uncomfortable. The Stoics knew this. Discomfort moves us on. But we always secretly long for that earlier world where we had all our needs met. It’s an infantile wish – and a universal one. Freud knew it. Edward Bernays knew it. Herbert Marcuse knew it.

And Jeff Bezos knows it.

A lot of folks – futurists particularly – are prone to make grand statements about Amazon. What it is. What it means.

Far be it from me to depart from this noble tradition.

The twenty-four years of Amazon’s corporate history can be read as a headlong assault on a single goal: providing for every need.

Bezos started with books, moved on to music and movies, then into electronics, then basics, then groceries. Pretty much anything sold by anyone anywhere can be ordered through and delivered by Amazon – at least in the United States.

Until recently, you had to go through a fair bit of work to order anything from Amazon. You had to find your heart’s desire among the millions of items on offer, then – in a clear indication of where this was always going – use the patented “one-click” checkout mechanism, which efficiently deals with both billing and delivery particularities.

It’s neat, it’s quick – and it’s not perfect. It still feels like work.

Three years ago, Alexa changed all of that.

To most of us, Alexa is little more than an Internet-connected microphone and speaker. In itself that’s not terribly innovative. But that device isn’t Alexa – it’s simply the interface to Alexa. Alexa itself is something else entirely.

Before we talk about what Alexa is, it’s a good idea to pop up a bit and take a look at the landscape from far above, so we can see Alexa in context. Alexa is part of a much broader trend toward embedding artificial intelligence pervasively throughout the world.

I use the phrase ‘artificial intelligence’ almost like it’s a magic spell – it tells you something’s in play that seems almost impossibly complex, shrouded in mystery, the realm of boffins and a scientific priesthood.

That’s bullshit.

The dark truth of artificial intelligence is that it is simply a way for computers to learn from their mistakes. No more and no less. There is no magic. Just endless repetition.

Let me give you an example.

You might have heard that back in May, a computer program became the best player of Go in history.

Go is a 2,500-year-old Chinese board game, broadly believed to be the most complex and subtle of all board games – so important in ancient China that knowledge of Go was considered one of the ‘four essential arts’ of the cultivated person.

Go is not like that most famous of board games, chess. A computer can simulate all possible moves on a chessboard thousands of times faster than a human.

The number of possible positions in a game of Go is astronomically large – roughly a 1 followed by 170 zeroes. No computer can brute-force that kind of calculation. Instead, to succeed at Go, you have to learn how to play.

The only way you can learn how to play Go is by playing Go.

So researchers in the UK taught a computer program to play Go, but – and here’s the important bit – they also gave it the capacity to learn from its mistakes.

What this meant in practice is that the computer program studied the game board after every move it made. It learned which moves left it weakened, and which moved it toward victory.

Not that there was a lot of victory at the beginning.

Even a very bad human opponent could beat this computer program – named AlphaGo.

No human could expect to win their first games of Go. AlphaGo didn’t either. But it learned from every loss. Every bad move made it better, feeding into a continuing stream of data used to improve its performance.

In every game it did just a bit better than in the one before. Several thousand games later, AlphaGo could defeat a novice human player.

At this point, AlphaGo’s creators upped the pressure, matching the program against more expert players. AlphaGo lost more matches – but learned from better players.

Simultaneously, these researchers did something quite sensible – they got AlphaGo to play matches against itself.

In addition to the thousands of games it played against increasingly proficient Go players, it now played tens of thousands of matches against itself.

That’s when AlphaGo started to get very good.

Early in 2016 it beat a top-level grandmaster of Go.

Then, last May, AlphaGo played against Ke Jie, ranked as the #1 Go player in the world.

AlphaGo wiped him out utterly, winning all three games.

At a post-tournament press conference, Ke Jie marvelled at an AlphaGo that “played like a god”.

AlphaGo got to be godlike by making billions of mistakes. Every one of those mistakes made AlphaGo smarter.

We’re fond of saying that we learn more from our failures than from our successes. Take that, and multiply it by the absolute focus and indefatigability of a computer program, and that’s modern artificial intelligence. That’s the whole of it.
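The loop AlphaGo runs is vastly more sophisticated than anything we could show here, but its skeleton – try, fail, update, try again – can be sketched in a few lines. This is a toy illustration of learning from mistakes, not DeepMind’s method: a program that works out which of three moves wins a trivial game purely by tracking the outcomes of its own trials.

```python
import random

# A trivial "game": three possible moves, each with a hidden win probability.
# The program never sees these numbers; it can only play and observe outcomes.
WIN_PROBABILITY = {"a": 0.2, "b": 0.5, "c": 0.8}

def play(move):
    """Play one game with the given move; return True on a win."""
    return random.random() < WIN_PROBABILITY[move]

def estimate(wins, plays, move):
    """Current estimated strength of a move, from its own track record."""
    return wins[move] / plays[move] if plays[move] else 0.0

def learn(games=5000, exploration=0.1):
    """Learn from mistakes: after every game, update the record of the move
    just played, then favour whichever move has the best record so far."""
    wins = {m: 0 for m in WIN_PROBABILITY}
    plays = {m: 0 for m in WIN_PROBABILITY}
    for _ in range(games):
        if random.random() < exploration:          # occasionally try anything
            move = random.choice(list(WIN_PROBABILITY))
        else:                                      # otherwise play the best so far
            move = max(plays, key=lambda m: estimate(wins, plays, m))
        plays[move] += 1
        if play(move):
            wins[move] += 1                        # this move worked – weight it up
    return max(plays, key=lambda m: estimate(wins, plays, m))
```

After a few thousand games the program reliably settles on the strongest move – not because anyone told it, but because every loss nudged its estimates. AlphaGo replaces the lookup table with a deep neural network and adds self-play, but the feedback loop is the same shape.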

And so back to Alexa. When it was introduced, Alexa wasn’t very bright. It couldn’t recognise many speech patterns or accents. Alexa made a lot of mistakes – millions of them – and learned from every one, becoming better each time. Every day, people say over a billion words to Alexa. That’s a lot of mistakes, and a lot of learning. So by now, at the beginning of 2018, Alexa can interpret much of what’s said to it in English.

Ditto Google Assistant, and ditto ditto Siri. They all have so many people saying so many things that they make millions of mistakes – and they learn from every one, getting steadily better at interpreting speech.

But speech does not a mind make.

We are all familiar with that digital disappointment when we make a request to Alexa or Google Assistant or Siri, and listen as it fails spectacularly. Alexa may be adept at interpreting language, but that, as it turns out, was the easy bit. The hard part is turning our weird commands into some task Alexa knows how to perform.

On the simple side of things, let me share a story from last year’s Agency Leader’s Symposium up in the Hunter Valley. I was invited to make a short presentation on behalf of Southern Cross Austereo about audio, podcasting and the future. SCA had just launched their PodcastOne Australia network, and this event was designed to help the agency folks plan for a future where they purchased advertising on those podcasts.

As a thank you gift for attending the session, every person in the room was sent away with their own Google Home – quite a nice way of keeping SCA front of mind. After the event, I asked the woman coordinating the event, “Do you know if Google Home can play our podcasts?”

She hadn’t checked.

As soon as I could, I asked Google Assistant “OK, Google, play the podcast The Next Billion Seconds.”

After a pause, “I’m sorry, I’m afraid I can’t do that.”

Ugh.

I phoned a friend who had recently bought a Google Home, and asked him to give it a try. Same answer.

So we’d gifted all of these agency folks an expensive toy that couldn’t, under any circumstances, demonstrate the strength of the SCA brand or my content.

Before we can talk about the future, we have to begin with the ways in which we haven’t even caught up to the world of today. These voice assistants are the hottest selling devices since the smartphone. Google and Amazon sold tens of millions over the holidays, and Apple’s HomePod sold out the moment it went on sale. People want these devices. But they’re going to get annoyed – or worse, bored – when they don’t do what they want them to.  Worse yet, brands are going to panic when they realise they’ve become invisible.

Voice is the new medium, but producing for that medium isn’t simply a matter of audio production – nor is it as simple as search engine optimisation. It involves the study of user journeys. How do people ask for things? How do they find them? What is the relationship of the brand to the process? How can that relationship feel entirely natural?

This is the state of the art in 2018, and I’m sure you’re working on this for your clients. But right now we’re in a weird moment when millions of these devices have nothing to say about your clients because no one has connected the dots.

If there’s going to be any future for brands, that’s going to have to be the first stop.

But it’s only the first step, and it really only opens the door to resources that can be delivered digitally. You’ll soon have Alexa and Google Assistant and Siri telling the brand story – but what happens then?

This is where the strategic differences between Google, Amazon and Apple grow into chasms a mile wide. Google has every bit of data that can be harvested from the digital world, and runs the most sophisticated commercial artificial intelligence program to help connect people to that data. The search box on Google is actually an interface to an incredibly sophisticated and complicated artificial intelligence system that Google has been hard at work on for over a decade. It’s the reason those search results are so nearly perfect, almost all the time.

But Google is like a cloud. It may be everywhere, but it’s tangible nowhere. People use Google, yet they never really touch Google. For us, Google is more hallucination than reality. It lives in our imagination.

Amazon, on the other hand, has always been grounded in the real, delivering an ever-growing array of physical goods directly to customers. And it’s been doing that with increasing rapidity, getting packages to most customers within two days – and slowly rolling out two-hour delivery within major urban areas. The Amazon drone started as a bad joke and now increasingly looks like the immediate future.

All of that effort has one design intention in mind – to remove the gap between a voice command issued to Alexa, and the physical manifestation of that command. Within the next twenty-four months we can expect that in certain parts of the world, Alexa will be able to fulfil almost any material desire almost immediately.

That’s the Amazon master plan – at least in the short term – and because of the enormous digital infrastructure, supply chains and logistics that Amazon has amassed, they’re the only corporation capable of fulfilling that promise. A promise that takes us back to the beginning of this talk – a time we dimly remember, when anything we wanted could be had just by the asking.

Of course, at that point we were infants. Now we’re adults. So we’re heading to an interesting moment, when infantile desire will be realised within an adult’s cognition.

Freud would have a field day.

As never before, a brand can be an enveloping presence, constantly front of mind in the most natural of relationships – serving the needs of the customer.

We need to have a good think about what that means. It’s no longer, “Alexa, what kind of car should I buy?” It’s now, “Alexa, how is my car doing?” Alexa is going to need to be able to give a substantial answer to that question.

That sounds like a huge technological effort, but I doubt that’s the case. In reality, the car only needs to share what it knows about itself with Alexa in a way that makes sense to the customer. Does it need petrol or wiper fluid or brake pads or a check-up? Servicing by the local dealership can be booked by voice command in that moment – and should be.
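To see how small that effort could be, here’s a hypothetical sketch – every telemetry field, threshold and function name below is invented for illustration, not drawn from any real automotive or Alexa API – of the logic that turns what a car knows about itself into an answer a voice assistant could speak:

```python
def car_status_reply(telemetry):
    """Turn raw car telemetry into a spoken-style answer, flagging anything
    that needs attention. All field names and thresholds are illustrative."""
    concerns = []
    if telemetry["fuel_pct"] < 15:
        concerns.append("you're low on petrol")
    if telemetry["wiper_fluid_pct"] < 10:
        concerns.append("the wiper fluid needs topping up")
    if telemetry["brake_pad_mm"] < 3.0:
        concerns.append("the brake pads are due for replacement")
    if telemetry["km_since_service"] > 10000:
        concerns.append("you're overdue for a service")
    if not concerns:
        return "Your car is doing fine - nothing needs attention."
    # A real assistant would offer to book the dealership at this point.
    return "Your car needs a little care: " + ", and ".join(concerns) + "."

print(car_status_reply({
    "fuel_pct": 40, "wiper_fluid_pct": 5,
    "brake_pad_mm": 6.0, "km_since_service": 12000,
}))
# Your car needs a little care: the wiper fluid needs topping up,
# and you're overdue for a service.
```

The heavy lifting – speech recognition, intent matching – already lives in the assistant. What the brand supplies is this last, almost trivial layer: its product’s state, expressed in the customer’s terms.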

This relationship, handled properly, leads to a very comfortable confusion where it’s neither the customer leading the brand nor the brand leading the customer. Instead, both are working in partnership – a partnership that’s given agency by the customer, capacity by the brand, and voice by Alexa.

Now apply that to every product and service everywhere. Talking to Alexa about our car may be a beginning, but it will soon come to everything. The ‘internet of things’ is less about everything being connected than about everything having a voice.

Managing those voices as they crowd around a person or a family or a business is going to be one of the fundamental design challenges of the middle of the 21st century. Too much and people will feel claustrophobic. Too little and they’ll feel ignored. And the balance between those two will be different from person to person – and, for a given person, from day to day.

All of that calls for a certain sensitivity and awareness – knowing when to grow closer, knowing when to step back.

If that sounds distinctly non-digital, something better left to humans than machines, it’s time to come up to speed on another story, which is a bit of a lesson on how this doesn’t work.

PART TWO: TOO CLOSE FOR COMFORT

On the first of May last year, The Australian broke a bit of very disturbing news.  

A 23-page Facebook document seen by The Australian, marked ‘Confidential: Internal only’ and dated 2017, outlines in pinpoint detail how the social network can target ‘moments when young people need a confidence boost’:

‘Facebook is using sophisticated algorithms to identify and exploit Australians as young as 14, by allowing advertisers to target them at their most vulnerable, including when they feel “worthless” and “insecure”, secret internal documents reveal.’

By monitoring posts, pictures, interactions and internet activity in real time, Facebook can detect the moments young people feel ‘stressed’, ‘defeated’, ‘overwhelmed’, ‘anxious’, ‘nervous’, ‘stupid’, ‘silly’, ‘useless’ and a ‘failure’.

Facebook spent the last six years building an apparatus that ties profiling into artificial intelligence capabilities. Facebook users are watched every moment they use the platform, with each action carefully recorded. Facebook actively ‘curates’ each user’s newsfeed, watching for the consequences of those curation choices. Does the user engage with the content, or do they disengage?

That’s the mechanism by which Facebook drove user engagement to a staggering fifty minutes a day for its billion and a half daily visitors. It profiles them, and conducts a continuous series of content experiments, feeding the results of those experiments back into the profile. Within a short span of time, the newsfeed becomes irresistible.

Each profile is a simulation of the emotional state of the user – so Facebook can read a user’s emotional state as easily as it could read the numbers off a dial. That’s the kind of depth of knowledge Facebook have been able to build for themselves, then offer to brands.

All of that has had some knock-on effects.

The broader cultural awareness of ‘fake news’ has its roots in part in the fact that Facebook’s newsfeed tends to reinforce our beliefs and prejudices – leaving us open to those who would manipulate us by playing on them. The fractured nature of civil discourse is an inevitable outcome of a tool that allows people to indulge in alternative facts.

More concerning for Facebook’s business model is a migration away from the platform by people under 25. Uni students, Facebook’s founding market, have deserted the platform for Snap and other apps that keep them connected but give them a feeling of intimacy and control that Facebook conspicuously lacks.

Those kids will keep their profile going so they can have a chat with their grandparents – over 55s are now the fastest-growing segment of Facebook’s audience – but they’ll only rarely use it, and they’ll never really trust it.

This weekend the BBC published an article titled “Eight Reasons Facebook Has Peaked”. Even last year such a headline would have been unthinkable. How things have changed.

The new technologies of ability, activity and intimacy – driven by voice interfaces like Alexa, but spreading through the entire world – walk a fine line between privacy and stupidity. Reveal too much and be shut out. Hide too much, and be rejected as worthless.

We know how much data is on offer for each of us. We know we can be profiled and exquisitely well known. The art in this is finding the sweet spot, where that intimacy feels comfortable in the long term.

Facebook fumbled this balance, possibly fatally. Google manages to be invisible – most of the time. Google knows more about us than Facebook, for it has known us longer. Standing back means Google is useful, but not really loved. That’s safer but creates a space for an approach that’s both riskier — and more flexible.

Can we build brand relationships that modulate their relation to us? That can respond the way a friend does – with sensitivity?

It’s almost the polar opposite of the Facebook approach, which weaponised profile data to attack and undermine its users at their most vulnerable moments. Learning from that mistake, we can imagine another way forward, where the brand grows to become a trusted friend – one who knows when to come in close, and when to back off.

Technology isn’t an issue. Intent matters. Relationships are living things. Feed them well and they grow into beautiful things. Abuse them and they shrivel and die.

That seems obvious – until you realise that the folks running these relationships grew up in an industry that never had to learn it. Google and Facebook and Microsoft have never had to manage human relationships.

Their blind spot creates an opportunity to reimagine our relationships to brands along human lines. Technology remains important, both as facilitator and monitor – it can connect, and it can let us know when to move in or away. Everything else is up to us – beginning with our intention. Do we want a relationship of equals? Or do we end up, like Facebook, strip-mining our customers?

PART THREE: TOKEN EFFORTS

All of the preceding places the emphasis in this relationship on the brand – and, by extension, you folks.

But the future isn’t going to be anywhere near that neat. There’s another technology coming down the pike that’s reversing all of this, putting the public in the driver’s seat.

For the last twenty-four years, the Web’s advertising model has been fairly consistent.

[ ORGANIC STORY ]

The whole business of paying for impressions on a Webpage has remained remarkably consistent.

But because a Website visitor leaves a trail of data breadcrumbs, website analytics soon became a thing, and grew into an entire field itself, offering both website operators and web advertisers a detailed view of their audiences and their spend.

That seemed like a good idea at the time. But what started as insights quickly turned into an addictive acceleration into the deep profiling that’s led Facebook to its dead-end.

Facebook is pretty much the final word in analytics – knowing their users so well their profiles accurately monitor their emotional states. You can’t really hope for more than that.

So the Web advertising model as we’ve always understood it has played itself out utterly. There’s nowhere left to go. You can double down – that’s certainly Mark Zuckerberg’s plan – but there’s no reason to believe the results will be any different.

Moving forward requires a different approach.

For the last two hundred years, the relationship between advertiser and audience has always been mediated by a publisher – newspapers and magazines, broadcasters and websites.  The advertiser pays for impressions promised by the publisher, and the audience gets the content at a reduced rate – on the Web, it’s most often free to the audience.

But like so much else in the digital world, that connection between advertiser and audience is being disrupted. Disintermediated. Advertisers can now pay audiences directly for their attention.

While that’s always been theoretically possible (and is locally true when you consider focus groups), it has never been possible to realise that business model at scale. It’s simply too expensive to think about paying audiences for their attention. The advertiser has always had to rely on audience aggregation.

Brendan Eich – the very bright fellow who gave the world JavaScript back in 1995, creating the modern, programmatic web – stepped forward last year with a very different kind of offering, one that he hoped could fix the bind we’ve worked our way into: an unhealthy arrangement where advertisers continually pump publishers for ever more detailed user profile data.

It’s called the ‘Basic Attention Token’.

In the flurry of excitement about cryptocurrencies like Bitcoin and Ethereum, it’s not widely understood that cryptocurrencies do two things that have always been very hard – they create digital objects that are hard to forge yet easy to verify. Those are the qualities you need to make digital money like Bitcoin, to issue digital coupons, to sell tickets to an event, and so forth. All of that has become easy and cheap – so easy that people have started to spin up all sorts of ideas for money designed for specific purposes.
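The ‘hard to forge, easy to verify’ property can be demonstrated with nothing more than a hash function. This toy proof-of-work – a drastically simplified illustration of the mechanism Bitcoin relies on, not its actual implementation, and no part of the Basic Attention Token itself – makes producing a valid token cost thousands of hash attempts, while checking one costs exactly one:

```python
import hashlib

DIFFICULTY = "0000"  # required hash prefix; more zeros = harder to forge

def mint(message):
    """Forge a valid token the hard way: try nonce after nonce until the
    hash of message + nonce starts with the required prefix."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{message}:{nonce}".encode()).hexdigest()
        if digest.startswith(DIFFICULTY):
            return nonce  # typically takes tens of thousands of attempts
        nonce += 1

def verify(message, nonce):
    """Verifying costs a single hash, no matter how hard minting was."""
    digest = hashlib.sha256(f"{message}:{nonce}".encode()).hexdigest()
    return digest.startswith(DIFFICULTY)

nonce = mint("1 attention token for Alice")
print(verify("1 attention token for Alice", nonce))  # True
print(verify("1 million tokens for Alice", nonce))   # forgery: almost certainly False
```

Real cryptocurrencies add digital signatures and a shared ledger on top, but this asymmetry – expensive to create, cheap to check – is the foundation that makes a purely digital token trustworthy.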

In Brendan Eich’s case, he developed a new kind of digital money – a token – designed to pay for the attention of a Web user. Rather than paying Fairfax or SEVEN or Bauer Media for a thousand impressions, the advertiser establishes a direct relationship with audience members.

Ok, so that still sounds like a lot of work. Why would you do this when you can get a thousand impressions from an aggregator?

Because this is now about relationship. This turning point is less about who gets paid for what than it is about closing the gap between brand and audience.

And this is an emerging point of opportunity for any firm that wants to help brands build those relationships.

There’s work here, surely, but that work endures. That’s indicative of this shift to relationships, a shift we see coming from the other direction, as a brand uses its new voice to meet customer need.

This is the other way to connect with customers: honouring the relationship by paying them for their attention, changing the way the audience thinks about and relates to those brands.

It’s easy to imagine brands working to recognise which folks in the audience they should be lavishing their attentions upon – and it’s easy to see that they’d pay for it.

How do they know? The audience itself tells them. Rather than being profiled by others, the audience reveals itself, and its value to a brand can be gauged by the depth of information offered up – on both sides. Some folks won’t want to work with brands who aren’t transparent about this — other folks simply won’t care.

But in both cases the power comes to the people, for the first time.

In the system envisaged by Eich – and already substantially in place – advertisers can pay for audience attention in these Basic Attention Tokens.

The audience can then spend these Basic Attention Tokens consuming any content of their choice.

The Basic Attention Token creates a virtuous cycle, where a strong brand relationship is reinforced by a meaningful content experience. That’s always been a goal, but it’s always been difficult to guarantee. When the audience starts driving, that sort of thing happens all the time as a matter of course.

All of this is already in place: the Basic Attention Token can be bought for about seventy-five cents apiece, and the Brave Browser can seamlessly compensate users for their attention while allowing content providers on sites such as YouTube to be compensated for their content.

It’s a beginning. It’s not very big right now. But then, neither was the Web back when Brian Behlendorf co-founded Organic Online.

And, as was true then, it’s an idea whose time has come.

The question for you is where do you want to place yourselves and your clients in the new brand relationships that will frame the middle years of this century? Because the future belongs to those relationships.

Technology has given us the capacity to scale our intimacy. Now it’s up to us to make that intimacy polite, sensitive and meaningful. The future belongs to those who get this – and the lion’s share belongs to those who get there first.


About the Author: mpesce