What catches your attention?
The business of answering that question attracts hundreds of billions of dollars every year. As long as there have been things to buy, there’s been a market for human attention.
Long ago, capitalizing on human attention consisted of little more than the call of a street vendor over the din of a crowded village market.
Much later, the first one cent copies of The Sun hit the streets of New York, inspired by the realization of its editor that it was much more valuable to sell each reader’s attention to advertisers than to make money off newspaper sales directly.
Today, that concept has been taken to an extreme. Thousands of algorithms on millions of servers auction off your every click and tap, anticipating which emails you’ll open, which search results you’ll read, even how your eye might dart around the page.
Google and Facebook rely almost exclusively on directly reselling human attention. Machines are starting to help optimize email subject lines and article titles based on what might catch your eye. The playbook is simple: attract human attention with cheap or free stuff — cheap newspapers, Google search, interesting reading material — and optionally resell that attention to the highest bidder.
Where are we headed? In the face of this transformation, what can we expect? Answering these questions is hard. To go any further, it’s important to understand how attention works.
How Attention Works
Your attention is like a spotlight cast on a stage. You’re the spotlight operator. You can point the spotlight at specific things on stage, but you can’t control what actually appears on the stage.
The stage is your awareness, and it contains the sum total of information accessible to your mind at this moment. That includes the words on your screen, the sensation of pressure from your chair, and background noises in your environment, as well as the never-ending stream of random thoughts that pop up in your head.
As you read this, you are volitionally casting your spotlight on the words you’re reading. You’ll keep this up for a little while, but inevitably, the spotlight will move without your permission, attracted by an unexpected noise behind you, or someone walking into your field of view, or a stray thought about what you want for lunch.
This is the nature of attention. It darts around, scanning continuously for what’s interesting. This was an invaluable benefit in our ancestral environment. It was rare we might need to focus on one thing for more than a few moments at a time, but essential not to miss that snarling predator lurking in the bush.
As a result, if you try really hard to pay attention to only one thing, you’ll quickly find your attention elsewhere. In fact, usually, your brain decides to change the subject of your attention without your conscious input, much less your permission. You might have already drifted off into a different thought a few times as you read this. Your brain expects a little hit of feel-good neurotransmitter every time your attention jumps to something interesting. Novelty feels good.
This is precisely what makes it hard to reliably capture people’s attention. Generally, people themselves don’t understand what catches their attention or why. Most shifts in attention are unconscious, so it’s impossible for people to articulate why their attention does what it does. They only notice what it does after it has happened. Certain colors of call-to-action buttons work better on a landing page than others not due to any conscious decision by anyone, but due to the unconscious preferences of billions of brains.
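The button-color effect above is exactly the kind of unconscious preference that only shows up in aggregate data. As a minimal sketch (with invented click counts), here is how a difference in click rates between two button variants would be judged significant with a standard two-proportion z-test:

```python
# Hypothetical A/B test of two call-to-action button colors.
# The click counts are invented for illustration.
from math import sqrt, erf

def two_proportion_z(clicks_a, views_a, clicks_b, views_b):
    """Two-proportion z-test: is variant B's click rate different from A's?"""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Invented results: red button (A) vs. green button (B).
z, p = two_proportion_z(clicks_a=120, views_a=4000, clicks_b=165, views_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

Nobody involved consciously chose green; the preference emerges only in the statistics.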
Certainly, there are some things that work very well on all of us: bright colors, flashing objects, and attractive scantily clad people are all widely used to great effect. For a new parent, there’s nothing better than an iPad to quell a tantrum from a cranky toddler, and that’s because looking at colorful moving images feels good.
But beyond the obvious, the target of your attention is largely determined by neural mechanisms cultivated over decades of interacting with the world and anticipating the reward from different stimuli. Your brain constantly moves the spotlight of attention around, on the lookout for potential sources of pleasure and pain.
Attention in the Age of Artificial Intelligence
The best approach we’ve developed for understanding what captures people’s attention is empirical. We record as much as we can about what’s in their awareness — or what’s on stage. We then record where the spotlight is cast — a clicked link, an opened email. Then, we look for patterns.
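As a toy version of that empirical loop, the sketch below logs what was shown (the stage) and what was clicked (the spotlight), then surfaces a pattern. The event log and the `has_emoji` feature are invented for illustration:

```python
# Record awareness (what was shown) and attention (what was clicked),
# then look for patterns in the aggregate. Events are invented.
from collections import Counter

# Each event: (feature of the subject line shown, whether it was clicked).
events = [
    ("has_emoji", True), ("has_emoji", False), ("has_emoji", True),
    ("plain", False), ("plain", False), ("plain", True),
    ("has_emoji", True), ("plain", False),
]

shown = Counter(feature for feature, _ in events)
clicked = Counter(feature for feature, hit in events if hit)

for feature in shown:
    rate = clicked[feature] / shown[feature]
    print(f"{feature}: {rate:.0%} click rate")
```

Real systems do this over billions of events and thousands of features, but the shape of the computation is the same: log, aggregate, compare.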
Each of those components is going to evolve dramatically over the next few years. The environments where we spend our time increasingly facilitate data collection. Algorithms for working with language, audio, and video are rapidly becoming more sophisticated. Hardware and cloud service improvements are accelerating research and discovery in artificial intelligence. There are several implications:
1) We’ll have more data on both attention and awareness.
Eye-tracking has long been used in psychology, marketing, and consumer research, in both academia and business. It works great for studying cognitive development in infants and can even be used to A/B test their preferences.
Shops already use real-time facial expression APIs to track ad viewers’ age, gender, mood, and interest level. Google’s Project Soli is a miniature solid-state radar that can detect the movement of your hand and other objects near your phone. We appear comfortable with inviting Amazon Echo’s Orwellian always-on microphone into our homes.
How long before we see Amazon announce Prime Plus, requesting permission to occasionally activate your front-facing camera, Echo microphone, and motion tracker in exchange for free 30-minute drone delivery?
2) The arms race for attention will expand.
Attention is zero-sum: every click your competitor gains is one you lose. That intensifies competition, which is why your email inbox is a battleground of people vying for your attention. So is the results page for every Google search. This will be increasingly true of everywhere you spend your attention.
3) Screens will remain the primary conduit of human attention.
Screens are everywhere. Not only did our glossy paranormal hand rectangles become globally ubiquitous in just 10 years, they’ve fundamentally transformed how humans interact with the world. While technology often advances unpredictably, screens are likely to persist for a while. That’s because out of the five senses you have — the five ways of putting information into awareness — vision has the highest throughput to the brain. We are multiple breakthroughs away from anything faster.
4) Humans will spend huge amounts of time in virtual worlds.
The $100 billion video game industry continues to boom. Games will become dramatically more immersive as virtual and augmented reality go mainstream. People will routinely spend time in deep and engaging virtual environments with limitless content to explore and hundreds of millions of other real and simulated people to interact with. That could transform how we spend our leisure time, how we learn, and how we meet other people.
Content in the Age of Artificial Intelligence
Think of “content” as all things that attract human attention that can be represented as data. That includes almost anything online that humans make, from blog posts and dance music to short stories, video game livestreams, entertaining social media posts, and more. The more quality content you can produce, the more attention you can scoop up, continuing to sate our limitless thirst for customization and novelty.
1) Generative algorithms for text, images, sound, and video will improve dramatically.
Machine vision, automatic speech recognition, and natural language processing have made tremendous advancements in the past five years. Algorithms can already generate extremely convincing content from scratch.
The next generation of photo and video editing tools will make it trivial to rewrite any record of reality, replacing pixels using algorithms that are aware of what they’re looking at.
Adobe claims to be working on a Photoshop for audio, making it easy to generate an audio clip of anybody’s voice saying anything at all.
Today, you can ask a neural network to hallucinate arbitrarily many images of bedrooms or cats or sailboats, most of which look real enough to fool people. Or you could have a neural network read a company’s website and generate a snippet of language to insert into an email.
Eventually, you might ask a machine to produce a fantasy novel. Say you theme it similar to Harry Potter … but with a Game of Thrones flair. And let’s maybe have the bad guy win this time.
This is a very long way off, past multiple breakthroughs in semantics and discourse, but current techniques can already generalize well enough to spit out a cohesive and useful paragraph of text.
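Neural models are vastly more capable, but even a tiny first-order Markov chain — a far older and simpler technique than anything discussed above, with a corpus invented for illustration — captures the core generative idea: learn which words tend to follow which, then sample.

```python
# A minimal generative text model: a first-order Markov chain.
# Learn word-to-word transitions from a (tiny, invented) corpus, then sample.
import random
from collections import defaultdict

corpus = ("the spotlight of attention moves on its own and "
          "the spotlight of attention darts toward novelty").split()

# Transition table: word -> list of words observed to follow it.
transitions = defaultdict(list)
for current, nxt in zip(corpus, corpus[1:]):
    transitions[current].append(nxt)

def generate(seed, length, rng):
    """Walk the chain from `seed`, sampling a successor at each step."""
    words = [seed]
    for _ in range(length - 1):
        options = transitions.get(words[-1])
        if not options:
            break  # dead end: no observed successor
        words.append(rng.choice(options))
    return " ".join(words)

print(generate("the", 8, random.Random(0)))
```

The output is locally plausible but has no semantics or discourse structure at all, which is precisely the gap the breakthroughs mentioned above would have to close.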
2) Machines will help us produce content.
Machines will play a much bigger role in helping us produce the content that captures human attention. We’ll see a proliferation of collaborative agents in products that assist us in our workflows. Machines will suggest assets to include in the content you’re making, or which parts of it to keep. Executive control will remain with creators, but ideation and production will become increasingly automated. Think Clippy the Microsoft Office Assistant, but with a much bigger brain.
3) To cut through greater noise, humans will keep innovating.
Demonstrating that content was created by a human will become much harder. You probably can’t imagine this article having been written by a machine, but one day, that won’t seem so ridiculous.
Machines are cheap, so as they contribute more to creating content, the places where we consume it will be flooded. Early adopters of those techniques will benefit, but late adopters will find that, to stand out, they’ll have to produce content demonstrably beyond machines’ capabilities just to keep attracting interest.
4) Machines will help us allocate our attention.
Work will become increasingly symbiotic. You’ll spend more of your time deciding among things and less collecting and preparing things. Machines will find relevant documents and emails, do Google searches in the background, and perform other functions that can be defined as a semi-structured set of tasks. As the deluge of content on our screens grows, tools will emerge to stem the flood.
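As a hypothetical sketch of what “finding relevant documents” could look like at its very simplest, the snippet below ranks a few invented documents by term overlap with a query. Real assistants use far richer relevance models; this only illustrates the shape of the task:

```python
# Rank documents by the fraction of query terms they contain.
# Documents and query are invented for illustration.
def score(query, document):
    """Fraction of query terms that appear in the document."""
    q_terms = set(query.lower().split())
    d_terms = set(document.lower().split())
    return len(q_terms & d_terms) / len(q_terms)

documents = [
    "quarterly attention metrics report",
    "lunch menu for friday",
    "notes on attention and awareness research",
]
query = "attention research report"

# Highest-scoring documents first; the machine surfaces, the human decides.
ranked = sorted(documents, key=lambda d: score(query, d), reverse=True)
print(ranked[0])
```

The division of labor is the point: the machine does the collecting and preparing, and the human spends attention only on the final decision.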
Attention is an essential currency in the global transaction ecosystem. Understanding it is critical for anybody in sales and marketing. Despite the fact that attention is zero-sum in any given transaction, it’s important to remember that the pie is growing dramatically.
Leisure time has grown by seven hours per week since the 1960s, and we will unlock much more free time as we shift toward self-driving vehicles. Economists from the National Bureau of Economic Research published a paper suggesting that high-quality video games are contributing to an increase in unemployment among young men.
Uber, Upwork, and Crowdflower support the emergence of a global market for part-time, on-demand work at a variety of price points. Y Combinator and Elon Musk are calling for a universal basic income plan.
To connect these dots, it’s not hard to imagine a future in which wealthy corporations and governments support a basic income, and in return, people spend their time and attention generating training data and validating models. These would generally be simple tasks, easily performed on a phone, involving only skills or data that machines don’t have yet.
Data on human attention exposes the unconscious information locked away in our minds. That information is valuable and important, because in the aggregate, it is an encoding of everything humans want — not just of our buying preferences and creature comforts, but also of our ethics and values as a species. We want machines to understand us, and monitoring human attention may be a good way to collect the necessary data.
With the curtain pulled back on how powerless we are to control our attention and how valuable it is to everyone, perhaps we’ll all find ourselves being a bit more careful with how we spend our attention.