Think back on your goals from a moment ago. Now try to
imagine what your technologies’ goals are for you. What do you think
they are? I don’t mean the companies’ mission statements and
high-flying marketing messages – I mean the goals on the dashboards in
their product design meetings, the metrics they’re using to define
what success means for your life. How likely do you think it is that
they reflect the goals you have for yourself?
Not very likely, sorry to say. Instead of your goals, success from
their perspective is usually defined in the form of low-level
“engagement” goals, as they’re often called. These include things like
maximizing the amount of time you spend with their product, keeping
you clicking or tapping or scrolling as much as possible, or showing
you as many pages or ads as they can. A peculiar quirk of the
technology industry is its ability to drain words of their deeper
meanings; “engagement” is one such word.
But these “engagement” goals are petty, subhuman goals. No
person has these goals for themselves. No one wakes up in the
morning and asks, “How much time can I possibly spend using social media
today?” (If there is someone like that, I’d love to meet them and
understand their mind.)
This seems to me a really big deal, and one that
nobody talks about nearly enough. We trust these technologies to be
companion systems for our lives: we trust them to help us do the
things we want to do, to become the people we want to be.
In a sense, our information technologies ought to be GPSes
for our lives. (Sure, there are times when we don’t know exactly where
we want to go in life. But in those cases, technology’s job is to help us
figure out what our destination is, and to do so in the way we want to
figure it out.) But imagine if your actual GPS were adversarial toward
you in this way. Imagine that you’ve just purchased a new one,
installed it in your car, and on the first use it guides you efficiently to
the right place. On the second trip, however, it takes you to an address
several streets away from your intended destination. It’s probably just a
random glitch, you think, or maybe it needs a map update. So you give
it little thought. But on the third trip, you’re shocked when you find
yourself miles away from your desired endpoint, which is now on the
opposite side of town. These errors continue to mount, and they
frustrate you so much that you give up and decide to return home. But then,
when you enter your home address, the system gives you a route that
would have you drive for hours and end up in a totally different city.
Any reasonable person would consider this GPS faulty and
return it to the store, if not chuck it out their car window. Who would
continue to put up with a device they knew would take them
somewhere other than where they wanted to go? What reasons could
anyone possibly have for continuing to tolerate such a thing?
No one would put up with this sort of distraction from a
technology that directs them through physical space. Yet we do precisely
this, on a daily basis, when it comes to the technologies that direct us
through informational space. We have a curiously high tolerance for
poor navigability when it comes to the GPSes for our lives – the
information and communication systems that now direct so much
of our thought and action.
When I looked around the technology industry, I began to see with
new eyes the dashboards, the metrics, and the goals that were driving
much of its design. These were the destinations we were entering into
the GPSes guiding the lives of millions of human beings. I tried
imagining my life reflected in the primary-color numbers incrementing
on screens around me: Number of Views, Time on Site, Number of
Clicks, Total Conversions. Suddenly, these goals seemed petty and
perverse. They were not my goals – or anyone else’s.
I soon came to understand that the cause into which I’d been
conscripted wasn’t the organization of information at all, but of
attention. The technology industry wasn’t designing products; it
was designing users.
But if all of society were to become as distracted in this new,
deep way as I was starting to feel, what would that mean? What would
be the implications for our shared interests, our common purposes,
our collective identities, our politics?