SMKR / Barcroft USA / Barcroft Media via Getty Images

A cadre of young technologists at Google, Twitter, and Facebook admit it: they didn’t think making smartphones addictive would make smartphones this addictive. Come to think of it, the negative consequences of the persuasive design they concocted in their twenties never really occurred to them.

Take Loren Brichter, the designer who created pull-to-refresh (the downward abracadabra swipe that prompts new app content to load). Brichter was 24 when he accidentally popularized this ubiquitous 2D gambling gesture. Of course, the analogy between pull-to-refresh and a slot machine is only clear to him now, in the hindsight bestowed by adulthood.

“Now 32, Brichter says he never intended the design to be addictive,” Paul Lewis reports in the Guardian’s latest special technology feature. Yet even the tech whiz behind the curtain has since fallen prey to some of his old design tricks. “I have two kids now,” Brichter confesses, “and I regret every minute that I’m not paying attention to them because my smartphone has sucked me in.”

As if these compulsions weren’t hollow enough, push notification technology rendered pull-to-refresh obsolete years ago. Apps can update content automatically, so user nudges like swiping and pulling aren’t just addictive; they’re redundant. According to Brichter, pull-to-refresh “could easily retire,” but now it’s become like the Door Close button in elevators that close automatically: “People just like to push it.”

So they do — over and over and over and over. In cases of addiction, people “just like to” touch their phones more than 2,617 times a day. As the opportunity costs of all that frittered attention really start to add up, Brichter and his peers find themselves fundamentally questioning their legacies:

“I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” [Brichter] says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.

“Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”

Lewis spotlights several designers who’ve come to similar ethical crossroads in their 30s, many of whom have quit posts at household-name technological juggernauts in the hopes of designing our way out of all this squandering.

If the attention economy is just a euphemism for the advertising economy, these techno-ethicists ask, can we intelligently design our way back to safeguarding our actual intentions? Can we take back the time we’ve lost to touchscreen-enabled compulsions and bend it to our will again? Or have we forgotten that human will and democracy, as one of Lewis’ “refuseniks” reminds us, are one and the same?

James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.

Williams, 35, left Google last year, and is on the cusp of completing a PhD at Oxford University exploring the ethics of persuasive design. It is a journey that has led him to question whether democracy can survive the new technological age.

He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”

That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.

If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?

“The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?

“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”

Read the story