Author Archives

Catherine Cusick
Catherine Cusick is the audience development editor of Longreads. She was previously a rep at the American Booksellers Association, as well as the social editor for IndieBound, a nationwide local-first movement.

An Interview with MacArthur ‘Genius’ Viet Thanh Nguyen

Guillaume Souvant / AFP / Getty Images

Catherine Cusick | Longreads | October 2017 | 9 minutes (2,200 words)

Viet Thanh Nguyen had just gotten back from a summer in Paris when he received an unexpected phone call from a Chicago number. He didn’t recognize the caller, so he let it ring. Out of curiosity, he texted back, “Who is this?”

The number replied, “It’s the MacArthur Foundation.”

“Oh,” Nguyen thought. “I should call these people back right away.”

Nguyen managed to stand for the first few seconds of the call, but soon had to sit down. He’d just won $625,000, no strings attached, as an unrestricted investment in his creative potential.

Eighteen months earlier, Nguyen had received another life-altering phone call when he won the 2016 Pulitzer Prize for Fiction for his debut novel, The Sympathizer. Since the book’s publication in April 2015, Nguyen’s been no stranger to worldwide recognition: He’s also received a Guggenheim fellowship, the Dayton Literary Peace Prize, the First Novel Prize from the Center for Fiction, the Carnegie Medal for Excellence in Fiction, and countless others.

According to the MacArthur Selection Committee, “Nguyen’s body of work not only offers insight into the experiences of refugees past and present, but also poses profound questions about how we might more accurately and conscientiously portray victims and adversaries of other wars.” After writing in obscurity for more than a decade to honor his and others’ war stories — and all refugee stories, Nguyen insists, are war stories — he will now have even more resources to help tilt the world in a more peaceful direction.

I spoke with Nguyen the day after the MacArthur Foundation announced him, along with 23 other extraordinary recipients, as a 2017 MacArthur Fellow. Read more…

Immature Architects Built the Attention Economy

SMKR / Barcroft USA / Barcroft Media via Getty Images

A cadre of young technologists at Google, Twitter, and Facebook admit it: they didn’t think making smartphones addictive would make smartphones this addictive. Come to think of it, the negative consequences of the persuasive design they concocted in their twenties never really occurred to them.

Take Loren Brichter, the designer who created pull-to-refresh (the downward abracadabra swipe that prompts new app content to load). Brichter was 24 when he accidentally popularized this ubiquitous 2D gambling gesture. Of course, analogies between pull-to-refresh and slot machines are only clear to him now, in the hindsight bestowed upon him by adulthood.

“Now 32, Brichter says he never intended the design to be addictive,” Paul Lewis reports in the Guardian‘s latest special technology feature. Yet even the tech whiz behind the curtain has since fallen prey to some of his old design tricks. “I have two kids now,” Brichter confesses, “and I regret every minute that I’m not paying attention to them because my smartphone has sucked me in.”

As if these compulsions weren’t hollow enough, push notification technology rendered pull-to-refresh obsolete years ago. Apps can update content automatically, so those user-nudging swipes and pulls aren’t just addictive, they’re redundant. According to Brichter, pull-to-refresh “could easily retire,” but it has instead become like the Door Close button in elevators that close automatically: “People just like to push it.”

So they do — over and over and over and over. In cases of addiction, people “just like to” touch their phones more than 2,617 times a day. As the opportunity costs of all that frittered attention really start to add up, Brichter and his peers find themselves fundamentally questioning their legacies:

“I’ve spent many hours and weeks and months and years thinking about whether anything I’ve done has made a net positive impact on society or humanity at all,” [Brichter] says. He has blocked certain websites, turned off push notifications, restricted his use of the Telegram app to message only with his wife and two close friends, and tried to wean himself off Twitter. “I still waste time on it,” he confesses, “just reading stupid news I already know about.” He charges his phone in the kitchen, plugging it in at 7pm and not touching it until the next morning.

“Smartphones are useful tools,” he says. “But they’re addictive. Pull-to-refresh is addictive. Twitter is addictive. These are not good things. When I was working on them, it was not something I was mature enough to think about. I’m not saying I’m mature now, but I’m a little bit more mature, and I regret the downsides.”

Lewis spotlights several designers who’ve come to similar ethical crossroads in their 30s, many of whom have quit posts at household-name technological juggernauts in the hopes of designing our way out of all this squandering.

If the attention economy is just a euphemism for the advertising economy, these techno-ethicists ask, can we intelligently design our way back to safeguarding our actual intentions? Can we take back the time we’ve lost to touchscreen-enabled compulsions and bend it to our will again? Or have we forgotten that human will and democracy, as one of Lewis’ “refuseniks” reminds us, are one and the same?

James Williams does not believe talk of dystopia is far-fetched. The ex-Google strategist who built the metrics system for the company’s global search advertising business, he has had a front-row view of an industry he describes as the “largest, most standardised and most centralised form of attentional control in human history”.

Williams, 35, left Google last year, and is on the cusp of completing a PhD at Oxford University exploring the ethics of persuasive design. It is a journey that has led him to question whether democracy can survive the new technological age.

He says his epiphany came a few years ago, when he noticed he was surrounded by technology that was inhibiting him from concentrating on the things he wanted to focus on. “It was that kind of individual, existential realisation: what’s going on?” he says. “Isn’t technology supposed to be doing the complete opposite of this?”

That discomfort was compounded during a moment at work, when he glanced at one of Google’s dashboards, a multicoloured display showing how much of people’s attention the company had commandeered for advertisers. “I realised: this is literally a million people that we’ve sort of nudged or persuaded to do this thing that they weren’t going to otherwise do,” he recalls.

If the attention economy erodes our ability to remember, to reason, to make decisions for ourselves – faculties that are essential to self-governance – what hope is there for democracy itself?

“The dynamics of the attention economy are structurally set up to undermine the human will,” he says. “If politics is an expression of our human will, on individual and collective levels, then the attention economy is directly undermining the assumptions that democracy rests on.” If Apple, Facebook, Google, Twitter, Instagram and Snapchat are gradually chipping away at our ability to control our own minds, could there come a point, I ask, at which democracy no longer functions?

“Will we be able to recognise it, if and when it happens?” Williams replies. “And if we can’t, then how do we know it hasn’t happened already?”

Read the story

Fine for the Whole Family

Lynne Gilbert / Getty Images

Was I a picky eater as a child? Yes. But now my parents are pickier.

Selecting an appropriate restaurant for a visit from my folks has made for a decade-long challenge. In theory, I should have no shortage of options — New York City is fairly renowned for its culinary variety — but the city itself is short on a few of my parents’ preferences.

Over countless attempts and hundreds of plates, I’ve learned that the right spot requires a delicate ambience of peace and quiet. (We don’t have that here.) There should be ample space. (We don’t have that, either.) Waitstaff should be more talented than necessary, with a cast-iron sense of humor that can withstand my dad’s idea of fun. (It’s the kind of fun that happens after we’ve left: he’ll rib a server with theatrical just-kidding complaints for two hours, then tip big.) It shouldn’t be crowded, but it shouldn’t be empty. The bringer of cheese for the pasta should probably just leave the cheese. Dad won’t eat anything spicy. Mom won’t eat anything raw. Mom will always ask if the table is okay, which always sounds like the table isn’t okay, but when I ask her if she thinks the table is okay, she makes this face like, “Bail me out.”

Have we all become people who shouldn’t be taken anywhere? Probably. I’ve gotten used to my perennial failure to find places that thrive at this impossible nexus of enchantments. I doubt there is a food solution that will always make everyone in this particular triangle of our family totally happy. But for a while there, our solution was Olive Garden.

Olive Garden was our go-to when I was in college. There, everyone was happy — or if we weren’t, everyone was fine. My dad would order Shrimp Scampi; I would order Chicken Marsala; my mom would make their Famous House Salad more famous. We’d eat all the breadsticks, request our first refill, then wrap the second batch to go. I’d reheat them one at a time in my dorm room microwave, wrapping each in a paper towel that would soak up five finger-pressed blots of oil I wouldn’t have to clean. That was where I set the bar in those days — that’s all it took to make for a singular restaurant experience with my family. Would there be leftovers? Great. Olive Garden was fine, and fine was good.

In “Dear Olive Garden, Never Change,” the latest installment in Eater‘s Death of Chains series on the slow decline of middlebrow chain restaurants, Helen Rosner reminds me that this anodyne fine-for-the-whole-family feel is completely by design. “One of the things I love about the Olive Garden,” Rosner writes, “is its nowhereness. I love that I can walk in the door of an Olive Garden in Michigan City, Indiana, and feel like I’m in the same room I enter when I step into an Olive Garden in Queens or Rhode Island or the middle of Los Angeles. There is only one Olive Garden, but it has a thousand doors.”

After three years at Vox Media as Eater‘s Features Editor turned Executive Editor turned Editor-at-Large, Rosner recently announced her departure from “the best goddamn food publication in the world.” She tweeted mysteriously to watch this space for updates, noting only that she is moving on “to crush some new things.” If they’re anything like her greatest hits thus far — on glorified vending machines, Tina Fey’s sheetcaking, chicken tenders, Trump’s ketchup-covered crime scenes, and takedowns of chocolatiers who may not always have had beards — her readers will be sure to bring their bottomless appetites to her next endeavor.

I feel an intense affinity for Olive Garden, which — like the lack of olives on its menu — is by design. The restaurant was built for affinity, constructed from the foundations to the faux-finished rafters to create a sense of connection, of vague familiarity, to bring to mind some half-lost memory of old-world simplicity and ease. Even if you’ve never been to the Olive Garden before, you’re supposed to feel like you have. You know the next song that’s going to play. You know how the chairs roll against the carpet. You know where the bathrooms are. Its product is nominally pasta and wine, but what Olive Garden is actually selling is Olive Garden, a room of comfort and familiarity, a place to return to over and over.

In that way, it’s just like any other chain restaurant. For any individual mid-range restaurant, return customers have always been an easy majority of the clientele, and chain-wide, it’s overwhelmingly the case: If you’ve been to one Olive Garden, odds are very high you’ve been to two or more. If the restaurant is doing it right, though, all the Olive Gardens of your life will blur together into one Olive Garden, one host stand, one bar, one catacomb of dining alcoves warmly decorated in Toscana-lite. Each Olive Garden is a little bit different, but their souls are all the same.

Read the story

The Price of Tuition-Free College

Fairfax Media via Getty Images

Tuition-free college is a reality in California. The catch is that eligible students can’t always afford rent, food, or books.

“More than half of California college students don’t need to worry about tuition,” Ashley Powers writes in a recent feature for California Sunday Magazine. Thanks to California’s Master Plan for Higher Education, federal- and state-subsidized grants are available to help students from low-income families cover the cost of tuition at state-financed universities and colleges. “The problem,” Powers explains, “is the cost of everything else.”

In “The College Try,” Powers follows Liz Waite and Kersheral Jessup, two Cal State students who’ve each put themselves through six years of college. Both went to community colleges first to save money — Jessup for three years before transferring, Waite for six. Both believed a bachelor’s degree would spare them from homelessness, wage slavery, and following in their parents’ addictive footsteps (meth in Waite’s case, alcohol in Jessup’s). As they navigate bureaucratic mazes, couch-surfing roulette, and soul-killing jobs that don’t even require advanced degrees, the duo weigh their years of sacrifice against an unverifiable suspicion that years of work experience might have yielded better prospects.

At the Dems’ weekly meeting, about a dozen students chitchatted in a semicircle; the speakers before Liz were looking for volunteers to take surveys about election-related stress. When it was Liz’s turn, she bounded to the center.

“Hey, everybody, let’s make this awkward,” she said. “What words would you guys use to describe me? Like, if you look at me, what words come to your mind? Just shout ’em out.”

“Tall.”

She nodded. “Tall…”

“Student.”

“Blond.”

“Student, blond, right,” she said. “Here’s a word that’s probably not coming to your mind. And it’s” — she shot out her arms the way you would to yell, “Surprise!” — “homeless!” Liz looked at the audience: saucer eyes.

No type of school has been more successful at lifting the poor up to the middle class and beyond than midtier public universities like the Cal States. In a ranking published this year of colleges that helped the highest percentage of students claw their way out of poverty, four Cal State campuses made the top 10. Cal State Long Beach clinched the last spot, vaulting 78 percent of its students from the bottom of the economic ladder, where household incomes top out around $25,000 a year. But for all the good Cal State does for its alumni, most students there struggle to get their degrees. Only one in five finishes in four years, and a little more than half graduate in six, their progress slowed, in part, by soaring living costs in one of the nation’s most expensive states.

Two-thirds of the expense of attending a public four-year college stems from costs like rent, food, and books. The vast majority of Cal State students live off campus (the system has enough housing to accommodate only about 10 percent of its undergraduates). Cal State Long Beach estimates that off-campus students who don’t live at home need close to $18,000 a year in addition to the cost of tuition, or nearly the salary of a full-time minimum-wage worker.

Last year, researchers at Cal State estimated that nearly one in nine students is homeless. Even more couldn’t afford food on a regular basis (a problem at UCs, the California community colleges, and campuses from Hawaii to New York). Students without stable housing, in particular, are more likely to enroll part time, struggle in class, and drop out altogether. In California, lawmakers recently floated a proposal to help many UC and Cal State students with their expenses. Projected to cost more than a billion dollars a year, it sputtered.

Read the story

When Op-Eds Relitigate Facts

Bret Stephens’s first Op-Ed column for The New York Times.

What year were we taught the difference between facts and opinions in grade school? Was it an election year?

To review: The bar for an opinion is low. The bar for a fact is higher. Statements of fact need to be verifiable, substantiated, and proven. An opinion doesn’t need to meet any standards at all. The bar for what constitutes an opinion — sans corroboration, sans evidence, sans proof — is, indeed, low. The bar for who will listen to it is somewhere else.

A published opinion doesn’t need to meet any particular standard, either, other than an editor deeming an opinion piece worthy of publication. In opinion journalism, the publisher sets the bar. And no publisher’s bar placement comes under more scrutiny than The New York Times’.

At Splinter, David Uberti asks: “Who Is The New York Times‘ Woeful Opinion Section Even For?” If the paper of record is to remain any kind of standard-bearer in our current political moment, what should its opinion section look like? How rigorous should its standards be? Uberti advocates for raising the bar, preferably one or two notches above the denial of facts that have been painstakingly reported on the other side of the Times‘ news-opinion firewall:

In his initial column, in late April, Stephens questioned the predictions about the effects of climate change that the Times has reported on extensively. This slickly branded “climate agnostic” approach stuck a finger in the eye of both the Times’s readership and its newsroom. It risked mimicking the pundit-reporter dynamic seen at CNN, where in-house bloviators are paid to spout opinions that at times directly contradict the network’s own news reporting. Bennet defended the column as part of a “free exchange of ideas,” in what Washington Post media critic Erik Wemple described as a “Boilerplate Kumbaya Response to Public Outrage.”

The op-ed page—opposite of the editorial page—was unveiled by the Times in 1970 to foster a true “conflict of ideas,” as onetime Editorial Page Editor John B. Oakes put it. Points of view clashing with the Times’ institutional perspective or biases would be especially welcome. Names floated as potential contributors ranged from Communists to members of the John Birch Society.

“They really wanted diversity when they came out—they really prized it,” said University of Maine media scholar Michael Socolow, who authored a 2010 paper on the origins of the op-ed page. Its debut contributors included a staff column on the need for supersonic air travel; a Chinese novelist describing Beijing during the Cultural Revolution; a political scientist and former LBJ aide analyzing U.S. policy in Asia; and a New Republic contributing editor slamming Vice President Spiro Agnew. It was a radical expansion of the Times’s opinion offerings that other newspapers soon emulated, and it hasn’t fundamentally changed since then besides expanded publishing space and formats online.

“In general, we’re looking to challenge our own and our readers’ assumptions, and, we hope, put people who disagree on important questions into conversation with each other in order to sharpen everyone’s thinking,” Bennet wrote to Splinter.

Some recent attempts to do so, however, seemed to trade intellectual rigor or true diversity for the appearance thereof.

Read the story

180 Overdoses, 18 Deaths, One Week

Spencer Platt / Getty Images

In July, The Cincinnati Enquirer sent 60 reporters, photographers, and videographers into the community to chronicle an ordinary week during the height of the heroin epidemic in Ohio and Kentucky.

In the interactive feature, “Seven Days of Heroin,” Terry DeMio and Dan Horn piece together a timeline from dozens of videos, transcripts, and field notes. It starts on Monday, July 10, and ends on Sunday, July 16, 2017.

It’s a little after sunrise on the first day of another week, and Cincinnati is waking up again with a heroin problem. So is Covington. And Middletown. And Norwood. And Hamilton. And West Chester Township. And countless other cities and towns across Ohio and Kentucky.

This particular week, July 10 through 16, will turn out to be unexceptional by the dreary standards of what has become the region’s greatest health crisis.

This is normal now, a week like any other. But a terrible week is no less terrible because it is typical. When heroin and synthetic opiates kill one American every 16 minutes, there is little comfort in the routine.

The accounts are harrowing. Vivid, often silent videos punctuate paragraph after paragraph of breathless bodies, emergency dispatches, orphaned children, and death tallies. Loved ones look on as lips turn blue, turn purple. As soon as the reader becomes accustomed to the rhythm of hourly tragedy, each story, like the drug, takes a turn for the worse.

Gaffney, 28, quit cold turkey after learning she was pregnant. She’s living now with the baby at First Step Home, a treatment center in Walnut Hills. They plan to move into an apartment together soon.

After years of addiction, Gaffney’s goals are modest. She wants to raise her child in a normal home. She wants a normal life.

Uebel finishes the examination. “She looks real, real good,” she says.

Gaffney is relieved. She scoops Elliana into her arms and takes her appointment card for her next visit to the clinic in December.

“See you then,” she says.

(Ten days later, Gaffney is dead from a heroin overdose.)

Peter Bhatia, editor and vice president for audience development at the Enquirer and Cincinnati.com, explains in a postscript why the paper took on this seven-day project:

We undertook this work – spreading our staff throughout courtrooms, jails, treatment facilities, finding addicts on the streets and talking to families who have lost loved ones – to put the epidemic in proportion. It is massive. It has a direct or indirect impact on every one of us. It doesn’t discriminate by race, gender, age or economic background. Its insidious spread reaches every neighborhood, every township, every city, regardless of demographics. And it is stressing our health-care systems, hospitals and treatment capacity.

We set out to do this project not to affirm or deny differing views on the cost of battling addiction and its impact. Rather, we set out to understand how it unfolds day in and day out. I believe you will find what we found to be staggering. In the weeks ahead, The Enquirer will build on this effort, devoting more attention to actions our communities can take to make a difference against heroin’s horrible impact.

Hence the title of this ongoing project: “Heroin: Reclaiming Lives.”

Read the story

Kevin Smith’s Second Act

Greg Doherty / Getty Images

To the untrained eye — one without twenty years of 20/20 hindsight — it probably seemed as if Kevin Smith was just building a body of work. From 1994 to 1999, he wrote and directed Clerks, Mallrats, Chasing Amy, and Dogma. If critics and audiences took that as the beginning of a film career, hey, that’s their bad. What Kevin Smith was really doing in the 1990s was building a platform for the business of Kevin Smith.

“He’s still the casual, improvisational creator who slapped together Clerks,” Abraham Riesman writes in his profile of Smith for Vulture, “only now, his professional project isn’t a movie. It’s his existence.”

To call him a filmmaker as of now would be either misleading or misguided. Sure, he still makes movies on occasion — weird ones, deliciously weird, completely unlike the slacker comedies that made him the peer of Tarantino, Linklater, and other indie luminaries — but they’re intermittent affairs. Nowadays, his primary stream of income comes from live performances to sold-out theaters, ones where he typically just gets on stage and talks about whatever for a few hours.

His other outlet for work is similarly based on rambling: podcasts, six of which he personally hosts — discussing topics ranging from Batman to addiction recovery — and many more of which he distributes as part of his imperial “SModcast” brand. He produces and appears on an AMC reality show that just got renewed for a seventh season. He preaches to a congregation of 3.24 million on Twitter and 2.8 million on Facebook. He tours the world. He’s in the business of giving his followers more and more Kevin Smith, and business is quite good.

That’s the key: everything comes back to Smith’s talent for and fixation on talking. Directing a TV show, going on a trip, having sex with his wife — it all provides content for him to speak to his devotees in live shows and on podcasts. “Especially at the theater shows, I think he puts it all out there and he doesn’t really shy away from talking about every aspect of his life,” says friend and Comic Book Men co-host Walt Flanagan. “And bringing it honesty, along with humor. I just think it makes an audience just sit there and become fully engulfed in what he’s saying.”

In other words, Smith has transformed himself into the perfect figure for our current media landscape. Audiences have a decreasing tolerance for entertainment that feels practiced and rehearsed — they want people who shoot from the hip, say what they mean, and mean what they say. Smith delivers all of that. In an informational ecosystem where there are far too many chattering voices, people want someone who speaks loudly and directly to their interests and worldview, and Smith and the SModcast empire do that. We’re all forced to self-promote and self-start these days, and Smith is a patron saint in that realm. Even if his time in the spotlight is in the past, few artists have more expertly navigated the present.

Read the story

Working Class Jilts America’s Sweetheart Deal

Jefta Images / Barcroft Images / Barcroft Media via Getty Images

Inequalities in employment are making America’s favorite business transaction, heterosexual marriage, less and less attractive.

At The Atlantic, Victor Tan Chen — an assistant professor of sociology and author of Cut Loose: Jobless and Hopeless in an Unfair Economy — brings together the latest research on income inequality and education to break down the marriageable-man theory. Where marriage rates once tracked male earnings, rising in working-class regions in the 1970s and ’80s, Chen finds the link only holds today if women’s earnings also remain relatively flat or depressed. More often now, as good jobs for working-class men disappear, women are indeed less likely to marry them — unless the bride(-or-not)-to-be is laid off, too, in which case she’ll head to a more gainfully employed man’s altar.

Here Chen’s examination of income inequality, gender-bending breadwinners, social safety nets, and more illustrates how unemployment disproportionately affects the business of romance in America:

Why are those with less education—the working class—entering into, and staying in, traditional family arrangements in smaller and smaller numbers? Some tend to stress that the cultural values of the less educated have changed, and there is some truth to that. But what’s at the core of those changes is a larger shift: The disappearance of good jobs for people with less education has made it harder for them to start, and sustain, relationships.

What’s more, the U.S.’s relatively meager safety net makes the cost of being unemployed even steeper than it is in other industrialized countries—which prompts many Americans to view the decision to stay married with a jobless partner in more transactional, economic terms. And this isn’t only because of the financial ramifications of losing a job, but, in a country that puts such a premium on individual achievement, the emotional and psychological consequences as well. Even when it comes to private matters of love and lifestyle, the broader social structure—the state of the economy, the availability of good jobs, and so on—matters a great deal.

In doing research for a book about workers’ experiences of being unemployed for long periods, I saw how people who once had good jobs became, over time, “unmarriageable.” I talked to many people without jobs, men in particular, who said that dating, much less marrying or moving in with someone, was no longer a viable option: Who would take a chance on them if they couldn’t provide anything?

And for those already in serious relationships, the loss of a job can be devastating in its own way. One man I met, a 51-year-old who used to work at a car plant in Detroit, had been unemployed on and off for three years. (As is standard in sociology, my interviewees were promised confidentiality.) Over that period, his marriage fell apart. “I’ve got no money and now she’s got a job,” he told me. “All credibility is out the tubes when you can’t pay the bills.” The reason his wife started cheating on him and eventually left him, he said, was that “a man came up with money.”

His loss of “credibility” wasn’t just about earnings. He worried that, like his wife, his two young kids looked down on him. He’d always been working before; now they wondered why he was always home. In his own mind, being out of work for so long had made him less of a man. “It’s kinda tough when you can’t pay the bills, you know. So I have been going through a lot of depression lately,” he told me. Unemployment makes you unable to “be who you are, or who you once were,” he added, and that state of mind probably didn’t make him an appealing person to live with.

Read the story

Joss Whedon and the Feminist Pedestal: A Reading List

Jason LaVeris / FilmMagic / Getty Images

I don’t remember when Joss Whedon went from being a garden-variety household name to being someone I refer to on a first-name basis. I quote Joss, I verb Joss, I adjective Joss. As a woman who was once a teenage girl who grew up with Buffy, I’ve internalized more than my fair share of lessons from Our Lady of Buffdom. For the better part of twenty years, I’ve known Joss Whedon as the creator of a feminist hero.

For the better part of the same twenty years, Kai Cole knew Joss Whedon as her partner and husband. He was just Joss to her, too — far more intimately Joss than to any of his first-name-basis-ing fans.

This weekend, Cole wrote about her divorce from Joss in a post on The Wrap. She writes about how, on their honeymoon in England in 1995, she encouraged him to turn his script for Buffy the Vampire Slayer — which had been misinterpreted as a film — into a television show. Joss apparently hadn’t wanted to work in television anymore. I repeat: As of 1995, Joss Whedon “didn’t want to work in television anymore.”

Yet on March 10, 1997 — two years after their honeymoon — Buffy aired on The WB.

According to Cole’s post, Joss had his first affair on the set of Buffy, and continued to have affairs in secret for fifteen years. I believe Cole. I believe that when she quotes Joss in her post, she is quoting him verbatim. I’ve quoted him verbatim, too.

(Or have I? I wonder, knowing more now than I did then about writers’ rooms, whether every line I attribute to episodes credited as “Written by Joss Whedon” was, in fact, written by Joss Whedon. Every time Jane Espenson tweets credit for specific lines to specific writers on Once Upon a Time — or retroactively to Buffy quotes — I wonder. Every time I watch UnREAL, a show co-created by Sarah Gertrude Shapiro and Marti Noxon that sends up how often women are discredited in television, I wonder. I don’t doubt that Joss was responsible for the vast majority of what I’d call classic Joss dialogue. I’ll just never know which lines weren’t actually his.)

After I saw Joss Whedon trending and read Cole’s post, I scrolled through other longtime fans’ and non-fans’ reactions on Twitter. Many were not surprised. I texted friends about my own lack of surprise, punctuated with single-tear emojis: “I almost can’t even call it disappointed. As though it would be actually inhuman to expect something else.”

Cole quotes a letter Joss wrote to her when their marriage was falling apart, when he was “done with” lying to her about the truth of his affairs. He invokes the inhuman in his confession, too — or, as is so often the case with Joss, the superhuman: “When I was running ‘Buffy,’ I was surrounded by beautiful, needy, aggressive women. It felt like I had a disease, like something from a Greek myth. Suddenly I am a powerful producer and the world is laid out at my feet and I can’t touch it.”

Was it superhuman for Cole to expect her husband to resist that kind of power? Would Joss have been running Buffy if he hadn’t married Cole? “I was a powerful influence on the career choices Joss made during the 20 years we were together,” Cole writes. “I kept him grounded, and helped him find the quickest way to the success he so deeply craved. I loved him. And in return, he lied to me.”

As Marianne Eloise notes below in Dazed, it remains to be seen whether Cole’s letter will impact Joss’s career, most notably as director of the upcoming Batgirl. In the meantime, his fans are left to resolve tense, charged questions, none of which have easy answers: How do we come to personal decisions about whether or not we can separate the art from the artist? Will consequences come in the form of a public fall from feminist grace, or cost Joss professional opportunities he’s been enjoying for decades as a self-proclaimed feminist artist? Do feminists, male or female, need to be perfect to count?

In “Lie to Me” — Season 2 Episode 7, “Written by Joss Whedon” — Angel asks Buffy if she loves him. Buffy answers, “I love you. I don’t know if I trust you.” For fans and collaborators who are working through hard questions about love and the loss of trust this week, here is some guided reading on feminism, fandom, and fidelity for Whedonverse enthusiasts:
Read more…

Can Apple End Smartphone Addiction?

Markus Daniel / Getty Images

According to Tristan Harris, it’s going to take more than infinite willpower for billions of people to resist the infinite scroll of the attention economy. It’s going to take regulation, reform, and Apple becoming something of an acting government.

Harris — a former Google design ethicist and co-founder of Time Well Spent, a nonprofit that encourages tech companies to put users’ best interests before limitless profit models — insists that our minds have been hijacked in an arms race for our attention. He also insists that, with the help of a Hippocratic Oath for software designers, we can win.

“YouTube has a hundred engineers who are trying to get the perfect next video to play automatically,” Harris says in a new interview with WIRED‘s editor in chief Nicholas Thompson. “Their techniques are only going to get more and more perfect over time, and we will have to resist the perfect.”

See? This is me resisting.

In their WIRED interview, Thompson and Harris discuss why now is the moment to invest in reforming the attention economy.

THOMPSON: At what point do I stop making the choice [to use Facebook or Google or Instagram]? At what point am I being manipulated? At what point is it Nick and at what point is it the machine?

HARRIS: Well I think that’s the million-dollar question. First of all, let’s also say that it’s not necessarily bad to be hijacked, we might be glad if it was time well spent for us. I’m not against technology. And we’re persuaded to do things all the time. It’s just that the premise in the war for attention is that it’s going to get better and better at steering us toward its goals, not ours. We might enjoy the thing it persuades us to do, which makes us feel like we made the choice ourselves. For example, we forget if the next video loaded and we were happy about the video we watched. But, in fact, we were hijacked in that moment. All those people who are working to give you the next perfect thing on YouTube don’t know that it’s 2 am and you might also want to sleep. They’re not on your team. They’re only on the team of what gets you to spend more time on that service.

Again, the energy analogy is useful. Energy companies used to have the same perverse dynamic: I want you to use as much energy as possible. Please just let the water run until you drain the reservoir. Please keep the lights on until there’s no energy left. We, the energy companies, make more money the more energy you use. And that was a perverse relationship. And in many US states, we changed the model to decouple how much money energy companies make from how much energy you use. We need to do something like that for the attention economy, because we can’t afford a world in which this arms race is to get as much attention from you as possible.

The opportunity here is for Apple. Apple is the one company that could actually do it. Because their business model does not rely on attention, and they actually define the playing field on which everyone seeking our attention plays. They define the rules. If you want to say it, they’re like a government. They get to set the rules for everybody else. They set the currency of competition, which is currently attention and engagement. App stores rank things based on their success in number of downloads or how much they get used. Imagine if instead they said, “We’re going to change the currency.” They could move it from the current race to the bottom to creating a race to the top for what most helps people with different parts of their lives. I think they’re in an incredible position to do that.

Read the story