Author Archives

Catherine Cusick
Catherine Cusick is the audience development editor of Longreads. She was previously a rep at the American Booksellers Association, as well as the social editor for IndieBound, a nationwide local-first movement.

When Op-Eds Relitigate Facts

What year were we taught the difference between facts and opinions in grade school? Was it an election year?

To review: The bar for an opinion is low. The bar for a fact is higher. Statements of fact need to be verifiable, substantiated, and proven. An opinion doesn’t need to meet any standards at all. The bar for what constitutes an opinion — sans corroboration, sans evidence, sans proof — is, indeed, low. The bar for who will listen to it is somewhere else.

A published opinion doesn’t need to meet any particular standard, either, other than an editor deeming an opinion piece worthy of publication. In opinion journalism, the publisher sets the bar. And no publisher’s bar placement comes under more scrutiny than The New York Times’.

At Splinter, David Uberti asks: “Who Is The New York Times’ Woeful Opinion Section Even For?” If the paper of record is to remain any kind of standard-bearer in our current political moment, what should its opinion section look like? How rigorous should its standards be? Uberti advocates for raising the bar, preferably one or two notches above the denial of facts that have been painstakingly reported on the other side of the Times’ news-opinion firewall:

In his initial column, in late April, Stephens questioned the predictions about the effects of climate change that the Times has reported on extensively. This slickly branded “climate agnostic” approach stuck a finger in the eye of both the Times’s readership and its newsroom. It risked mimicking the pundit-reporter dynamic seen at CNN, where in-house bloviators are paid to spout opinions that at times directly contradict the network’s own news reporting. Bennet defended the column as part of a “free exchange of ideas,” in what Washington Post media critic Erik Wemple described as a “Boilerplate Kumbaya Response to Public Outrage.”

The op-ed page—opposite the editorial page—was unveiled by the Times in 1970 to foster a true “conflict of ideas,” as onetime Editorial Page Editor John B. Oakes put it. Points of view clashing with the Times’ institutional perspective or biases would be especially welcome. Names floated as potential contributors ranged from Communists to members of the John Birch Society.

“They really wanted diversity when they came out—they really prized it,” said University of Maine media scholar Michael Socolow, who authored a 2010 paper on the origins of the op-ed page. Its debut included a staff column on the need for supersonic air travel; a Chinese novelist describing Beijing during the Cultural Revolution; a political scientist and former LBJ aide analyzing U.S. policy in Asia; and a New Republic contributing editor slamming Vice President Spiro Agnew. It was a radical expansion of the Times’s opinion offerings that other newspapers soon emulated, and it hasn’t fundamentally changed since then besides expanded publishing space and formats online.

“In general, we’re looking to challenge our own and our readers’ assumptions, and, we hope, put people who disagree on important questions into conversation with each other in order to sharpen everyone’s thinking,” Bennet wrote to Splinter.

Some recent attempts to do so, however, seemed to trade intellectual rigor or true diversity for the appearance thereof.

Read the story

180 Overdoses, 18 Deaths, One Week

In July, The Cincinnati Enquirer sent 60 reporters, photographers, and videographers into the community to chronicle an ordinary week during the height of the heroin epidemic in Ohio and Kentucky.

In the interactive feature, “Seven Days of Heroin,” Terry DeMio and Dan Horn piece together a timeline from dozens of videos, transcripts, and field notes. It starts on Monday, July 10, and ends on Sunday, July 16, 2017.

It’s a little after sunrise on the first day of another week, and Cincinnati is waking up again with a heroin problem. So is Covington. And Middletown. And Norwood. And Hamilton. And West Chester Township. And countless other cities and towns across Ohio and Kentucky.

This particular week, July 10 through 16, will turn out to be unexceptional by the dreary standards of what has become the region’s greatest health crisis.

This is normal now, a week like any other. But a terrible week is no less terrible because it is typical. When heroin and synthetic opiates kill one American every 16 minutes, there is little comfort in the routine.

The accounts are harrowing. Vivid, often silent videos punctuate paragraph after paragraph of breathless bodies, emergency dispatches, orphaned children, and death tallies. Loved ones look on as lips turn blue, turn purple. As soon as the reader becomes accustomed to the rhythm of hourly tragedy, each story, like the drug, takes a turn for the worse.

Gaffney, 28, quit cold turkey after learning she was pregnant. She’s living now with the baby at First Step Home, a treatment center in Walnut Hills. They plan to move into an apartment together soon.

After years of addiction, Gaffney’s goals are modest. She wants to raise her child in a normal home. She wants a normal life.

Uebel finishes the examination. “She looks real, real good,” she says.

Gaffney is relieved. She scoops Elliana into her arms and takes her appointment card for her next visit to the clinic in December.

“See you then,” she says.

(Ten days later, Gaffney is dead from a heroin overdose.)

Peter Bhatia, editor and vice president for audience development at the Enquirer and Cincinnati.com, explains in a postscript why they took on this seven-day project:

We undertook this work – spreading our staff throughout courtrooms, jails, treatment facilities, finding addicts on the streets and talking to families who have lost loved ones – to put the epidemic in proportion. It is massive. It has a direct or indirect impact on every one of us. It doesn’t discriminate by race, gender, age or economic background. Its insidious spread reaches every neighborhood, every township, every city, regardless of demographics. And it is stressing our health-care systems, hospitals and treatment capacity.

We set out to do this project not to affirm or deny differing views on the cost of battling addiction and its impact. Rather, we set out to understand how it unfolds day in and day out. I believe you will find what we found to be staggering. In the weeks ahead, The Enquirer will build on this effort, devoting more attention to actions our communities can take to make a difference against heroin’s horrible impact.

Hence the title of this ongoing project: “Heroin: Reclaiming Lives.”

Read the story

Kevin Smith’s Second Act

To the untrained eye — one without twenty years of 20/20 hindsight — it probably seemed as if Kevin Smith was just building a body of work. From 1994 to 1999, he wrote and directed Clerks, Mallrats, Chasing Amy, and Dogma. If critics and audiences took that as the beginning of a film career, hey, that’s their bad. What Kevin Smith was really doing in the 1990s was building a platform for the business of Kevin Smith.

“He’s still the casual, improvisational creator who slapped together Clerks,” Abraham Riesman writes in his profile of Smith for Vulture, “only now, his professional project isn’t a movie. It’s his existence.”

To call him a filmmaker as of now would be either misleading or misguided. Sure, he still makes movies on occasion — weird ones, deliciously weird, completely unlike the slacker comedies that made him the peer of Tarantino, Linklater, and other indie luminaries — but they’re intermittent affairs. Nowadays, his primary stream of income comes from live performances to sold-out theaters, ones where he typically just gets on stage and talks about whatever for a few hours.

His other outlet for work is similarly based on rambling: podcasts, six of which he personally hosts — discussing topics ranging from Batman to addiction recovery — and many more of which he distributes as part of his imperial “SModcast” brand. He produces and appears on an AMC reality show that just got renewed for a seventh season. He preaches to a congregation of 3.24 million on Twitter and 2.8 million on Facebook. He tours the world. He’s in the business of giving his followers more and more Kevin Smith, and business is quite good.

That’s the key: everything comes back to Smith’s talent for and fixation on talking. Directing a TV show, going on a trip, having sex with his wife — it all provides content for him to speak to his devotees in live shows and on podcasts. “Especially at the theater shows, I think he puts it all out there and he doesn’t really shy away from talking about every aspect of his life,” says friend and Comic Book Men co-host Walt Flanagan. “And bringing it honesty, along with humor. I just think it makes an audience just sit there and become fully engulfed in what he’s saying.”

In other words, Smith has transformed himself into the perfect figure for our current media landscape. Audiences have a decreasing tolerance for entertainment that feels practiced and rehearsed — they want people who shoot from the hip, say what they mean, and mean what they say. Smith delivers all of that. In an informational ecosystem where there are far too many chattering voices, people want someone who speaks loudly and directly to their interests and worldview, and Smith and the SModcast empire do that. We’re all forced to self-promote and self-start these days, and Smith is a patron saint in that realm. Even if his time in the spotlight is in the past, few artists have more expertly navigated the present.

Read the story

Working Class Jilts America’s Sweetheart Deal

Inequalities in employment are making America’s favorite business transaction, heterosexual marriage, less and less attractive.

At The Atlantic, Victor Tan Chen — an assistant professor of sociology and author of Cut Loose: Jobless and Hopeless in an Unfair Economy — brings together the latest research on income inequality and education to break down the marriageable-man theory. While marriage rates had previously increased in working-class regions in the 1970s and ’80s as male earnings rose, Chen finds that this only holds today if women’s earnings also remain relatively flat or depressed. The case now, more often, is that as good jobs for working-class men disappear, women are indeed less likely to marry them — unless the bride(-or-not)-to-be is laid off, too, in which case she’ll head to a more gainfully employed man’s altar.

Here Chen’s examination of income inequality, gender-bending breadwinners, social safety nets, and more illustrates how unemployment disproportionately affects the business of romance in America:

Why are those with less education—the working class—entering into, and staying in, traditional family arrangements in smaller and smaller numbers? Some tend to stress that the cultural values of the less educated have changed, and there is some truth to that. But what’s at the core of those changes is a larger shift: The disappearance of good jobs for people with less education has made it harder for them to start, and sustain, relationships.

What’s more, the U.S.’s relatively meager safety net makes the cost of being unemployed even steeper than it is in other industrialized countries—which prompts many Americans to view the decision to stay married with a jobless partner in more transactional, economic terms. And this isn’t only because of the financial ramifications of losing a job, but, in a country that puts such a premium on individual achievement, the emotional and psychological consequences as well. Even when it comes to private matters of love and lifestyle, the broader social structure—the state of the economy, the availability of good jobs, and so on—matters a great deal.

In doing research for a book about workers’ experiences of being unemployed for long periods, I saw how people who once had good jobs became, over time, “unmarriageable.” I talked to many people without jobs, men in particular, who said that dating, much less marrying or moving in with someone, was no longer a viable option: Who would take a chance on them if they couldn’t provide anything?

And for those already in serious relationships, the loss of a job can be devastating in its own way. One man I met, a 51-year-old who used to work at a car plant in Detroit, had been unemployed on and off for three years. (As is standard in sociology, my interviewees were promised confidentiality.) Over that period, his marriage fell apart. “I’ve got no money and now she’s got a job,” he told me. “All credibility is out the tubes when you can’t pay the bills.” The reason his wife started cheating on him and eventually left him, he said, was that “a man came up with money.”

His loss of “credibility” wasn’t just about earnings. He worried that, like his wife, his two young kids looked down on him. He’d always been working before; now they wondered why he was always home. In his own mind, being out of work for so long had made him less of a man. “It’s kinda tough when you can’t pay the bills, you know. So I have been going through a lot of depression lately,” he told me. Unemployment makes you unable to “be who you are, or who you once were,” he added, and that state of mind probably didn’t make him an appealing person to live with.

Read the story

Joss Whedon and the Feminist Pedestal: A Reading List

I don’t remember when Joss Whedon went from being a garden-variety household name to being someone I refer to on a first-name basis. I quote Joss, I verb Joss, I adjective Joss. As a woman who was once a teenage girl who grew up with Buffy, I’ve internalized more than my fair share of lessons from Our Lady of Buffdom. For the better part of twenty years, I’ve known Joss Whedon as the creator of a feminist hero.

For the better part of the same twenty years, Kai Cole knew Joss Whedon as her partner and husband. He was just Joss to her, too — far more intimately Joss than to any of his first-name-basis-ing fans.

This weekend, Cole wrote about her divorce from Joss in a post on The Wrap. She writes about how, on their honeymoon in England in 1995, she encouraged him to turn his script for Buffy the Vampire Slayer — which had just been misinterpreted as a film — into a television show. Joss apparently hadn’t wanted to work in television anymore. I repeat: As of 1995, Joss Whedon “didn’t want to work in television anymore.”

Yet on March 10, 1997 — two years after their honeymoon — Buffy aired on The WB.

According to Cole’s post, Joss had his first affair on the set of Buffy, and continued to have affairs in secret for fifteen years. I believe Cole. I believe that when she quotes Joss in her post, she is quoting him verbatim. I’ve quoted him verbatim, too.

(Or have I? I wonder, knowing more now than I did then about writers rooms, whether every line I attribute to episodes credited as “Written by Joss Whedon” was, in fact, written by Joss Whedon. Every time Jane Espenson tweets credit for specific lines to specific writers on Once Upon a Time — or retroactively to Buffy quotes — I wonder. Every time I watch UnREAL, a show co-created by Sarah Gertrude Shapiro and Marti Noxon that sends up how often women are discredited in television, I wonder. I don’t doubt that Joss was responsible for the vast majority of what I’d call classic Joss dialogue. I’ll just never know which lines weren’t actually his.)

After I saw Joss Whedon trending and read Cole’s post, I scrolled through other longtime fans’ and non-fans’ reactions on Twitter. Many were not surprised. I texted friends about my own lack of surprise, punctuated with single-tear emojis: “I almost can’t even call it disappointed. As though it would be actually inhuman to expect something else.”

Cole quotes a letter Joss wrote to her when their marriage was falling apart, when he was “done with” lying to her about the truth of his affairs. He invokes the inhuman in his confession, too — or, as is so often the case with Joss, the superhuman: “When I was running ‘Buffy,’ I was surrounded by beautiful, needy, aggressive women. It felt like I had a disease, like something from a Greek myth. Suddenly I am a powerful producer and the world is laid out at my feet and I can’t touch it.”

Was it superhuman for Cole to expect her husband to resist that kind of power? Would Joss have been running Buffy, if he hadn’t married Cole? “I was a powerful influence on the career choices Joss made during the 20 years we were together,” Cole writes. “I kept him grounded, and helped him find the quickest way to the success he so deeply craved. I loved him. And in return, he lied to me.”

As Marianne Eloise notes below in Dazed, it remains to be seen whether Cole’s post will impact Joss’s career, most notably as director of the upcoming Batgirl. In the meantime, his fans are left to resolve tense, charged questions, none of which have easy answers: How do we come to personal decisions about whether or not we can separate the art from the artist? Will consequences come in the form of a public fall from feminist grace, or cost Joss professional opportunities he’s been enjoying for decades as a self-proclaimed feminist artist? Do feminists, male or female, need to be perfect to count?

In “Lie to Me” — Season 2 Episode 7, “Written by Joss Whedon” — Angel asks Buffy if she loves him. Buffy answers, “I love you. I don’t know if I trust you.” For fans and collaborators who are working through hard questions about love and the loss of trust this week, here is some guided reading on feminism, fandom, and fidelity for Whedonverse enthusiasts:
Read more…

Can Apple End Smartphone Addiction?

According to Tristan Harris, it’s going to take more than infinite willpower for billions of people to resist the infinite scroll of the attention economy. It’s going to take regulation, reform, and Apple becoming something of an acting government.

Harris — a former Google design ethicist and the founder of Time Well Spent, a nonprofit that encourages tech companies to put users’ best interests before limitless profit models — insists that our minds have been hijacked in an arms race for our attention. He also insists that, with the help of a Hippocratic Oath for software designers, we can win.

“YouTube has a hundred engineers who are trying to get the perfect next video to play automatically,” Harris says in a new interview with WIRED’s editor in chief Nicholas Thompson. “Their techniques are only going to get more and more perfect over time, and we will have to resist the perfect.”

See? This is me resisting:

In the interview, Thompson and Harris discuss why now is the moment to invest in reforming the attention economy:

THOMPSON: At what point do I stop making the choice [to use Facebook or Google or Instagram]? At what point am I being manipulated? At what point is it Nick and at what point is it the machine?

HARRIS: Well I think that’s the million-dollar question. First of all, let’s also say that it’s not necessarily bad to be hijacked, we might be glad if it was time well spent for us. I’m not against technology. And we’re persuaded to do things all the time. It’s just that the premise in the war for attention is that it’s going to get better and better at steering us toward its goals, not ours. We might enjoy the thing it persuades us to do, which makes us feel like we made the choice ourselves. For example, we forget if the next video loaded and we were happy about the video we watched. But, in fact, we were hijacked in that moment. All those people who are working to give you the next perfect thing on YouTube don’t know that it’s 2 am and you might also want to sleep. They’re not on your team. They’re only on the team of what gets you to spend more time on that service.

Again, the energy analogy is useful. Energy companies used to have the same perverse dynamic: I want you to use as much energy as possible. Please just let the water run until you drain the reservoir. Please keep the lights on until there’s no energy left. We, the energy companies, make more money the more energy you use. And that was a perverse relationship. And in many US states, we changed the model to decouple how much money energy companies make from how much energy you use. We need to do something like that for the attention economy, because we can’t afford a world in which this arms race is to get as much attention from you as possible.

The opportunity here is for Apple. Apple is the one company that could actually do it. Because their business model does not rely on attention, and they actually define the playing field on which everyone seeking our attention plays. They define the rules. If you want to say it, they’re like a government. They get to set the rules for everybody else. They set the currency of competition, which is currently attention and engagement. App stores rank things based on their success in number of downloads or how much they get used. Imagine if instead they said, “We’re going to change the currency.” They could move it from the current race to the bottom to creating a race to the top for what most helps people with different parts of their lives. I think they’re in an incredible position to do that.

Read the story

‘You Start Hiring Job-Quitters’

When everyone is encouraged to think of herself as a business, working for anyone else can only ever be considered a training ground.

As companies have divested themselves of long-term obligations to workers (read: pensions, benefits, paths to advancement), employees (read: job-seekers) have developed a taste, in kind, for short-term, commitment-free work arrangements. Their aim in landing any given job has since become landing another job elsewhere, using the job as an opportunity to develop transferable skills — and then to go ahead and transfer. The appeal of a job becomes how lucrative it will be to quit.

At Aeon, Ilana Gershon describes how this calculus of quitting changes workplace dynamics, management techniques, division of labor, and the nature of being co-workers. “After all,” Gershon writes, “everyone works in the quitting economy, and everyone knows it.”

If you are a white-collar worker, it is simply rational to view yourself first and foremost as a job quitter – someone who takes a job for a certain amount of time when the best outcome is that you quit for another job (and the worst is that you get laid off). So how does work change when everyone is trying to become a quitter? First of all, in the society of perpetual job searches, different criteria make a job good or not. Good jobs used to be ones with a good salary, benefits, location, hours, boss, co-workers, and a clear path towards promotion. Now, a good job is one that prepares you for your next job, almost always with another company.

Your job might be a space to learn skills that you can use in the future. Or, it might be a job with a company that has a good-enough reputation that other companies are keen to hire away its employees. On the other hand, it isn’t as good a job if everything you learn there is too specific to that company, if you aren’t learning easily transferrable skills. It isn’t a good job if it enmeshes you in local regulatory schemes and keeps you tied to a particular location. And it isn’t a good job if you have to work such long hours that you never have time to look for the next job. In short, a job becomes a good job if it will lead to another job, likely with another company or organisation. You start choosing a job for how good it will be for you to quit it.

Read the story

New York City’s Housing Emergency

Despite having some of the most progressive housing laws in the country, New York City is in the throes of a humanitarian emergency: a man-made and large-scale “displacement of populations” from their homes.

In an essay for The New York Review of Books, Michael Greenberg breaks down four aspects of the city’s current housing crisis: homelessness, rent stabilization loopholes, Mayor de Blasio’s housing plan, and alternatives for reform. Nestled within every terrifying statistic are heartbreaking personal stories — landlords grinding down tenants financially and emotionally until they give in, families with children bought out of apartments they’ve lived in for decades after the rent “perfectly legally” doubles overnight. “I put up with these streets when you had to be half-crazy to go out to the bodega for a quart of milk after dark,” one renter says. “Why should we have to leave?”

An artist I know in South Williamsburg took flight after her landlord paid a homeless man to sleep outside her door, defecate in the hallway, invite friends in for drug-fueled parties, and taunt her as she entered and left the building. In East New York a mother tells of a landlord who, after claiming to smell gas in the hallway, gained entry to her apartment and then locked her out. In January, a couple with a three-month-old baby in Bushwick complained to the city because they had no heat. In response, the landlord threatened to alert the Administration for Children’s Services that they were living with a baby in an unheated apartment. Fearful of losing their child, they left, leaving the owner with what he wanted: a vacant unit.

Stories like these move through the city like an underground stream. I repeat them not because they are extraordinary, but because they are a fact of life for thousands of New Yorkers. For the most part they go unnoticed. The displaced slink away, crouched into their private misfortune, seeking whatever solution they can find. Many experience displacement as a personal failure; they dissolve to the fringes of the city, forced to travel two or three hours to earn a minimum wage, or out of the city altogether, to depressed regions of Long Island, New Jersey, or upstate New York. If they have roots in the Caribbean, as some residents of Central Brooklyn do, they may try to start again there. Or they may join the growing number of people who are officially homeless, dependent on the city for shelter.

Read the story

Seeing and Being Seen in Shakespeare

In Hazlitt, Nicole Chung writes about taking her eight-year-old daughter to see last year’s production of The Winter’s Tale (dir. Desdemona Chiang) at the Oregon Shakespeare Festival. The production, which featured a predominantly Asian American cast and creative team, offered Chung an all-too-rare chance to let her daughter see herself in the characters onstage — something that happens, Chung estimates, “probably less than one percent” of the time.

In a culture that whitewashes Asian and Asian American characters out of so many stories, Chung hopes that this night out at the theater can create a memory that fuels her daughter’s imagination — and her ability to imagine herself as a protagonist in her own life — for years to come.

As we watched actors of three different generations portray mother, father, daughter, and little son, I tried to remember the last time I saw so many Asian American women in a single work. After a while, though, I realized I was focusing less and less on the fact that they were Asian. It wasn’t that I stopped noticing or caring. But after the initial surprise wears off, seeing so many Asian American actors at once becomes utterly unexceptional. They simply are their characters, as all skilled actors are when performing; their presence makes a perfect kind of sense. As we watched not one but so many Asian American artists command the stage, feuding and scheming and falling in love as great characters do, it made me wonder why something so easy has to be so rare.

Stars shone high above the stage by the time the company took their bows. My sleepy child told me that she didn’t believe Hermione was alive all along, in hiding and pretending to be a statue. She thought the queen had died, and then been revived by magic. “You said this story was kind of like a fairy tale,” she said, “and in fairy tales, magic isn’t strange at all. It’s just normal.”

Read the story

Tennessee Williams’ Catastrophe of Success

Four days before the 1947 Broadway opening of A Streetcar Named Desire, the New York Times published an essay by Tennessee Williams on the depression he’d experienced after the success of The Glass Menagerie summarily ended life as he’d known it.

Fame had turned Williams into a “public Somebody” overnight, a crisis that ultimately landed him in the hospital, “mainly because of the excuses it gave me to withdraw from the world behind a gauze mask.”

The sort of life that I had had previous to this popular success was one that required endurance, a life of clawing and scratching along a sheer surface and holding on tight with raw fingers to every inch of rock higher than the one caught hold of before, but it was a good life because it was the sort of life for which the human organism is created.

I was not aware of how much vital energy had gone into this struggle until the struggle was removed. I was out on a level plateau with my arms still thrashing and my lungs still grabbing at air that no longer resisted. This was security at last.

I sat down and looked about me and was suddenly very depressed.

After spending three months witnessing inequities that felt wrong in a luxury hotel, let alone in a functioning democracy, Williams sought salvation from fame’s spiritually bankrupt life of leisure, hoping to distance himself from a toxic setup he believed hurt everyone it touched:

The sight of an ancient woman, gasping and wheezing as she drags a heavy pail of water down a hotel corridor to mop up the mess of some drunken overprivileged guest, is one that sickens and weighs upon the heart and withers it with shame for this world in which it is not only tolerated but regarded as proof positive that the wheels of Democracy are functioning as they should without interference from above or below. Nobody should have to clean up anybody else’s mess in this world. It is terribly bad for both parties, but probably worse for the one receiving the service.

Williams suggests we should let machines take up some of humanity’s unwanted tasks, then takes a poetic detour into the consequences of that automation. Removing work from the equation of living, he observes, creates a void of paranoid inertia. Just as he concludes that outsourcing this work to fellow humans breeds depression, he notes that advances in technology designed to lighten the load often render the average person fearful of struggle itself.

We are like a man who has bought up a great amount of equipment for a camping trip, who has the canoe and the tent and the fishing lines and the axe and the guns, the mackinaw and the blankets, but who now, when all the preparations and the provisions are piled expertly together, is suddenly too timid to set out on the journey but remains where he was yesterday and the day before and the day before that, looking suspiciously through white lace curtains at the clear sky he distrusts. Our great technology is a God-given chance for adventure and for progress which we are afraid to attempt.

The essay is available online as part of The New School History Project, a site where students curate a trove of recovered archival material to provoke critical and informed discussion.

Read the story