The American Worth Ethic

Getty / Photo Illustration by Longreads

Bryce Covert | Longreads | April 2019 | 13 minutes (3,374 words)

“The American work ethic, the motivation that drives Americans to work longer hours each week and more weeks each year than any of our economic peers, is a long-standing contributor to America’s success.” Thus reads the first sentence of a massive report the Trump administration released in July 2018. Americans’ drive to work ever harder, longer, and faster is at the heart of the American Dream: the idea, which has become more mythology than reality in a country with yawning income inequality and stagnating upward economic mobility, that if an American works hard enough she can attain her every desire. And we really try: We put in between 30 and 90 minutes more each day than the typical European. We work 400 hours more annually than the high-output Germans and clock more office time than even the work-obsessed Japanese.

The story of individual hard work is embedded into the very founding of our country, from the supposedly self-made, entrepreneurial Founding Fathers to the pioneers who plotted the United States’ western expansion; little do we acknowledge that the riches of this country were built on the backs of African slaves, many owned by the Founding Fathers themselves, whose descendants live under oppressive policies that continue to leave them with lower incomes and overall wealth and in greater poverty. We — the “we” who write the history books — would rather tell ourselves that the people who shaped our country did it through their own hard work and not by standing on the shoulders, or stepping on the necks, of others. It’s an easier story to live with. It’s one where the people with power and money have it because they deserve it, not because they took it, and where we each have an equal shot at doing the same.

Because for all our national pride in our puritanical work ethic, the ethic doesn’t apply evenly. At the highest income levels, wealthy Americans are making money passively, through investments and inheritances, and doing little of what most would consider “work.” Basic subsistence may soon be predicated on whether and how much a poor person works, while the rich count on tax credits and carve-outs designed to protect stockpiles of wealth created by money begetting itself. It’s the poor who are expected to work the hardest to prove that they are worthy of Americanness, or a helping hand, or humanity. At the same time, we idolize and imitate the rich. If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.

* * *

Trump has a long history of antipathy to the poor, a word which he uses as a synonym for “welfare,” which he understands only as a pejorative. When he and his father were sued by the Department of Justice in 1973 for discriminating against black tenants in their real estate business, he shot back that he was being forced to rent to “welfare recipients.” Nearly 40 years later, he called President Obama “our Welfare & Food Stamp President,” saying he “doesn’t believe in work.” He wrote in his 2011 book Time To Get Tough, “There’s nothing ‘compassionate’ about allowing welfare dependency to be passed from generation to generation.”

Perhaps. But Trump certainly knows about relying on things passed from generation to generation. His self-styled origin story is that he got his start with a “small” $1 million loan from his real estate tycoon father, Fred C. Trump, which he used to grow his own empire. “I built what I built myself,” he has claimed. “I did it by working long hours, and working hard and working smart.”

It’s an interesting interpretation of “myself”: A New York Times investigation in October reported that, instead, Trump has received at least $413 million from his father’s businesses over the course of his life. “By age 3, Mr. Trump was earning $200,000 a year in today’s dollars from his father’s empire. He was a millionaire by age 8. By the time he was 17, his father had given him part ownership of a 52-unit apartment building,” reporters David Barstow, Susanne Craig, and Russ Buettner wrote. “Soon after Mr. Trump graduated from college, he was receiving the equivalent of $1 million a year from his father. The money increased with the years, to more than $5 million annually in his 40s and 50s.” The Times found 295 different streams of revenue Fred created to enrich his son — loans that weren’t repaid, three trust funds, shares in partnerships, lump-sum gifts — much of it further inflated by reducing how much went to the government. Donald and his siblings helped their parents dodge taxes with sham corporations, improper deductions, and undervalued assets, evading levies on gifts and inheritances.

Even the money that was earned fairly owed a debt to the government. Fred Trump nimbly rode the rising wave of federal spending on housing that began with the New Deal and continued with the G.I. Bill. “Fred Trump would become a millionaire many times over by making himself one of the nation’s largest recipients of cheap government-backed building loans,” the Times reported. Donald carried on this tradition of milking government subsidies to accumulate fortunes. He obtained at least $885 million in perfectly legal grants, subsidies, and tax breaks from New York to build his real estate business.

Someone could have taken this largesse and worked hard to grow it into something more, but Donald Trump was not that someone. Much of his fortune comes not from the down and dirty work of running businesses, but from slapping his name on everything from golf courses to steaks. Many of these deals entail merely licensing his name while a developer actually runs things. And as president, he still doesn’t seem inclined to clock much time doing actual work.

That hasn’t stopped him from putting work at the center of his administration’s poverty-related policies. In its lengthy report, the White House Council of Economic Advisers argued for adding work requirements to a new universe of public benefits. These requirements, which until the Trump administration existed only for direct cash assistance and food stamps, require a recipient not just to put in a certain number of hours at a job or some other qualifying activity, but to amass paperwork to prove those hours each month. The CEA report is focused, supposedly, on “the importance and dignity of work.” But the benefits of engaging in labor are only deemed important for a particular population: “welfare recipients who society expects to work.” Over and over, it takes for granted that our country only expects the poorest to work in order to prove themselves worthy of government funds, specifically targeting those who get food stamps to feed their families, housing assistance to keep roofs over their heads, and Medicaid to stay healthy.

* * *

The report doesn’t just represent an ethos in the administration; it was also a justification for concrete actions it had already taken and more it would soon roll out. Last April, Trump signed an executive order directing federal agencies to review public assistance programs to see if they could impose work requirements unilaterally to “ensure that they are consistent with principles that are central to the American spirit — work, free enterprise, and safeguarding human and economic resources,” as the document states, while also “reserving public assistance programs for those who are truly in need.”

The administration has also pushed forward on its own. In 2017, it announced that states could apply for waivers that would allow them to implement work requirements in Medicaid for the first time, and so far more than a dozen states have taken it up on the offer, with Arkansas’s rule in effect since June 2018. (It has now been halted by a federal judge.) In that state, Medicaid recipients had to spend 80 hours a month at work, school, or volunteering, and report those activities to the government in order to keep getting health insurance. And in April 2018, Housing and Urban Development Secretary Ben Carson unveiled a proposal to let housing authorities implement work requirements for public housing residents and rental assistance recipients. Trump pushed Congress to include more stringent work requirements in the food stamp program as it debated the most recent farm bill, arguing it would “get America back to work.” When that effort failed, the Agriculture Department turned around and proposed a rule to impose the requirements by itself.

These aren’t fiscal necessities — they’re crackdowns on the poor, justified by the idea that the poor should prove themselves worthy of the benefits that help them survive. The requirements are not just cruel but out of step with real life. Most people who turn to public programs already work, and those who don’t often have good reason. More than 60 percent of people on Medicaid are working. They remain on Medicaid because their pay isn’t enough to keep them out of poverty, and many of the low-wage jobs they work don’t offer health insurance they can afford. Of those not working, most either have a physical impairment or conflicting responsibilities like school or caregiving.

Enrollment in food stamps tells the same story. Among the “work-capable” adults on food stamps, about two thirds work at some point during the year, while 84 percent live in a household where someone works. But low-wage work is often chaotic and unpredictable. Recipients are more likely to turn to food stamps during a spell of unemployment or too few hours, then stop when they resume steadier employment. Many of those who are supposedly capable of work but don’t have a job have a health barrier or live with someone who has one; they’re in school, they’re caring for family, or they just can’t find work in their community.

Work requirements, then, fail to account for the reality of poor people’s lives. It’s not that there’s a widespread lack of work ethic among people who earn the least, but that there’s a lack of steady pay and consistent opportunities that allow someone to sustain herself and her family without assistance. We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.

If this country were so concerned about helping people who might face barriers to working get jobs, we might not be the second-lowest among OECD member countries by percentage of GDP spent on labor-market programs like job-search assistance or retraining. The poor in particular face barriers like a lack of affordable childcare and reliable transportation, and could use education or training to reach for better-paid, more meaningful work. But we do little to extend these supports. Instead, we chastise them for not pulling on their frayed bootstraps hard enough.

We also seem content with the notion that a person who doesn’t work — either out of inability or refusal — doesn’t deserve the building blocks of staying alive. The programs Trump is targeting, after all, are about basic needs: housing to stay safe from the elements, food to keep from going hungry, healthcare to receive treatment and avoid dying of neglect. Even if it were true that there was a horde of poor people refusing to work, do we want to condemn them to starvation and likely death? In one of the world’s richest countries, do we really balk at spending money on keeping our people — even lazy ones — alive?

Plenty of other countries don’t take this approach. Single mothers experience higher rates of destitution than coupled parents or people without children all over the world. But the higher poverty rate in the U.S. as compared to other developed countries isn’t because we have more single mothers; it’s because we do so little to help them. Compare us to Denmark, which gives parents unconditional cash benefits for each of their children regardless of whether or how much they work, on top of generously subsidizing childcare, offering universal health coverage, and guaranteeing paid leave. It’s no coincidence that Denmark also has a lower poverty rate, both generally and for single mothers specifically. A recent examination of poverty across countries found that children are at higher risk in the U.S. because we have a sparse social safety net that’s so closely tied to demanding that people work. It makes us an international outlier, the world’s miser that only opens a clenched fist to the poor if they’re willing to demonstrate their worthiness first.

Here, too, America’s history of slavery and ongoing racism rears its head. According to a trio of renowned economists, we don’t have a European-style social safety net because “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” White people turn against funding public benefit programs when they feel their racial status threatened, particularly benefits they (falsely) believe mainly accrue to black people. The black poor are seen as the most undeserving of help and most in need of proving their worthiness to get it. States with larger percentages of black residents, for example, focus less on TANF’s goal of providing cash to the needy and have stingier benefits with higher hurdles to enrollment.

* * *

The CEA’s report on work requirements claimed that being an adult who doesn’t work is particularly prevalent among “those living in low-income households.” But that’s debatable. The more income someone has, the less likely he is to be getting it from wages. In 2012, those earning less than $25,000 a year made nearly three quarters of that money from a job. Those making more than $10 million, on the other hand, made about half of their money from capital gains — in other words, returns on investments. The bottom half of the country averages just $826 each in income from capital investments; the average for those in the top 1 percent is more than $16 million.

The richest are the least likely to have their money come from hard labor — yet there’s no moral panic over whether they’re coddled or lacking in self-reliance. Instead, government benefits help the rich protect and grow idle wealth. Capital gains and dividends are taxed at a lower rate than regular salaried income. Inheritances were taxed at an average rate of 4 percent in 2009, compared to the average rate of 18 percent for money earned by working and saving. When investments are bequeathed, the recipient owes no taxes on any asset appreciation.


In fact, government tax benefits that increase people’s take-home money at the expense of what the government collects for its own coffers overwhelmingly benefit the rich over the poor (or even the middle class). More than 60 percent of the roughly $900 billion in annual tax expenditures goes to the richest 20 percent of American families. That figure dwarfs what the government expends on many public benefit programs. The government spends more than three times as much on tax subsidies for homeowners, mostly captured by the well-to-do, as it does on rental assistance for the poor. The three benefit programs the Trump administration is concerned with — Medicaid, food stamps, and housing assistance — come to about $705 billion in combined spending.

While the administration has been concerned with what it can do to compel the poor to work, it’s handed out more largesse to the idle rich. Its signature tax-cut package, the Tax Cuts and Jobs Act, offered an extra cut for so-called “pass-through” businesses, like law or real estate firms. But the fine print included a wrinkle: If someone is considered actively involved in his pass-through business, only 30 percent of his earnings can qualify for the new discount. If someone is passively involved, however — a shareholder who doesn’t do much of the day-to-day work of the company — then he gets 100 percent of the new benefit.

Then there’s the law’s significant lowering of the estate tax. The tax is levied on only the biggest, most valuable inheritances passed down from wealthy parent to newly wealthy child. Before the Republicans’ tax bill, only the richest 0.2 percent of estates had to pay the tax when fortunes changed hands. Now it’s just the richest 0.1 percent, or a mere 1,800 very wealthy families worth more than $22 million. The rest get to pass money to their heirs tax-free. Those who do pay it will be paying less when tax time comes due — $4.4 million less, to be exact.

Despite the Republican rhetoric that lowering the estate tax is about saving family farms, it’s really about allowing an aristocracy to calcify — one in which rich parents ensure their children are rich before they lift a single finger in work. As those heirs receive their fortunes, they also receive the blessing that comes with riches: the halo of success and, therefore, deservedness without having to work to prove it. Yet there’s evidence that increasing taxes on inheritances has the potentially salutary effect of getting heirs to work more. The more their inheritances are taxed, the more they end up paying in labor taxes — evidence that they’re working harder for their livings, not just coasting on generational wealth. Perhaps our tax code could encourage rich heirs to experience the dignity of work.

* * *

Trump’s CEA report is accurate about at least one thing: Our country has a history of only offering public benefits to the poor either deemed worthy through their work or exempt through old age or disability. An outlier was the Aid to Families with Dependent Children program, which became Temporary Assistance for Needy Families after Bill Clinton signed welfare reform into law in the ’90s. But the 1996 transformation of the program took what was a promise of cash for poor mothers and changed it into an obstacle course of proving a mother’s worth before she can get anywhere close to a check. It paved the way for the current administration’s obsession with work requirements.

Largesse for the rich, on the other hand, has rarely included such tests. No one has been made to pee in a cup for tax breaks on their mortgages, which cost as much as the food stamp program but overwhelmingly benefit families that earn more than $100,000. No one has had to prove a certain number of work hours to get a lower tax rate on investment income or an inheritance. They get that discount on their money without having to do any work at all.

We haven’t always been so extreme in our dichotomous treatment of the rich and poor; throughout the 1940s, ’50s, and ’60s, we coupled high marginal taxes on the wealthy with a minimum wage that ensured that people who put in full-time work could rise out of poverty. The estate tax has been as high as 77 percent. As Dutch historian Rutger Bregman recently told an audience of the ultrawealthy at Davos, we’re living proof that high taxes can spread shared prosperity. “The United States, that’s where it has actually worked, in the 1950s, during Republican President Eisenhower,” he pointed out. “This is not rocket science.” It was during the same era that we also created significant anti-poverty programs such as Social Security, Medicare, and Medicaid. In fact, this country pioneered the idea of progressive taxation and has always had some form of tax on inheritance to avoid creating an aristocracy. But we’ve papered over that history as tax rates have cratered and poverty has climbed.

Instead, as Reaganomics and neoliberal ideas took hold of our politics, we turned back to the Horatio Alger myth that success is attained on an individual basis by hard work alone, and that riches are the proof of a dogged drive. Lower tax rates naturally follow under the theory that the rich should keep more of their deserved bounty. And if you’re poor, coming to the government seeking a helping hand up, you failed.

The country is due for a reckoning with our obsession with work. There are certainly financial and emotional benefits that come from having a job. But why are we only concerned with whether the poor reap those benefits? Is working ourselves to the bone the best signifier of our worth — and are there basic elements of life that we should guarantee regardless of work? It doesn’t mean dropping all emphasis on work ethic. But it does require a deeper examination of who we expect to work — and why.

* * *

Bryce Covert is an independent journalist writing about the economy and a contributing op-ed writer at The New York Times.

Editor: Michelle Weber
Fact checker: Ethan Chiel
Copy editor: Jacob Z. Gross   

Orwell’s Last Neighborhood

Barnhill on the Isle of Jura, Scotland. (David Brown)

David Brown | The American Scholar | April 2019 | 23 minutes (5,796 words)

It’s hard to know what would be a good place from which to imagine a future of bad smells and no privacy, deceit and propaganda, poverty and torture. Does a writer need to live in misery and ugliness to conjure up a dystopia?

Apparently not.

We’d been walking more than an hour. The road was two tracks of pebbled dirt separated by a strip of grass. The land was treeless as prairie, with wildflowers and the seedless tops of last year’s grass smudging the new growth.

We rounded a curve and looked down a hillside to the sea. A half mile in the distance, far back from the water, was a white house with three dormer windows. Behind it, a stone wall cut a diagonal to the water like a seam stitching mismatched pieces of green velvet. Far to the right, a boat moved along the shore, its sail as bright as the house.

This was where George Orwell wrote Nineteen Eighty-Four. The house, called Barnhill, sits near the northern end of Jura, an island off Scotland’s west coast in the Inner Hebrides. It was June 2, sunny, short-sleeve warm, with the midges barely out, and couldn’t have been more beautiful.

Orwell lived here for parts of the last three years of his life. He left periodically (mostly in the winter) to do journalism in London and, for seven months in 1947 and 1948, to undergo treatment for pulmonary tuberculosis. Although he rented Barnhill and didn’t own it, he put in fruit trees and a garden, built a chicken house, bought a truck and a boat, and invested numberless hours of labor in what he believed would be his permanent home. When he left it for the last time, in January 1949, he never again lived outside a sanatorium or hospital.

I came to Jura after a two-week backpacking trip across Scotland. My purpose was to drink single-malt on Islay, the island to the south, and enjoy two nights of indulgence at Ardlussa House, where Orwell’s landlord had lived. I was not on a literary pilgrimage. Barnhill is not open to the public, and no one among the island’s 235 residents remembers Orwell.

On Flooding: Drowning the Culture in Sameness

A 37-meter-long floating sculpture by U.S. artist Kaws in Victoria Harbor, Hong Kong, March 2019. (Imaginechina via AP Images)

Soraya Roberts | Longreads | March 2019 | 7 minutes (2,006 words)

In 1995, the Emmy nominees for Best Drama were Chicago Hope, ER, Law & Order, NYPD Blue, and The X-Files. In 1996, the Emmy nominees for Best Drama were Chicago Hope, ER, Law & Order, NYPD Blue, and The X-Files. In 1997, the Emmy nominees for Best Drama were Chicago Hope, ER, Law & Order, NYPD Blue, and The X-Files. That is: Two cop shows set in New York, two medical shows set in Chicago, and some aliens, spread across four networks, represented the height and breadth of the art form for three years running.

I literally just copied that entire first paragraph from a Deadspin article written by Sean T. Collins. It appeared last week, when every site seemed to be writing about Netflix. His was the best piece. Somehow, within that flood of Netflix content, everyone found that article — it has almost 300,000 page views. I may as well have copied it for all the traffic my actual column — which was not about Netflix — got.

There was definitely a twang of why bother? while I was writing last week, just as there is every week. Why bother, and Jesus Christ, why am I not faster? The web once made something of a biblical promise to give all of us a voice, but in the ensuing flood — and the ensuing floods after that — only a few bobbed to the top. With increased diversity, this hasn’t changed — there are more diverse voices, but the same ones float up each time. There remains a tension that critics, and the larger media, must balance, reflecting what’s in the culture in all its repetitive glory while also nudging it toward the future. But we are repeatedly failing at this by repeatedly drowning ourselves in the first part. This is flooding (a term I just coined, so I would know): the practice of unleashing a mass torrent of the same stories by the same storytellers at the same time, making it almost impossible for anyone but the same select few to rise to the surface.

Twitter Won’t Miss You: A Digital Detox Reading List (and Roadmap)

Follow the crowds to a world with less screen time. (Photo by davity dave via Flickr, CC BY-SA 2.0)

Sara Benincasa is a quadruple threat: she writes, she acts, she’s funny, and she has truly exceptional hair. She also reads, a lot, and joins us to share some of her favorite stories. 

Have you ever needed a break, but just not known from what? Everything seems fine…ish. Your job is OK, your friendships are all right, your health is decent, nothing dramatic to report. And yet, you’re stressed. Dissatisfied. Bored. Sometimes you even feel exhausted and overwhelmed. Maybe you should distract yourself by looking at Instagram. Maybe you should find someone with whom to argue on Twitter. Maybe you should see what your ex is up to on Snapchat.

Or maybe you should get the hell off social media for a while.

At least, that’s the prescription issued by an increasingly vocal crowd of psychiatrists, psychologists, sociologists, writers, philosophers, performers, and general opinion-havers. The common term is “digital detox,” whereby an individual commits to a cessation of specific actions on their internet-enabled devices for a finite period of time. One can go on this adventure with friends, family, or a like-minded group of strangers from, you guessed it, the internet.

I’ve been an enthusiastic and sometimes addicted social media user since approximately 2003. But after beginning my research for this column, I went on a digital detox of my own. It is small and manageable, and nothing so impressive as author Cal Newport’s suggested 30-day detox from all nonessential online functions. But it has improved my life already in measurable ways. Here are some writers whose approaches to their own vacations from the Matrix helped me shape mine.

1. “Unplugged: What I Learned By Logging Off and Reading 12 Books in a Week.” (Lois Beckett, The Guardian, December 2018)

Beckett nabbed what must’ve been the plum journalistic gig of the year: head to a tiny cabin in the foothills of the Sierra Nevada, and read. Books. Made of paper. “This was a perfect assignment,” she writes. “For journalists on many beats — including mine, which includes the far right and gun policy — it had been a year of escalating violence during which conspiracy theories had moved into the mainstream.” And off she went, blissfully unencumbered by wifi. She brought a stack of critically acclaimed books purchased at different independent bookshops and a plan to read 30 books in a week, a number that sounds patently insane to me. She read 12. I’m still impressed — and envious.

The ensuing story is littered with gentle shade, which I always appreciate, and she’s a damn good writer: “I was not going to finish all 30 books at any cost, skimming to the right section of the right chapter in order to say one smart thing — in the U.S., we call this skill a ‘liberal arts education’ — but instead wanted the books’ authors and their protagonists to collide and argue with each other, to give me some different understanding of what had happened in 2018.”

2. “#Unplug: Baratunde Thurston Left The Internet For 25 Days, And You Should, Too.” (Baratunde R. Thurston, Fast Company, June 2013)

I adore my longtime friend Baratunde, though perhaps not as much as my mother, who has met the man twice and still has a copy of his 2013 Fast Company cover story somewhere in her house. He’s a great human.

And now that we’ve established my utter lack of objectivity, let’s hear from his 2013 self: “I’m an author, consultant, speechifier, and cross-platform opiner on the digital life. My friends say I’m the most connected man in the world. And in 2012, I lived like a man running for president of the United States, planet Earth, and the Internet all at once.” That very accurate description is exactly why it was so interesting that Baratunde Rafiq Thurston, of all freaking people, did a digital detox.

At the time, I remember worrying that he might burn out or possibly just suddenly up and die due to lack of sleep, so it was clearly a good move. I can’t imagine replicating what he did (no email?!), but since he was self-employed with a personal assistant and has an incredible amount of willpower, he was able to pull it off. His nine-point digital detox preparation checklist is incredibly helpful, and I intend to use it the next time I do one. My favorite line? “She transmitted this data by writing down the names on a piece of paper.” And yes, he was happier and healthier by the end of the experience. To this day, he goes on regular social media vacations, and I believe he’d tell you his life is better for it.

3. “Quit Social Media. Your Career May Depend On It.” (Cal Newport, New York Times, November 2016)

“I’m a millennial computer scientist who also writes books and runs a blog,” Newport writes. “Demographically speaking I should be a heavy social media user, but that is not the case. I’ve never had a social media account.” Newport lays out in plain, accessible language the notion that social media distracts from good work because it is designed to be addictive. It’s a notion with which I agree, based in no small part on my own lived experience; I have no doubt my writing output has suffered as I’ve devoted more and more time to social media. As Newport writes, “It diverts your time and attention away from producing work that matters and toward convincing the world that you matter.”

4. “Cal Newport on Why We’ll Look Back at Our Smartphones Like Cigarettes.” (Clay Skipper, GQ, January 2019)

Fast forward two and a half years. Newport, by now an in-demand speaker and author of two books — the latest is Digital Minimalism: Choosing a Focused Life in a Noisy World — expands on his November 2016 Op-Ed. Newport is a reluctant self-help guru who would undoubtedly reject that label. In this interview (as in the one I heard with him on fellow PoB (Pal of Baratunde) Lewis Howes’s podcast “The School of Greatness”), Newport stresses that he doesn’t typically offer a program or prescription. However, his recommendation for a 30-day digital detox seems simple in concept and necessarily jarring to execute: one dispenses with all digital products that are unnecessary to one’s career and personal health. Check your work email and log into your bank app to ensure a direct deposit has gone through, but let Facebook, Twitter, and Instagram accounts lie fallow for 30 days. Skipper is an able interviewer and Newport is a clear, experienced, and intelligent interviewee.

5. “I Quit Social Media for 65 Weeks. This Is What I Learned.” (Kareem Yasin, Healthline, February 2018)

Yasin interviews David Mohammadi, who left social media for over a year and loved the experience. A newly minted New Yorker, he abandoned the online pseudo-friendship industrial complex because he was worried he’d obsess over what was happening back in San Francisco. And he had good reason to suspect he’d be homesick — he’d tried the East Coast thing once, been endlessly captivated by his Bay Area friends’ Facebook updates, and ended up moving back to San Francisco. Years later, a more mature Mohammadi quit his job and decided to start a new career in New York with a clear mind unclouded by social media-induced FOMO. You likely won’t be surprised to hear his take: “The first week was hard. The second week was nice. And as I got closer to the end date, I just was like: ‘Wow. It feels great to be so present, and not just on my phone.’” But the benefits didn’t just extend to mental health — he made more money, too! Yasin writes, “Working as a boutique manager, [Mohammadi] noticed how his coworkers would constantly check their phones. Those two-minute breaks from the real world robbed them of opportunities to get more commissions — opportunities that would be theirs if they would just look up and notice the customers.”

* * *

Like you, probably, I have a personal Instagram account. Except it isn’t personal, really — with 14,200 followers, it is ostensibly a way to cultivate and grow an online brand based on me, myself, and I. I write essays and books; I do comedy shows; I lecture on mental health awareness at colleges; I pop up as a talking head in various capacities in various venues. Like you, probably, I want to be seen as an attractive person, so sometimes I use filters or put on more makeup than is absolutely necessary for a selfie. Like you, probably, I want to be seen as a capable person worthy of being hired, so I do my best to seem witty and fun but chill, man. Given that I want to write more for television and that a lot of my work falls under the category of “entertainment,” I have followed the conventional thinking in my industry, which boils down to “Always be selling (yourself).”

This thinking extends to my “personal” Twitter account (77,400 followers), despite my many qualms about the ethics of its overseers with regard to threats and harassment. It extended to my Facebook fan page, until I quit Facebook altogether because I don’t care what my least-favorite racist relative ate for breakfast — if I want to know what’s up with a boring person from high school, I’ll make private inquiries. When the current Russian government really loves something, I have to ask myself if I need that something in my life. (Note: I am aware that Facebook owns Instagram, and that I’m a hypocrite sometimes.)

Then there’s the Instagram account for my podcast (679 followers) and the Twitter account for my podcast (457 followers) and the Instagram account for my progressive lady-coat art project (26,200 followers). I don’t use Snapchat, because once I joined for 24 hours and my drunk friend sent me a dick pic framed by monogrammed his-and-hers towels in the master bathroom he shares with his girlfriend; I’m a Scorpio, and pseudoscience and common sense immediately told me the power of the Snap was too great for my personal constitution to handle. I also recently joined a few dating apps. And that led to more swiping, more clicking, more texting, more aggravation of writing-induced carpal tunnel issues. When an ex-NFL star asked me on what I’m sure would have been a super safe and not-gross date to his house at 3 a.m., I decided that Tinder was also too much for me.

At this point, and considering my sore wrists, the signals seemed to say, “SARA. TAKE SOME TIME OFF THE SOCIAL MEDIA.” I had 104,000 followers across social media, some of whom were double or triple followers and some of whom were robots, and while I loved each of them like my very own imaginary baby, Mommy needed a vacay.

First, I enabled the Screen Time function on my phone and discovered that I use it, on average, over seven hours a day. This horrifying fact led me to design the parameters of my moderate digital detox: I’d continue to use my email for work and social reasons. I would continue to use Twitter, but only to share my work or the work of a friend or charity. I would post a note announcing that I was taking an Instagram break until April 9, the day the second season of my podcast debuts, to give both a heads up to any former professional athletes that I wouldn’t be interacting with them there and to announce the premiere date. I would text when I felt like it, but leave my phone facing down when I wasn’t using it. I would remove Instagram from my phone, just as I’d done with Twitter months prior. At night and during my daily meditation practice, I would put the phone on airplane mode.

Following those simple rules, and only occasionally breaking them, I managed to reduce my phone time by 10 percent in the first week. I resumed the regular at-home yoga practice I’d attempted a month prior. I finished the outline of an hour-long TV drama pilot. I went on actual face-to-face dates with humans during daylight and appropriate evening hours. I visited with two friends. I got the “annual” physical I’d put off for two years. And I wrote this column.

While I intend to resume using Instagram on April 9, I will do as Cal Newport recommends: use social media like a professional, for specific purposes, and do not stray from said purposes. Twitter and Instagram will remain places for me to share my work and the work of friends and charities I admire. Sometimes, I will use these places to discover great writing, music, and more. Moving forward, I want to reduce my screen time by 10 percent each week until I average under four hours per day on my phone — and then I’ll try to reduce it even more.

I’m pleased with my progress. It may seem meager, but it’s a start. And I feel better already. So if you’ve considered quitting social media but have some qualms, do what I did: start small. Pop your head above the churning surface of our wild, untrammeled internet, and take a look around. Stay awhile. Your eyes will grow accustomed to real sunlight soon enough, and it’ll be easier to breathe. It’s pretty nice up here.

* * *

Sara Benincasa is a stand-up comedian, actress, college speaker on mental health awareness, and the author of Real Artists Have Day Jobs, DC Trip, Great, and Agorafabulous!: Dispatches From My Bedroom. She also wrote a very silly joke book called Tim Kaine Is Your Nice Dad. Recent roles include “Corporate” on Comedy Central, “Bill Nye Saves The World” on Netflix, “The Jim Gaffigan Show” on TV Land, and the critically acclaimed short film “The Focus Group,” which she also wrote. She also hosts the podcast “Where Ya From?”

Editor: Michelle Weber

Queens of Infamy: Josephine Bonaparte, from Martinique to Merveilleuse

Illustration by Louise Pomeroy

Anne Thériault | Longreads | March 2019 | 22 minutes (5,569 words)

From the notorious to the half-forgotten, Queens of Infamy, a Longreads series by Anne Thériault, focuses on badass world-historical women of centuries past.

* * *


In 1768, a 15-year-old girl traveled to the hills near her family home in Martinique to visit a local wise woman. Desperately curious to know what her future held, the girl handed a few coins to the Afro-Caribbean obeah, Euphémie David, in exchange for a palm reading. Euphémie obligingly delivered an impressive-sounding prediction: the girl would marry twice — first, unhappily, to a family connection in France, and later to a “dark man of little fortune.” This second husband would achieve undreamed of glory and triumph, rendering her “greater than a queen.” But before the girl had time to gloat over her thrilling fate, Euphémie delivered a parting blow: in spite of her incredible success, the girl would die miserable, filled with regret, pining for the “easy, pleasant life” of her childhood. This prophecy would stay with the girl for the rest of her life, and she would think of it often — sometimes with fervent hope, sometimes with despair, always with unwavering belief that it would come true.

That girl was the future Empress Josephine Bonaparte. Everything Euphémie predicted would come to pass, but young Josephine could not have imagined the events that would propel her to her zenith: the rise through Paris society, the cataclysm of the French Revolution, the brutal imprisonment during the Reign of Terror, the transformation into an infamous Merveilleuse, the pivotal dinner at her lover’s house where she would meet her second husband.

She wouldn’t even have recognized the name Josephine — that sobriquet would be bestowed by Napoleon some 18 years hence. The wide-eyed teenager who asked Euphémie to tell her fortune still went by her childhood nickname, Yeyette.


The Good Bad Wives of Ozark and House of Cards

Illustration by Zoë van Dijk

Sara Fredman | Longreads | March 2019 | 11 minutes (3,057 words)

 

What makes an antihero show work? In this Longreads series, It’s Not Easy Being Mean, Sara Fredman explores the fine-tuning that goes into writing a bad guy we can root for, and asks whether the same rules apply to women.

 
The antihero shows of the early aughts relied on wives as antagonists. A wife became another hurdle to leap in her husband’s quest to run a criminal organization/become an undisputed drug king/sleep with whomever he wanted in an attempt to outrun his past. More recently, however, there have been shows that seem to push back on the impulse to pit husbands and wives against each other. What if, they asked, an antihero and his wife were partners instead of rivals? But in giving their wives a promotion of sorts, shows like Ozark and House of Cards also open the door to female ambition, which can become as problematic for fictional women as it has been for their real-life counterparts.

The first post in this series, “The Blaming of the Shrew,” discusses Breaking Bad’s Skyler White, among other TV wives.

The first episode of Netflix’s Ozark follows the antihero script to a T. It’s all there: a talented main character, a sad backstory to which we slowly become privy, and foils more villainous by several degrees designed to make our main guy look good in comparison. That guy is Marty Byrde, a financial planner who launders money for the second largest drug cartel in Mexico. His wife Wendy is initially set up as the Skyler to Marty’s Walt: her days are all Costco, Zumba, and cheating on her husband with a man she calls “Sugarwood.” Soon after we meet Marty, he fantasizes about an encounter with a prostitute that gets oddly specific about his life: “Let me guess, your wife won’t do what you want her to do. If you were my man, working all day so I could stay at home — which, uh, let’s face it, it was a bitch when they were little but now they’re both teens and in school all day … not only would I not cheat on you, I’d let you do anything you wanted.” This is the kind of interiority, indispensable to the antihero genre, that lets us know that Marty is doing everything right despite Wendy’s worst efforts.

But when Marty is forced to move his family to southern Missouri and launder $8 million to save them from the cartel, Wendy shifts from antagonist to helpmate. She isn’t excited about the plan but, unlike antihero wives of yore, she hasn’t been kept in the dark about Marty’s criminality and she willingly presents a united front to their children and the FBI. The important thing, Wendy and Marty agree, is that the family stays together and safe, and they’re prepared to do anything to keep it that way. Family as sacrosanct, as the highest good, is a theme of this show. Versions of “I did it for our family” are repeated like a mantra throughout the series. Marty and Wendy both use it as a rhetorical justification and also as a kind of mystical prayer meant to insulate them from their own internal critics.

Ozark offers us an antihero team but finds a different way to humanize a flawed man, with a wife so helpful that she eclipses the antihero himself.

By season two, however, the family becomes a battleground, with Marty and Wendy developing a low-grade rivalry. They operate less as a team than as dueling pianos, each taking turns making decisions “for the good of the family” without consulting the other. It turns out that Wendy has her own expertise to contribute from her years working in Chicago politics, which makes their partnership more equal but also more fraught, and the show’s almost pathological focus on the family becomes yet another way to make an antagonist out of a wife. Ozark’s initial bait and switch turns Wendy from an antagonist into a helpmate who recognizes the necessity of her husband’s infelicities, but a more cunning reversal has Marty become the one to stand in opposition to the show’s plotline. The final episodes of season two see him preparing an escape plan for his family only to be thwarted by his wife, who makes the unilateral decision that they will stay. It’s not clear when Wendy makes the decision because she doesn’t get the kind of interiority that Marty does — only long, meaningful looks out onto the horizon. Naturally, she frames the decision as the best thing for their family. But the show’s writers have already given Marty the insight that this kind of rationalization, the very premise of the show, has been undermined: “We’re not fit to be parents. It’s not even a family, it’s a goddamn group of criminals.”

Explaining her decision to stay in the Ozarks, in danger, in criminality, Wendy says: “This is who I am, and this is who I want to be.” Marty was only ever portrayed as a reluctant criminal, a serf in service to his family. Wendy’s first-person declaration is ambition, which we should know by now isn’t usually a good look on a woman. Ozark offers us an antihero team but finds a different way to humanize a flawed man, with a wife so helpful that she eclipses the antihero himself. It turns Marty into the hero who wants to save his family, if only his wife would let him.

* * *

Wendy Byrde isn’t the first wife of an antihero to have higher aspirations. House of Cards was always a show about two people with naked ambition. Frank and Claire Underwood didn’t have any children so their nefarious deeds were never in the service of providing for, or saving, anyone other than themselves. They wanted power and they were going to get it as a team. Until they weren’t. Things start to unravel at the end of season three. Frank walks into the Oval Office to find Claire sitting behind the desk: “Look at us, Francis, we used to make each other stronger, or at least I thought so, but that was a lie. We were making you strong and now I’m just weak and small and I can’t stand that feeling any longer.” House of Cards could be extremely woke about power and gender. More than any other antihero show, it seemed to be aware of the conventions of its genre and what those conventions meant for women. What family is for Ozark, power was for House of Cards, and it recognized what it meant to want power as a man and as a woman, that there was a difference between the two. The show could also be extremely meta, especially the final season, in which lines like “Are you telling me she knew nothing of what he was up to?” and “Are you even capable of defining her on her own terms?” could be talking about the characters, the actors who play them, or the tropes they were called on to embody for six seasons.

If nothing else, the power struggles between the two Underwoods over the course of the series can help us see how the roles of antihero’s wife and politician’s wife overlap. Both kinds of wives are at once essential to their husbands’ stories and outside of them. They are tasked with humanizing the men with whom they partner, but it is understood that the partnership is premised on a withholding of their own humanity; their story must remain the B plot. So when House of Cards suddenly found itself an antihero show without an antihero, you would think the solution would have been simple since, as it turned out, Claire’s ambition was to become a main character.

And an antihero marriage, like a political campaign, does not easily accommodate a woman at the top of the ticket.

Claire’s struggle to move beyond the helpmate/antagonist paradigm of her foremothers and become the antihero of her show is a major plot point of the show’s later seasons. The season four finale has Claire and Frank look at the camera together, her first fourth wall break. This is Frank’s signature move so there is reason to believe that Claire is finally gaining the strength she craves. And, indeed, season five in many ways seemed to be about setting the stage for Claire to eclipse her husband. This is signified, in the show’s mallet-to-the-head way, by Frank’s fascination with the app that turns his face into Claire’s and back again. But there continues to be friction: “We have one rule Francis,” Claire rails, “I cannot be your ally if I don’t know what you’re thinking … You should have talked to me instead of making a last-minute decision like this.” Frank has just let her in on his plan to resign the presidency and make Claire the leader of the free world. You would think Claire would be pleased with this turn of events but she knows, as we do from Ozark, that where you are matters less than who made the decision to put you there.


Just as concepts like “leader” and “free world” don’t mean quite what they used to, so too Frank has emptied out the presidency of its power before handing it over to Claire. “Where does the real power lie? The power behind the power?” he asks. The answer is the private sector, the existence of which the supposedly brilliant politician Frank Underwood is apparently just learning. Exaggerated eyerolls aside, this show is one in which a woman finally gets her hands on some agency, only to discover that the rules have changed and she’s not holding anything at all. “It’s no longer about who lives in the White House,” Frank’s civic lesson continues, “it’s about who owns the White House … the real power isn’t here.” And when he says, “I wanted you to be the president, I’ve made you the president,” Claire realizes that for an antihero and his wife, there is no such thing as equal partnership. And an antihero marriage, like a political campaign, does not easily accommodate a woman at the top of the ticket. While wives may humanize presidents and antiheroes alike, for this wife at least, a husband is only a liability.

So season five ends with Claire ignoring Frank’s calls about a presidential pardon and turning to the camera to declare: “my turn.” This could have become more true than anyone had planned once the allegations against Kevin Spacey became public and Netflix cut ties with him when the show was already into production on season six. But the show wasn’t prepared to become a female antihero show. Frank had already told us that “If she doesn’t pardon me, I’ll kill her,” and season six was supposed to be a showdown between the two Underwoods. Instead of coming up with a new story line, we get Frank by proxy. Unable to use his face or his voice, the show’s writers turn Frank into a series of human horcruxes, transposing his malintent onto several new characters we are supposed to care about but don’t. Oh, and Doug. Poor loyal, murderous Doug, who is like if the legion of antihero fans sticking with Tony, Walt, and Don to the bitter end became one person with a weakness for sad brunettes. While the final season can identify the predicament of the antihero’s wife who yearns to break free — it begins with a reading of threatening tweets and other online content, including a contest for the most creative way to kill Claire — it never comes close to resolving it. Instead it centers on Frank’s absence. Claire spends most of her time as commander-in-chief trying to figure out how to distance herself from Frank’s crimes and escape Frank’s shadowy posthumous vendetta against her. She never gets a chance to be a president, or an antihero, on her own terms.

Even when Claire makes it to the Oval Office, she is only, as Frank tells her in their very last conversation, ‘the most powerful woman in the world.’

It wasn’t just the writers who couldn’t seem to let Frank go. In December of last year, Kevin Spacey, who had no qualms about using Frank’s face or voice, released a video in which he blurred the line between himself and the character he played for five seasons. Looking straight into the camera, he attempted to recreate the camaraderie with the audience that made his House of Cards character so unique and effective:

I know what you want. Oh sure, they may have tried to separate us but what we have is too strong, it’s too powerful. I mean after all, we shared everything, you and I. I told you my deepest darkest secrets. I showed you exactly what people are capable of. I shocked you with my honesty, but mostly I challenged you and made you think. And you trusted me even though you knew you shouldn’t.

This is Frank’s shtick of making us feel like we’re in on a secret while also implicating us in the violence necessary to keep it. Spacey’s inhabiting of his character as a response to the real-life allegations against him shines an unflattering light on the cultural power of the antihero, particularly our complicity in enabling bad behavior if the person is good enough at what they do. In taking his case to the public this way, Spacey was betting on the magnetism of the fictional Frank Underwood to insulate the real-life Kevin Spacey from the bad things he did, kind of like what must have happened during the first season of House of Cards, when he had only to participate in a “training process” after allegedly harassing someone on set, a training that does not seem to have had its desired effect. The sheer brazenness of the video, that it ends with a play for a Spacey-led House of Cards revival (“wait a minute, now that I think of it, you never actually saw me die, did you? Conclusions can be so deceiving”) and hit the internet on the very day that it was announced that he would be charged with indecent assault and battery, suggests that Spacey must have really believed that his character could save his career. The video has almost 250,000 likes, which isn’t enough to bring Frank Underwood back from the dead, but is yet another testament to the power of the male antihero — in this case the character and the man who plays him — to command adoration in spite of the destruction he leaves in his wake.

The Kevin Spacey/Frank Underwood mash-up video can’t help but point out that “all this presumption made for such an unsatisfying ending,” an opinion held by almost everyone. But what was it that made the final season so anticlimactic? Was it, as Kevin/Frank implied, the absence of its antihero? Was it because, as FX network president John Landgraf argued back in 2013, a female antihero just isn’t the same? Is the antihero genre, ultimately, a male one? Kind of. Like presidential politics, antihero shows have been built for men. Claire never got a clean break and she spent the final season fighting off the ghost of Frank. But even if she had, the show was never calibrated to make her its centerpiece. In an interview with the magazine Capitol File, Robin Wright recounts that the only note David Fincher gave her when she started on the show was to be still:

People were suggesting to base the character on Hillary Clinton or other strong women personas, and I didn’t want to do that. When we shot the first couple of scenes, David would come over to me and say, “Don’t move. Don’t move. Claire is a bust.”

Statues are memories of heroes, not the heroes themselves. House of Cards was built around Frank’s dynamism; Claire’s steely mystery could stoke or temper that dynamism but was meant to always exist alongside it. The show was about seeing Frank work and he kept us close, bringing us in and making us complicit. Even after Claire promises us that it’s going to be different (“I’m going to tell you the truth”), she keeps us at a distance. This is partially because the show wants to preserve the mystery of who killed Frank until the very end, but it’s also because that’s who Claire has always been: a stoic and a secret keeper. Instead of finding the right formula that would allow her to become the antihero she’s always wanted to be, the show shoehorns her into Frank’s.

* * *

In writing wives who don’t fit neatly into the antagonist/enabler binary of shows like Breaking Bad and The Sopranos, Ozark and House of Cards allow them to operate in the gray alongside their husbands. By bringing their wives into the fold instead of shutting them out, these shows get us thinking about what would have to be true for a woman to step into the role of an antihero herself. But while both give their wives more to do and the ability to exercise their own ambition, they ultimately handicap that ambition. Even when Claire makes it to the Oval Office, she is only, as Frank tells her in their very last conversation, “the most powerful woman in the world.” For the wife of an antihero, the glass ceiling is her husband. Perhaps Ozark will surprise us and turn Wendy into the show’s new antihero rather than an antagonist standing in the way of her family’s well-being, but season two hinted at the way a wife in control might go. Local drug lords Jacob and Darlene Snell are two of the more villainy foils who serve to humanize Marty and Wendy in season one. They initially operate as a well-oiled machine: when he asks for more lemonade, she knows it’s time to murder the man who launders their money through a strip club. But eventually, caught in a standoff with the cartel, the fissures appear. Darlene wants to keep fighting while Jacob wants to live in peace. “What do you do, Martin,” Jacob asks, “when the bride who took your breath away becomes the wife who makes you hold your breath in terror?” The show has already emphasized the parallels between the two couples: “What deals did you just make behind my back?” Darlene asks Jacob; “You made these plans without me?” Wendy demands of Marty. Darlene out-villains her husband, killing him before he can kill her, and the Snells’ storyline influences how we see Wendy’s season two arc. 
The lesson is that your helpmate can eventually become your killer and what is exciting and intoxicating in a man — quick thinking and smart, strategic maneuvering — is off-putting and unsettling in his wife.

Is there any hope for the wife of an antihero? Will we ever see a female antihero we can actually root for? Does having a family make a female antihero more effective, or less? Does Soviet Russia hold the key to one or all of these questions? Maybe! Tune in to the next installment on The Americans.
 

The first installment in this series: The Blaming of the Shrew

* * *

Sara Fredman is a writer and editor living in St. Louis. Her work has been featured in Longreads, The Rumpus, Tablet, and Lilith.

 

Editor: Cheri Lucas Rowlands
Illustrator: Zoë van Dijk

Uncertain Ground

Getty / Photo illustration by Katie Kosma

Grace Loh Prasad | Longreads | March 2019 | 16 minutes (4,021 words)

In early October, I noticed my Taiwanese and Chinese American friends posting photos of large family gatherings and moon cakes. Others posted photos of visiting the graves of family members. I felt a wave of panic and guilt. Had I missed Tomb Sweeping Day, when I should have been honoring my deceased parents? On the other hand, I remembered and looked forward to Dia de los Muertos, a holiday I hadn’t grown up with but learned about over more than 20 years of living in California. How could I feel such a strong affinity for a Mexican cultural tradition, while being so ignorant of the holidays observed by the Taiwanese and Chinese diaspora?

A quick Wikipedia search revealed that I had gotten my holidays mixed up. Mid-Autumn Festival celebrates the full moon at harvest time, with families reuniting for a traditional feast and moon cakes. Tomb Sweeping Day (Qing Ming) is one of several holidays to remember your ancestors, but it’s observed in spring. I could not remember which was which because my family did not really celebrate these holidays. Although I was born in Taiwan, I spent my early childhood in New Jersey, and then from fourth grade through high school graduation, we lived in Hong Kong.

We were a curious cultural hybrid: a family of Taiwanese origin living as American expatriates in a British territory where we resembled the local Chinese population, but did not speak the same language and had little in common with them. I attended an American school full of American and international students. One of the advantages of attending Hong Kong International School was that we got American, British and Chinese holidays off: Thanksgiving, the Queen’s Birthday and Lunar New Year.

I’m sure we learned about Mid-Autumn Festival and Qing Ming, but they weren’t as memorable as Lunar New Year, the biggest holiday of the year when everyone got a week off from school or work. Children and younger relatives received lai see (hong bao), red envelopes filled with spending money, and employees received their annual bonuses. I remember going with my parents to join the enormous crowds down in Causeway Bay, pushing for a spot close to the harbor to get the best view of the spectacular fireworks. Stores and restaurants tried to outdo each other with elaborate “Kung Hei Fat Choy” decorations and special menus and promotions. Everywhere you went, people were in a festive good mood.

Since we did not have any relatives in Hong Kong, there were no family obligations during Lunar New Year. It was only the four of us — my mom, dad, brother Ted and me — so at most we would go out for a fancy restaurant meal. We did not go from house to house with bottles of Johnny Walker or baskets of tangerines. We did not make hundreds of homemade dumplings or go to the bank to request a wad of crisp new bills to stuff into red envelopes for my younger cousins, nieces and nephews. My parents might have hung up modest decorations outside our apartment door, but I think it was just for show, so we would not appear strange to our neighbors.

Once I asked my parents why we didn’t do more to celebrate the Taiwanese and Chinese holidays. “Well,” my dad said, “it’s because we are Christian. From when we were little, we only celebrated Christmas and Easter. Your grandpa was very strict. We were forbidden from observing any of the non-Christian, Taiwanese traditions because that was considered superstitious.”

I was relieved that my ignorance was not my fault. But I still felt a void.

How the Guardian Went Digital

Newscast Limited via AP Images

Alan Rusbridger | Breaking News | Farrar, Straus and Giroux | November 2018 | 31 minutes (6,239 words)

 

In 1993 some journalists began to be dimly aware of something clunkily referred to as “the information superhighway,” but few had ever had reason to see it in action. At the start of 1995 only 491 newspapers were online worldwide; by June 1997 that had grown to some 3,600.

In the basement of the Guardian was a small team created by editor in chief Peter Preston — the Product Development Unit, or PDU. The inhabitants were young and enthusiastic. None of them were conventional journalists: I think the label might be “creatives.” Their job was to think of new things that would never occur to the largely middle-aged reporters and editors three floors up.

The team — eventually rebranding itself as the New Media Lab — started casting around for the next big thing. They decided it was the internet. The creatives had a PC actually capable of accessing the world wide web. They moved in hipper circles. And they started importing copies of a new magazine, Wired — the so-called Rolling Stone of technology — which had started publishing in San Francisco in 1993, along with the HotWired website. “Wired described the revolution,” it boasted. “HotWired was the revolution.” It was launched in the same month the Netscape team was beginning to assemble. Only 18 months later Netscape was worth billions of dollars. Things were moving that fast.

In time, the team in PDU made friends with three of the people associated with Wired: the founders, Louis Rossetto and Jane Metcalfe, and the columnist Nicholas Negroponte, who was based at the Massachusetts Institute of Technology and who wrote mindblowing columns predicting such preposterous things as wristwatches which would “migrate from a mere timepiece today to a mobile command-and-control center tomorrow . . . an all-in-one, wrist-mounted TV, computer, and telephone.”

As if.

Both Rossetto and Negroponte were, in their different ways, prophets. Rossetto was a hot booking for TV talk shows, where he would explain to baffled hosts what the information superhighway meant. He’d tell them how smart the internet was, and how ethical. Sure, it was a “dissonance amplifier.” But it was also a “driver of the discussion” towards the real. You couldn’t mask the truth in this new world, because someone out there would weigh in with equal force. Mass media was one-way communication. The guy with the antenna could broadcast to billions, with no feedback loop. He could dominate. But on the internet every voice was going to be equal to every other voice.

“Everything you know is wrong,” he liked to say. “If you have a preconceived idea of how the world works, you’d better reconsider it.”

Negroponte, 50-something, East Coast gravitas to Rossetto’s Californian drawl, was working on a book, Being Digital, and was equally passionate in his evangelism. His mantra was to explain the difference between atoms — which make up the physical artifacts of the past — and bits, which travel at the speed of light and would be the future. “We are so unprepared for the world of bits . . . We’re going to be forced to think differently about everything.”

I bought the drinks and listened.

Over dinner in a North London restaurant, Negroponte started with convergence — the melting of all boundaries between TV, newspapers, magazines, and the internet into a single media experience — and moved on to the death of copyright, possibly the nation state itself. There would be virtual reality, speech recognition, personal computers with inbuilt cameras, personalized news. The entire economic model of information was about to fall apart. The audience would pull rather than wait for old media to push things as at present. Information and entertainment would be on demand. Overly hierarchical and status-conscious societies would rapidly erode. Time as we knew it would become meaningless — five hours of music would be delivered to you in less than five seconds. Distance would become irrelevant. A UK paper would be as accessible in New York as it was in London.

Writing 15 years later in the Observer, the critic John Naughton compared the begetter of the world wide web, Sir Tim Berners-Lee, with the seismic disruption five centuries earlier caused by the invention of movable type. Just as Gutenberg had no conception of his invention’s eventual influence on religion, science, systems of ideas, and democracy, so — in 2008 — “it will be decades before we have any real understanding of what Berners-Lee hath wrought.”

The entire economic model of information was about to fall apart.

And so I decided to go to America with the leader of the PDU team, Tony Ageh, and see the internet for myself. A 33-year-old “creative,” Ageh had had exactly one year’s experience in media — as an advertising copy chaser for The Home Organist magazine — before joining the Guardian. I took with me a copy of The Internet for Dummies. Thus armed, we set off to America for a four-day, four-city tour.

In Atlanta, we found the Atlanta Journal-Constitution (AJC), which was considered a thought leader in internet matters, having joined the Prodigy Internet Service, an online service offering subscribers information over dial-up 1,200 bit/second modems. After four months the internet service had 14,000 members, paying 10 cents a minute to access online banking, messaging, full webpage hosting and live share prices.

The AJC business plan envisaged building to 35,000 or 40,000 subscribers by year three. By that time, they calculated, they would be earning $3.3 million in subscription fees and $250,000 a year in advertising. “If it all goes to plan,” David Scott, the publisher of the Electronic Information Service, told us, “it’ll be making good money. If it goes any faster, this is a real business.”

We also met Michael Gordon, the managing editor. “The appeal to the management is, crudely, that it is so much cheaper than publishing a newspaper,” he said.

We wrote it down.

“We know there are around 100,000 people in Atlanta with PCs. There are, we think, about one million people wealthy enough to own them. Guys see them as a toy; women see them as a tool. The goldmine is going to be the content, which is why newspapers are so strongly placed to take advantage of this revolution. We’re out to maximize our revenue by selling our content any way we can. If we can sell it on CD-ROM or TV as well, so much the better.”

“Papers? People will go on wanting to read them, though it’s obviously much better for us if we can persuade them to print them in their own homes. They might come in customized editions. Edition 14B might be for females living with a certain income.”

It was heady stuff.

From Atlanta we hopped up to New York to see the Times’s online service, @Times. We found an operation consisting of an editor plus three staffers and four freelancers. The team had two PCs, costing around $4,000 each. The operation was confident, but small.

The @Times content was weighted heavily towards arts and leisure. The opening menus offered a panel with about 15 reviews of the latest films, theatre, music, and books — plus book reviews going back two years. The site offered the top 15 stories of the day, plus some sports news and business.

There was a discussion forum about movies, with 47 different subjects being debated by 235 individual subscribers. There was no archive because — in one of the most notorious newspaper licensing cock-ups in history — the NYT in 1983 had given away all rights to its electronic archive (for all material more than 24 hours old) in perpetuity to Mead/Lexis.

That deal alone told you how nobody had any clue what was to come.

We sat down with Henry E. Scott, the group director of @Times. “Sound and moving pictures will be next. You can get them now. I thought about it the other day, when I wondered about seeing 30 seconds of The Age of Innocence. But then I realized it would take 90 minutes to download that and I could have seen more or less the whole movie in that time. That’s going to change.”

But Scott was doubtful about the lasting value of what they were doing — at least, in terms of news. “I can’t see this replacing the newspaper,” he said confidently. “People don’t read computers unless it pays them to, or there is some other pressing reason. I don’t think anyone reads a computer for pleasure. The San Jose Mercury [News] has put the whole newspaper online. We don’t think that’s very sensible. It doesn’t make sense to offer the entire newspaper electronically.”

We wrote it all down.

“I can’t see the point of news on-screen. If I want to know about a breaking story I turn on the TV or the radio. I think we should only do what we can do better than in print. If it’s inferior to the print version there’s no point in doing it.”

Was there a business plan? Not in Scott’s mind. “There’s no way you can make money out of it if you are using someone else’s server. I think the LA Times expects to start making money in about three years’ time. We’re treating it more as an R & D project.”


From New York we flitted over to Chicago to see what the Tribune was up to. In its 36-storey Art Deco building — a spectacular monument to institutional self-esteem — we found a team of four editorial and four marketing people working on a digital service, with the digital unit situated in the middle of the newsroom. The marketeers were beyond excited about the prospect of being able to show houses or cars for sale and arranged a demonstration. We were excited, too, even if the pictures were slow and cumbersome to download.

We met Joe Leonard, associate editor. “We’re not looking at Chicago Online as a money maker. We’ve no plans even to break even at this stage. My view is simply that I’m not yet sure where I’m going, but I’m on the boat, in the water — and I’m ahead of the guy who is still standing on the pier.”

Reach before revenue.

Finally we headed off to Boulder, Colorado, in the foothills of the Rockies, where Knight Ridder had a team working on their vision of the newspaper of tomorrow. The big idea was, essentially, what would become the iPad — only the team in Boulder hadn’t got much further than making an A4 block of wood with a “front page” stuck on it. The 50-something director of the research centre, Roger Fidler, thought the technology capable of realizing his dream of a “personal information appliance” was a couple of years off.

Tony and I had filled several notebooks. We were by now beyond tired and talked little over a final meal in an Italian restaurant beneath the Rocky Mountains.

We had come. We had seen the internet. We were conquered.

* * *

Looking back from the safe distance of nearly 25 years, it’s easy to mock the fumbling, wildly wrong predictions about where this new beast was going to take the news industry. We had met navigators and pioneers. They could dimly glimpse where the future lay. Not one of them had any idea how to make a dime out of it, but at the same time they intuitively sensed that it would be more reckless not to experiment. It seemed reasonable to assume that — if they could be persuaded to take the internet seriously — their companies would dominate in this new world, as they had in the old world.

We were no different. After just four days it seemed blindingly obvious that the future of information would be mainly digital. Plain old words on paper — delivered expensively by essentially Victorian production and distribution methods — couldn’t, in the end, compete. The future would be more interactive, more image-driven, more immediate. That was clear. But how on earth could you graft a digital mindset and processes onto the stately ocean liner of print? How could you convince anyone that this should be a priority when no one had yet worked out how to make any money out of it? The change, and therefore the threat, was likely to happen rapidly and maybe violently. How quickly could we make a start? Or was this something that would be done to us?

In a note for Peter Preston on our return I wrote, “The internet is fascinating, intoxicating . . . it is also crowded out with bores, nutters, fanatics and middle managers from Minnesota who want the world to see their home page and CV. It’s a cacophony, a jungle. There’s too much information out there. We’re all overloaded. You want someone you trust to fillet it, edit it and make sense of it for you. That’s what we do. It’s an opportunity.”

Looking back from the safe distance of nearly 25 years, it’s easy to mock the fumbling, wildly wrong predictions about where this new beast was going to take the news industry.

I spent the next year trying to learn more and then the calendar clicked on to 1995 — “The Year the Future Began,” as the cultural historian W. Joseph Campbell would later title his book about it. It was the year Amazon.com, eBay, Craigslist, and Match.com established their presence online. Microsoft spent $300m launching Windows 95 amid weeks of marketing hype, paying millions for the rights to the Rolling Stones hit “Start Me Up,” which became the launch anthem.

Cyberspace — as the cyber dystopian Evgeny Morozov recalled, looking back on that period — felt like space itself. “The idea of exploring cyberspace as virgin territory, not yet colonized by governments and corporations, was romantic; that romanticism was even reflected in the names of early browsers (‘Internet Explorer,’ ‘Netscape Navigator’).”

But, as Campbell was to reflect, “no industry in 1995 was as ill-prepared for the digital age, or more inclined to pooh-pooh the disruptive potential of the Internet and World Wide Web, than the news business.” It suffered from what he called “innovation blindness” — “an inability, or a disinclination, to anticipate and understand the consequences of new media technology.”

1995 was, then, the year the future began. It happened also to be the year in which I became editor of the Guardian.

* * *

I was 41 and had not, until very recently, really imagined this turn of events. My journalism career took a traditional enough path. A few years reporting; four years writing a daily diary column; a stint as a feature writer — home and abroad. In 1986 I left the Guardian to be the Observer’s television critic. When I rejoined the Guardian I was diverted towards a route of editing — launching the paper’s Saturday magazine followed by a daily tabloid features section and moving to be deputy editor in 1993. Peter Preston — unshowy, grittily obstinate, brilliantly strategic — looked as if he would carry on editing for years to come. It was a complete surprise when he took me to the basement of the resolutely unfashionable Italian restaurant in Clerkenwell he favored, to tell me he had decided to call it a day.

On most papers the proprietor or chief executive would find an editor and take him or her out to lunch to do the deal. On the Guardian — at least according to tradition dating back to the mid-70s — the Scott Trust made the decision after balloting the staff, a process that involved manifestos, pub hustings, and even, by some candidates, a little frowned-on campaigning.

I supposed I should run for the job. My mission statement said I wanted to boost investigative reporting and get serious about digital. It was, I fear, a bit utopian. I doubt much of it impressed the would-be electorate. British journalists are programmed to skepticism about idealistic statements concerning their trade. Nevertheless, I won the popular vote and was confirmed by the Scott Trust after an interview in which I failed to impress at least one Trustee with my sketchy knowledge of European politics. We all went off for a drink in the pub round the back of the office. A month later I was editing.

“Fleet Street,” as the UK press was collectively called, was having a torrid time, not least because the biggest beast in the jungle, Rupert Murdoch, had launched a prolonged price war that was playing havoc with the economics of publishing. His pockets were so deep he could afford to slash the price of The Times almost indefinitely — especially if it forced others out of business.

Reach before revenue — as it wasn’t known then.

The newest kid on the block, the Independent, was suffering the most. To their eyes, Murdoch was behaving in a predatory way. We calculated the Independent titles were losing around £42 million (nearly £80 million in today’s money). Murdoch’s Times, by contrast, had seen its sales rocket 80 per cent by cutting its cover prices to below what it cost to print and distribute. The circulation gains had come at a cost — about £38 million in lost sales revenue. But Murdoch’s TV business, BSkyB, was making booming profits and the Sun continued to throw off huge amounts of cash. He could be patient.

But how on earth could you graft a digital mindset and processes onto the stately ocean liner of print.

The Telegraph had been hit hard — losing £45 million in circulation revenues through cutting the cover price by 18 pence. The end of the price war left it slowly clawing back lost momentum, but it was still £23 million adrift of where it had been the previous year. Murdoch — as so often — had done something bold and aggressive. Good for him, not so good for the rest of us. Everyone was tightening their belts in different ways. The Independent effectively gave up on Scotland. The Guardian saved a million a year in newsprint costs by shaving half an inch off the width of the paper.

The Guardian, by not getting into the price war, had “saved” around £37 million it would otherwise have lost. But its circulation had been dented by about 10,000 readers a day. Moreover, the average age of the Guardian reader was 43 — something that preoccupied us rather a lot. We were in danger of having a readership too old for the job advertisements we carried.

Though the Guardian itself was profitable, the newspaper division was losing nearly £12 million (north of £21 million today). The losses were mainly due to the sister Sunday title, the Observer, which the Scott Trust had purchased as a defensive move against the Independent in 1993. The Sunday title had a distinguished history, but was hemorrhaging cash, with losses of £11 million.

Everything we had seen in America had to be put on hold for a while. The commercial side of the business never stopped reminding us that only three percent of households owned a PC and a modem.

* * *

But the digital germ was there. My love of gadgets had not extended to understanding how computers actually worked, so I commissioned a colleague to write a report telling me, in language I could understand, how our computers measured up against what the future would demand. The Atex system we had installed in 1987 gave everyone a dumb terminal on their desk — little more than a basic word processor. It couldn’t connect to the internet, though there was a rudimentary internal messaging system. There was no word count or spellchecker and storage space was limited. It could not be used with floppy disks or CD-ROMs. Within eight years of purchase it was already a dinosaur.

There was one internet connection in the newsroom, though most reporters were unaware of it. It was rumored that downstairs a bloke called Paul in IT had a Mac connected to the internet through a dial-up modem. Otherwise we were sealed off from the outside world.

A handful of journalist geeks began to invent Heath Robinson solutions to make the inadequate kit in Farringdon Road do the things we wanted in order to produce a technology website. Tom Standage — he later became deputy editor of the Economist, but was then a freelance tech writer — wrote some scripts to take articles out of Atex and format them into HTML so they could be moved onto the modest Mac web server — our first content management system, if you like. If too many people wanted to read the tech site at once, the system crashed. So Standage and the site’s editor, Azeem Azhar, would take it in turns sitting in the server room in the basement of the building rebooting the machines by hand — unplugging them and physically moving the internet cables from one machine to another.
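Standage’s original scripts don’t survive here, but the workflow described — pull plain article text out of the editorial system, wrap it in HTML, and push the result to the web server — can be sketched in a few lines. Everything below is an illustration, not the original code: the input layout (headline, then byline, then paragraphs separated by blank lines) and the function name are assumptions.

```python
# A rough sketch of a mid-90s Atex-to-HTML conversion step: take an
# article exported as plain text and wrap it in minimal HTML.
# The input format here (headline / byline / blank-line-separated
# paragraphs) is an assumption for illustration only.

def article_to_html(raw: str) -> str:
    lines = raw.strip().splitlines()
    headline, byline = lines[0], lines[1]
    body = "\n".join(lines[2:]).strip()
    # Treat blank lines as paragraph breaks; rejoin wrapped lines.
    paragraphs = [p.strip().replace("\n", " ")
                  for p in body.split("\n\n") if p.strip()]

    parts = [
        "<html><head><title>%s</title></head><body>" % headline,
        "<h1>%s</h1>" % headline,
        "<p><em>%s</em></p>" % byline,
    ]
    parts += ["<p>%s</p>" % p for p in paragraphs]
    parts.append("</body></html>")
    return "\n".join(parts)
```

The output files would then simply be copied into the web server’s document directory — which is roughly all a “content management system” amounted to in 1995.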

What would the future look like? We imagined personalized editions, even if we had not the faintest clue how to produce them. We guessed that readers might print off copies of the Guardian in their homes — and even toyed with the idea of buying every reader a printer. There were glimmers of financial hope. Our readers were spending £56 million a year buying the Guardian but we retained none of it: the money went on paper and distribution. In the back of our minds we ran calculations about how the economics of newspapers would change if we could save ourselves the £56 million a year “old world” cost.

By March 1996, ideas we’d hatched in the summer of 1995 to graft the paper onto an entirely different medium were already out of date. That was a harbinger of the future.

On top of editing, the legal entanglements sometimes felt like a full-time job on their own. Trying to engineer a digital future for the Guardian felt like a third job. There were somehow always more urgent issues. By March 1996, ideas we’d hatched in the summer of 1995 to graft the paper onto an entirely different medium were already out of date. That was a harbinger of the future. No plans in the new world lasted very long.

It was now apparent that we couldn’t get away with publishing selective parts of the Guardian online. Other newspapers had shot that fox by pushing out everything. We were learning about the connectedness of the web — and the IT team tentatively suggested that we might use some “offsite links” to other versions of the same story to save ourselves the need to write our own version of everything. This later became the mantra of the City University of New York (CUNY) digital guru Jeff Jarvis — “Do what you do best, and link to the rest.”

We began to grapple with numerous basic questions about the new waters into which we were gingerly dipping our toes.

Important question: Should we charge?

The Times and the Telegraph were both free online. A March 1996 memo from Bill Thompson, a developer who had joined the Guardian from Pipex, ruled it out:

I do not believe the UK internet community would pay to read an online edition of a UK newspaper. They may pay to look at an archive, but I would not support any attempt to make the Guardian a subscription service online . . . It would take us down a dangerous path.

In fact, I believe that the real value from an online edition will come from the increased contact it brings with our readers: online newspapers can track their readership in a way that print products never can, and the online reader can be a valuable commodity in their own right, even if they pay nothing for the privilege.

Thompson was prescient about how the overall digital economy would work — at least for players with infinitely larger scale and vastly more sophisticated technology.

What time of day should we publish?

The electronic Telegraph was published at 8 a.m. each day — mainly because of its print production methods. The Times, more automated, was available as soon as the presses started rolling. The Guardian started making some copy available from first edition through to the early hours. It would, we were advised, be fraught with difficulties to publish stories at the same time they were ready for the press.

Why were we doing it anyway?

Thompson saw the dangers of cannibalization — that readers would stop buying the paper if they could read it for free online — though a free site could also be seen as a form of marketing. His memo seemed ambivalent as to whether we should venture into this new world at all:

The Guardian excels in presenting information in an attractive, easy to use and easy to navigate form. It is called a “broadsheet newspaper.” If we try to put the newspaper on-line (as the Times has done) then we will just end up using a new medium to do badly what an old medium does well. The key question is whether to make the Guardian a website, with all that entails in terms of production, links, structure, navigational aids etc. In summer 1995 we decided that we would not do this.

But was that still right a year later? By now we had the innovation team — PDU — still in the basement of one building in Farringdon Road, and another team in a Victorian loft building across the way in Ray Street. We were, at the margins, beginning to pick up some interesting fringe figures who knew something about computers, if not journalism. But none of this was yet pulling together into a coherent picture of what a digital Guardian might look like.

An 89-page business plan drawn up in October 1996 made it plain where the priorities lay: print.

We wanted to keep growing the Guardian circulation — aiming for a modest increase to 415,000 by March 2000, which would make us the ninth-biggest paper in the UK — with the Observer aiming for 560,000 with the aid of additional sections. A modest investment of £200,000 a year in digital was dwarfed by an additional £6 million cash injection into the Observer, spread over three years.

As for “on-line services” (we were still hyphenating it) we did want “a leading-edge presence” (whatever that meant), but essentially we thought we had to be there because we had to be there. By being there we would learn and innovate and — surely? — there were bound to be commercial opportunities along the road. It wasn’t clear what.

We decided we might usefully take broadcasting, rather than print, as a model — emulating its “immediacy, movement, searchability and layering.”

If this sounded as if we were a bit at sea, we were. We hadn’t published much digitally to this point. We had taken half a dozen meaty issues — including parliamentary sleaze, and a feature on how we had continued to publish on the night our printing presses had been blown up by the IRA — and turned them into special reports.

It is a tribute to our commercial colleagues that they managed to pull in the thick end of half a million pounds to build these websites. Other companies’ marketing directors were presumably like ours — anxious about the youth market and keen for their brands to feel “cool.” In corporate Britain in 1996, there was nothing much cooler than the internet, even if not many people had it, knew where to find it or understood what to do with it.

* * *

The absence of a controlling owner meant we could run the Guardian in a slightly different way from some papers. Each day began with a morning conference open to anyone on the staff. In the old Farringdon Road office, it was held around two long narrow tables in the editor’s office — perhaps 30 or 40 people sitting or standing. When we moved to our new offices at Kings Place, near Kings Cross in North London, we created a room that was, at least theoretically, less hierarchical: a horseshoe of low yellow sofas with a further row of stools at the back. In this room would assemble a group of journalists, tech developers and some visitors from the commercial departments every morning at about 10 a.m. If it was a quiet news day we might expect 30 or so. On big news days, or with an invited guest, we could host anything up to 100.

A former Daily Mail journalist, attending his first morning conference, muttered to a colleague in the newsroom that it was like Start the Week — a Monday morning BBC radio discussion program. All talk and no instructions. In a way, he was right: It was difficult, in conventional financial or efficiency terms, to justify 50 to 60 employees stopping work to gather together each morning for anything between 25 and 50 minutes. No stories were written during this period, no content generated.

But something else happened at these daily gatherings. Ideas emerged and were kicked around. Commissioning editors would pounce on contributors and ask them to write the thing they’d just voiced. The editorial line of the paper was heavily influenced, and sometimes changed, by the arguments we had. The youngest member of staff would be in the same room as the oldest: They would be part of a common discussion around news. By a form of accretion and osmosis an idea of the Guardian was jointly nourished, shared, handed down, and crafted day by day.

You might love the Guardian or despise it, but it had a definite sense of what it believed in and what its journalism was.

It led to a very strong culture. You might love the Guardian or despise it, but it had a definite sense of what it believed in and what its journalism was. It could sometimes feel an intimidating meeting — even for, or especially for, the editor. The culture was intended to be one of challenge: If we’d made a wrong decision, or slipped up factually or tonally, someone would speak up and demand an answer. But challenge was different from blame: It was not a meeting for dressing downs or bollockings. If someone had made an error the previous day we’d have a post-mortem or unpleasant conversation outside the room. We’d encourage people to want to contribute to this forum, not make them fear disapproval or denunciation.

There was a downside to this. It could, and sometimes did, lead to a form of group-think. However herbivorous the culture we tried to nurture, I was conscious of some staff members who felt awkward about expressing views outside what we hoped was a fairly broad consensus. But, more often, there would be a good discussion on two or three of the main issues of the day. We encouraged specialists or outside visitors to come in and discuss breaking stories. Leader writers could gauge the temperature of the paper before penning an editorial. And, from time to time, there would be the opposite of consensus: Individuals, factions, or groups would come and demand we change our line on Russia; bombing in Bosnia; intervention in Syria; Israel; blood sports; or the Labour leadership.

The point was this: the Guardian was not one editor’s plaything or megaphone. It emerged from a common conversation — and was open to internal challenge when editorial staff felt uneasy about aspects of our journalism or culture.

* * *

Within two years — slightly uncomfortable at the power I had acquired as editor — I gave some away. I wanted to make correction a natural part of the journalistic process, not a bitterly contested post-publication battleground designed to be as difficult as possible.

We created a new role on the Guardian: a readers’ editor. He or she would be the first port of call for anyone wanting to complain about anything we did or wrote. The readers’ editor would have daily space in the paper — off-limits to the editor — to correct or clarify anything and would also have a weekly column to raise broader issues of concern. It was written into the job description that the editor could not interfere. And the readers’ editor was given the security that he/she could not be removed by the editor, only by the Scott Trust.

On most papers editors had sat in judgment on themselves. They commissioned pieces, edited and published them — and then were supposed neutrally to assess whether their coverage had, in fact, been truthful, fair, and accurate. An editor might ask a colleague — usually a managing editor — to handle a complaint, but he/she was in charge from beginning to end. It was an autocracy. That mattered even more in an age when some journalism was moving away from mere reportage and observation to something closer to advocacy or, in some cases, outright pursuit.

Allowing even a few inches of your own newspaper to be beyond your direct command meant that your own judgments, actions, ethical standards and editorial decisions could be held up to scrutiny beyond your control. That, over time, was bound to change your journalism. Sunlight is the best disinfectant: that was the journalist-as-hero story we told about what we do. So why wouldn’t a bit of sunlight be good for us, too?

The first readers’ editor was Ian Mayes, a former arts and obituaries editor then in his late 50s. We felt the first person in the role needed to have been a journalist — and one who would command instant respect from a newsroom which otherwise might be somewhat resistant to having their work publicly critiqued or rebutted. There were tensions and some resentment, but Ian’s experience, fairness and flashes of humor eventually won most people round.

One or two of his early corrections convinced staff and readers alike that he had a light touch about the fallibility of journalists:

In our interview with Sir Jack Hayward, the chairman of Wolverhampton Wanderers, page 20, Sport, yesterday, we mistakenly attributed to him the following comment: “Our team was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Sir Jack had just declined the offer of a hot drink. What he actually said was: “Our tea was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Profuse apologies.

In an article about the adverse health effects of certain kinds of clothing, pages 8 and 9, G2, August 5, we omitted a decimal point when quoting a doctor on the optimum temperature of testicles. They should be 2.2 degrees Celsius below core body temperature, not 22 degrees lower.

But in his columns he was capable of asking tough questions about our editorial decisions —  often prompted by readers who had been unsettled by something we had done. Why had we used a shocking picture which included a corpse? Were we careful enough in our language around mental health or disability? Why so much bad language in the Guardian? Were we balanced in our views of the Kosovo conflict? Why were Guardian journalists so innumerate? Were we right to link to controversial websites?

In most cases Mayes didn’t come down on one side or another. He would often take readers’ concerns to the journalist involved and question them — sometimes doggedly — about their reasoning. We learned more about our readers through these interactions; and we hoped that Mayes’s writings, candidly explaining the workings of a newsroom, helped readers better understand our thinking and processes.

It was, I felt, good for us to be challenged in this way. Mayes was invaluable in helping devise systems for the “proper” way to correct the record. A world in which — to coin a phrase —  you were “never wrong for long” posed the question of whether you went in for what Mayes termed “invisible mending.” Some news organizations would quietly amend whatever it was that they had published in error, no questions asked. Mayes felt differently: The act of publication was something on the record. If you wished to correct the record, the correction should be visible.

We were some years off the advent of social media, in which any error was likely to be pounced on in a thousand hostile tweets. But we had some inkling that the iron grip of centralized control that a newspaper represented was not going to last.

I found liberation in having created this new role. There are few things an editor enjoys less than the furious early morning phone call or email from the irate subject of their journalism. Either the complainant is wrong, in which case time is wasted in heated self-justification, or they’re right, wholly or partially, and immediately you’re into remorseful calculations about saving face. If readers knew we honestly and rapidly — even immediately — owned up to our mistakes, they should, in theory, trust us more. That was the David Broder theory, and I bought it. Readers certainly made full use of the readers’ editor’s existence. Within five years Mayes was dealing with around 10,000 calls, emails, and letters a year — leading to around 1,200 corrections, big and small. It’s not, I think, that we were any more error-prone than other papers. But if you win a reputation for openness, you’d better be ready to take it as seriously as your readers will.

Our journalism became better. If, as a journalist, you know there are a million sleuth-eyed editors out there waiting to leap on your tiniest mistake, it makes you more careful. It changes the tone of your writing. Our readers often know more than we do. That became a mantra of the new world, coined by the blogger and academic Dan Gillmor in his 2004 book We the Media, but it was already becoming evident in the late 1990s.

The act of creating a readers’ editor felt like a profound recognition of the changing nature of what we were engaged in. Journalism was not an infallible method guaranteed to result in something we would proclaim as The Truth — but a more flawed, tentative, iterative and interactive way of getting towards something truthful.

Admitting that felt both revolutionary and releasing.

***

Excerpted from Breaking News: The Remaking of Journalism and Why It Matters Now by Alan Rusbridger. Published by Farrar, Straus and Giroux, November 27, 2018. Copyright © 2018 by Alan Rusbridger. All rights reserved.

Longreads Editor: Aaron Gilbreath

After the Tsunami

Annykos / Getty

Matthew Komatsu | Longreads | March 2019 | 24 minutes (6,092 words)

This piece was supported by the Pulitzer Center. 

Ichi (One)

Obā-san tasted ash. Yes: ash and dust. Her youngest son’s kanji and hiragana on paper could not assuage the bitter news the letter delivered: that he would not return from America to his hometown of Kesennuma, Japan. He would stay to marry the American woman who carried his child. Dishonor. Shame. Betrayal. And I was the ash she tasted: the end of the pure line of the Komatsu name. Nothing more than an accidental flutter in the brine of my mother’s womb.

My grandmother would not have considered this metaphor of the sea, despite the proximity of her home to it, the wind-borne scent of the waterfront fish market and processing plants mere blocks away, burbling down the streets, seeping through the window and door cracks of her home. And beyond, the vast blue-gray of the Pacific Ocean, heaving and rolling the life it contained. She would not have thought of the sea’s power to both create and destroy.

***

A soccer ball washes ashore on Middleton Island in the Gulf of Alaska. On it, handwritten script in permanent marker identifies its origin as a grade school in Rikuzentakata, Japan, 30 minutes north of Kesennuma. Its owner, Misaki Murakami, survived the tsunami, but his family lost their home; the ball is a personal effect from that home. On one of the panels are kanji characters inscribed by a classmate that read Ganbatte. Good luck.

***

I can only imagine what changed Obā’s heart. Perhaps it was my grandfather. According to my father, Ojī was more sympathetic. It was Ojī who responded to my father’s letter to say that he understood. Or maybe the simple need of a grandparent to hold her grandchild eroded her pride. But these are all, in a way, little fictions: my American need to emote in conflict with a Japanese inclination to accept.

Regardless, Obā and Ojī came to the United States. I wonder what they thought when they held this chubby black-haired infant boy, whether they struggled to pronounce my English first name. What it felt like to stare into the deep, brown eyes of a grandchild whose blood ran mixed. Or if any of this mattered at all.

What I do know: When Ojī and Obā journeyed halfway across the globe to the unlikely destination of Duluth, Minnesota, they didn’t know my parents had arranged to leave me with a family friend at the beginning of a road trip across America that doubled as both honeymoon and getting-to-know-the-in-laws. When Ojī said goodbye to me, he wept. It was the last time we were together and the only time my dad saw his own father cry. My grandfather died in Japan, in 1987.

The only Japanese uttered in my home was spoken into the telephone on holidays. On those days, I rushed to answer the phone in the hope of hearing the voices of my Japanese relatives. Moshi moshi, came the greeting. When I answered in English, the caller usually responded, Ahhhhh… Toshifumi-san?

Dad, for you.

If my mother answered, the single phrase she knew: Chōttō matte, kudasai. One moment, please. I would sit on the brown shag carpet speckled with gold and red and yellow, my back to the heat vent, shirt lifted so the hot air blew up my skin and ruffled the black hairs on my neck. The book on my lap stayed open to the same page as I listened to one half of a conversation, mouthed words whose accented syllables I will never utter with any meaning. A pause for the delay, then the muffled return. A smile, a laugh, an imperceptible head bow from my father.

***

A Canadian finds the rusted hulk of a Harley-Davidson motorcycle on the shores of British Columbia and traces its license plate to its owner, Ikuo Yokoyama. Photos of the bike reveal a year at sea: spokes rusting away and missing, corrosion widespread across a frame whose gleam has been replaced with a forlorn absorption of the light that reflects upon it. Yokoyama resists an outpouring of internet-fueled financial support to restore the bike and repatriate it. Instead he asks that it be preserved in a museum as is, a memorial to what was lost.

***

During a precious summer break from the Air Force Academy, I joined a family trip to Japan. Eager to show the Japanese I’d picked up over two years of college classes, I greeted Obā. My father told her that I knew Japanese now, that she should speak to me. We sat down in the living room of the small family home in Kesennuma. The air was heavy with the smell of the nearby ocean, mothballs, dust, and paper. But when she spoke, I could not understand.

***

Here is a list of Japanese words. Tsunami. Pronounced “tsoo-nah-mee.” Translation: “harbor wave.” E. Pronounced “a-ay.” Interrogative. Translation: “What?” Hayaku. Pronounced “hi-yah-koo.” Translation: “hurry.” Hashitte. Pronounced “hah-shht-ay.” Imperative. Translated to English: “Run.”

 

Ni (Two)

At 2:46 p.m. on Friday, 11 March 2011, a 100-mile-long section of the Pacific tectonic plate 19 miles deep thrust beneath Japan. Richter scale needles twitched. Japan shifted eight feet east. The Earth shuddered off-axis. The seabed rose, lifting the ocean above it by 25 feet. All that water had to go somewhere. And it did — away, in a series of waves that raced west at 86 miles per hour. The tsunami made landfall roughly 45 minutes later on the shores of my father’s hometown of Kesennuma in northeast Japan’s Miyagi Prefecture.

My 11 March dawned no different than any other. I woke up and checked Facebook over coffee. My sister posted something about a big earthquake in Japan, but the family was fine. Big earthquake, Japan: happens all the time. I didn’t think much of it during the 45-minute drive from Columbia, South Carolina, to Shaw Air Force Base, NPR now revising the magnitude, the Richter climbing. I paid it no mind during my 12-mile run before work. It was spring in South Carolina, flowers opening under a rising sun, the air heavy with their dewy scent.

It wasn’t until after I showered and changed into my uniform that the narrative unraveled. I turned on the car and the radio cascaded breaking news of a large tsunami in Japan. But even then, I did not think of the risk to my father’s hometown, a fishing city in northeastern Miyagi Prefecture directly in the tsunami’s path.

At work, I punched a code into a keypad and walked through a door into the cubicled space I shared with close to 50 other officers. The room was quiet, all eyes glued to the televisions on the wall. I looked over my shoulder, and from the second floor of the Air Forces Central Command Headquarters, I watched 22,000 Japanese die.

***

In the years that follow 3/11, I will often open my laptop to type “Japan Tsunami” into a search engine. In a half second, tens of millions of results cascade down the screen, many of them videos.

***

No phones were allowed in my office. I left to use the bathroom, checked my phone: a missed call and a voicemail from my mother: Matt, call home. My gut twisted.

My mother answered. They were driving from their home, nestled in the green pines and gray popple outside Duluth, to an aunt who had cable. My parents had never paid for cable television — considering it either unaffordable or unnecessary. Now, for the first time in their lives, a luxury became a necessity. The internet was too slow; they needed to see.

Yes, I’ve seen the news, I said. But Lauren posted something on Facebook. Everyone is fine.

No. Uncle Kazafumi called from his office in Kesennuma — it lasted eight seconds — to say he was okay. Then the call ended.

And he tried to call him back?

Yes.

And?

Nothing. Dad can’t get a hold of him, or anyone else.

***

11 March passed. Friday. 12 and 13, Saturday and Sunday. Monday, 14 March. Still nothing. I watched the same scenes looping on the office televisions.

A coworker blurted, “I’m just waiting for some Japanese person to show up on the TV and yell, ‘Godzilla! Godzilla!’” Someone nearby laughed mirthlessly.

The morning of 15 March, my youngest sister, Lydia, received the news from our cousin in Tokyo. She spoke no Japanese and his English was broken, but somehow he conveyed the news.

My uncle and aunt had survived. Tokuno Komatsu, our grandmother, was dead.

***

Sendai, a city two hours south of Kesennuma: Empty cars wash across the airport tarmac. The reporter flying above an ocean-covered Minami-sanriku: Where have all the people gone? Rikuzentakata. Ōshima. Ishinomaki. Miyako. Natori. And finally, Kesennuma, now burning an orange horizon of flame into the black pall of night.

***

Ten days after the tsunami, I boarded a flight to Japan. The U.S. military mobilized a relief effort called Operation Tomodachi. Friend. I called in every favor I had to deploy as a Tomodachi rescue planning officer.

Before the flight, my father told me that he was proud that a member of the family would be in Japan to help. He asked what I’d be doing there, but I didn’t know. I told him I sold my language abilities hard, maybe oversold them. That I was worried. Don’t worry, he said. It will all come back.


The flight from Dulles to Narita International Airport was all but empty. Once aboard, I reviewed old Japanese textbooks and watched Harry Potter once in English, then twice in Japanese. I tried to sleep, but nightmares woke me with linguistic versions of the naked dream: Me, beside the American general to whom I’ve been assigned as a translator. His Japanese counterpart speaks a torrent of Japanese, then pauses to look at me and await the translation. The American nods intently, casting increasingly expectant looks my way. I recall one word in 10, try to divine meaning from inflection and posture. My mouth works, but the words do not come.

The bus ride from Narita to Yokota Air Base on the outskirts of Tokyo bore no witness to the quake and tsunami. No billboards hung precariously, no cracks split the roadways, and the lights were on. It was as if nothing happened at all. At Yokota, I disembarked to a cold, snowy night and entered a hangar to process into the Tomodachi task force. Airmen, clad in multiple layers, walked between different stations in the hangar, pausing at powered space heaters to warm themselves in the frigid night. I thought of the thousands of Japanese shoved into tiny makeshift evacuation centers. I imagined how they huddled, warmed only by blankets and each other.

***

Yokota fell away from my window of an Air Force HH-60G helicopter as it lifted off and flew east. I needed to see affected Japan for myself. It wasn’t until we were out over the ocean, flying outside an imaginary bubble around Fukushima, that I did.

Rivers of debris from the tsunami appeared on the surface of the Pacific and streamed to the horizon, a flotsam road of shattered wood and plastic. We flew low, eyes out and scanning for life. The last survivor had been pulled from the water a week prior, but we hoped despite the odds, knowing we were far more likely to spot the dead.

A crew member saw something, and the helo banked hard. Over the intercom, he admitted it was probably nothing but worth investigating. Lower, slower, we orbited until the rotor wash beat the sea into mist over what turned out to be a white sheet rippling into the depths.

The farther from Japan, the larger the debris. Refrigerators and freezers. Orange tiled roofs bobbed in the blue and gray, impossibly buoyant. The wall of a home, the glass of a window somehow intact, offered a view into the saltwater beneath. All of it surrounded by a mass of splintered wood.

***

The shivering woke me again. I blinked into the darkness of the Sendai Airport first class lounge and pressed a button on my watch. 0300. I retreated further into the insulation of my puffy coat. Snores came from airmen off-shift from their post on the airport roof. Periodically throughout the night one would return and hand off a radio the size of two stacked laptops, then pop a sleeping pill while the other ran air traffic.

It was supposed to be a short visit, an hour or less. Just enough to make contact with the senior officer on the ground and determine what, if any, help I could provide as a planner. But the sound of the helicopter was only audible long enough to make radio contact with the airman on the roof: Tell Major Komatsu that we have to return to Yokota. We’ll be back when we can.

The cold shook me awake every 15 minutes until I stood up at 0600 and crept out of the dark room and into the daybreak of the terminal. Behind glass windows stories high, I wandered the vacant space, pausing at the vendor stands. The airmen were initially ordered not to take any food, but soon after they arrived, vendors themselves showed up and told them to take what they wished. The stacks of dried cuttlefish and shrimp-flavored crackers vanished, leaving only inscrutable books of manga and the assorted comforts required to heal the modern traveler. I lifted one of the books and perused a few of the oddly colored pages, taking in black and white lines of manga from back to front. I set it back in its place and looked out the glass.

In between the east end of the runway and the coast, a road once connected Kesennuma with Sendai; I’d made the drive twice during family trips. Now, I thought about packing my ruck, stuffing it with MREs and walking north, picking my way through the detritus until I reached my father’s hometown. My grandmother lay in the freezer of a morgue. The old family home, gone. Dozens of extended family — great uncles and third cousins and aunties once-removed — missing.

***

The morning of 27 March, I sat in my room back at Yokota alone after a run inside the confines of the base perimeter, under the pink-white beginnings of the cherry tree bloom washing the country from south to north. A rebirth of spring, of hope, of all things green and full of life.    

Three hundred miles away, my relatives cremated Obā’s remains.

***

Our rescue helicopters and crews went home, the work of finding and extracting the living long over. Only the dead remained missing, and the Japanese government politely declined U.S. military support to the search. My job as a rescue planner turned to playing games of what if. What if an American aircraft transporting radiation measurement crews crashes inside the Fukushima no-fly zone? Who will rescue them and how will we coordinate between Japanese and American operations centers?

These questions could only be answered in conversation with my Japanese counterpart at the Japanese Rescue Coordination Center, located 53 minutes down the Ome train line, on Fuchu Air Base. When we met in the lobby of the Japanese Air Self Defense headquarters building, a fellow American officer acting as my linguist introduced Okahashi-san. We smiled and bowed, then he presented me with his meishi (business card) in the manner I learned in my sophomore Japanese class at the Academy: Both hands present, both receive. Study the card, then place it only in a chest pocket; never, ever in a disrespectful pants pocket.

Fatigue lined his face and eyes — Okahashi-san had worked twenty hours every day since the tsunami. Lt Col Okahashi said something, smiled and gestured toward an imaginary flat surface a few feet off the ground. He slept on a cot in the back of the Rescue Coordination Center.

As we ate pork katsu at the Japanese dining facility, I attempted Japanese the best I could. I explained my last name, and when I said Kesennuma, he said, haltingly, “Your daddy. From Kesennuma?” Yes, I said. He simply frowned, lowered his eyes, shook his head and said no more.

***

Cell phones document the tsunami’s arrival in Minami-sanriku from ground level. A woman’s voice reverberates across the town, alternating with sirens to warn the residents over a citywide loudspeaker system. Impossibly, it continues even as the tsunami piles into the streets and people scream to those who’ve not yet made it to high ground, continues even as the ocean continues its inexorable rise. Until it falls silent. And all that remains are the cries of the Japanese who have survived.

***

When I met my Japanese cousins for dinner, I’d been asking my father for weeks to arrange for me to visit Kesennuma at the end of my deployment. I missed my stop on the train from Yokota, had to double back at the next, then wait at the eki (station) for the only cousin who spoke any English to walk from the restaurant. All around me, life streamed through automated ticketing gates amid the wall of sound that is a Tokyo train station during evening rush hour. And yet, not so far away, their countrymen were digging through rubble with their bare hands. Posting desperate signs for missing persons.

We did our best to converse around our sukiyaki. They showed me pictures from Kesennuma. The old family home, gone. My uncle’s two-story office, first floor hollowed by the tsunami. My uncle, passed out on his floor with an empty bottle of whiskey nearby. Uncle drink lot now.

When I asked my cousins about my request to visit Kesennuma, their eyes dropped and they picked at their food. Mizuki — the English speaker — pulled out his phone. We call your daddy. He dialed, spoke Japanese when my father answered. I could not interpret Mizuki’s body language. He handed me the phone. My father talked around the question — his mother’s death, the family shock, the loss of the business and deaths of two employees, the destruction, how his brother wouldn’t say no to my visit but wouldn’t say yes either — until I interrupted him.

“Dad, what’s the bottom line?”

“Culturally, they would lose face if they said no. But the timing is bad.”

“I’d be a burden.”

“Yes.”

“But I have to make the decision.”

“Yes. You will have to tell them you do not want to go.”

“OK, then. I’m not going.” I handed the phone back to my cousin, and the relief on his face told me everything I needed to know.

***

Of the 12 million tsunami videos, I will not watch them all. And yet it will be too much, as well as somehow not enough.

***

On my last day in Japan, I sat with the Air Force colonel who led my shift. He was a pilot without a cockpit anymore, his jet long mothballed. He’d flown a desk for years now, he said as he smiled and removed his glasses; this was his last hurrah. Then he asked about what drew me to volunteer for this. When I told him, he fell silent.

“I’m sorry,” he said. “We should have found a way to get you to Kesennuma.” Then he handed me his card, thanked me for what I’d done, and I walked out of the operations center for the last time.

Before boarding the bus to Narita, I walked to a nearby cherry tree whose branches drooped under a blooming mantle. It stood above a patchwork of dirt and a browning white carpet of fallen blossoms. I found a living flower within reach and pinched its green stem, careful not to disrupt the delicate petals above it. Once free, I carried it two-handed; one pinching its base, the other cradling the bloom in my palm until I was back in my room. A book of devotions lay open on my desk, a gift from my parents. I placed the flower in the book, closed it.

 

San (Three)

 

2018. The shinkansen pitches us north from Tōkyō, picking up speed until the bullet train hits 200 mph and the endless series of the Tōhoku region’s ubiquitous rice paddies visible through my window blur green, flickering as dike-top roads come and go. I have returned to hear, yes, but also to touch. Taste, smell, and once again: see.   

We strategize. Three of us: my father, the linguist I’ve hired, and me. A cousin produced the name of the rest home where my grandmother perished: Shunpo. A classmate worked at Shunpo on 3/11, but my cousin is unwilling to connect us. So the linguist puts on her fixer hat and determines that the former manager not only survived but rebuilt Shunpo in a new location and now speaks internationally on tsunami readiness. It’s as good a lead on how my grandmother died as we’re going to get. Anticipation builds as we leave the bullet train at Ichinoseki for the drive to Kesennuma, until I’m straining against my seatbelt and we finally arrive where I could not go seven years ago.

Kesennuma. No longer confined by glass or screen, I step from a cousin’s car in front of the vacant lot that was once 2-13-16 Nakamachi-cho. My father and he speak quietly in Japanese. The home I remember. His home. From where I stand, I could have reached over the street’s gutter and touched the house’s wall, perhaps taken in that odd mothball scent that seems to accompany my few memories of the texture of the place. But there is nothing but the tang of salt air in between me and the violet dusk of a sun long since set behind the hills of tall pine that mark Kesennuma’s western edge.

***

The tsunami is everywhere.

Blue placards on buildings show its maximum height with typical Japanese simplicity: a horizontal line and measurement in meters, in white lettering. Buildings still slated for demolition next to the orange-brown of cleared earth. Construction signs and workers and new roads unimpeded by human artifice. Signs along the sides of the road that undulates up and down through the endless series of ria (“bay”) that pocket the Sanriku coastline mark the tsunami’s maximum inundation points. Dystopian reconstructed landscapes behind massive seawalls that stretch across the horizon. The “Dragon Tree” of Kesennuma — a gnarled pine that survived the tsunami only to later die and be preserved where it stands on the cape of the Iwaisaki area of the city. The “Miracle Pine” of Rikuzentakata: the sole remaining tree of an estimated 70,000 that made up a coastal forest, eventually felled by the saltwater left in the ground by the tsunami, then preserved in detail at an estimated cost of 150 million yen (close to 2 million dollars based on the exchange rate at the time). O-tsunami, the survivors say, applying the honorific “o-” prefix because they cannot adequately capture in words a full integration of all senses. It roared. Smelled of salt. It burned, pulled, swept.

It was incomprehensible in a way that can only be assembled by a comprehension of what it left behind.

***

We climb a path beneath old-growth pine and cedar until a panorama of the city reveals the tsunami’s reach, still clear, even now. Gray and green mark the untouched. Yellow earth, the scar of the destroyed, the still-being-rebuilt. My cousin guides my father and me to the family gravesite. A light breeze, cool with the ocean across my skin, the sound of traffic. The smell of needle and ocean. I grasp at the sensory through the mantle of jet lag and culture shock, hoping to hold on to this moment. My father stands in front of a polished granite marker, brings his palms together and lowers his head to offer a silent prayer.

It’s been a decade and a half since I last saw my Aunt Fumiko, but her face remains cherubic, her skin pale and smooth. She apologizes for not having the snack she recalls as a favorite: a mix of salted peanuts and chili-flavored rice cracker crescents. She looks thin but well. I show her pictures of my family. When I produce an app on my phone that lets her see my infant daughter at that very moment sleeping halfway around the globe, she smiles.

Kawaii, ne. So cute.

She tells me that the earthquake found her in the midst of shopping. When the world ceased shaking, she felt an overwhelming urge to immediately head home. Something horrible was going to happen. She followed her instinct and drove straight to the new house, three miles inland from the old one that no longer exists. Her son called at about 3:15 p.m. after seeing tsunami warnings on the news. Obā was at Shunpo, but my aunt thought it would be safe. It had two floors, a good flat roof, was a fair distance from the ocean. She worried about my uncle, whose office was on the downtown waterfront at the tip of Kesennuma Bay.

The Teen Idol Vanishes

Adam Scull / AP, Getty, Photo illustration by Katie Kosma

Soraya Roberts | Longreads | March 2019 | 8 minutes (2,230 words)

Dylan McKay was never quite there. In the physical sense — he would kind of just turn up out of nowhere on Beverly Hills, 90210, in random staircases or under random cars, and disappear just as fast — but also, like, existentially. He was supposed to be a high schooler, but you could never imagine him in class, getting bored, learning. He seemed to know everything there was to know anyway, even though he was only 17 — but he wasn’t really 17. He had this sort of aged face, with the eyebrow scar, the never-ending stacks of lines on his forehead, the throwback pompadour, the Homeric sideburns, and seemed to have the sexual history of a middle-aged playboy. Depending on the circumstances, he exuded the hard-partying past of a retired rock star or the bodhisattva-like wisdom of an ancient yogi. Even though he had supposedly hit puberty only four years earlier.

All of this resulted in an otherworldly, ageless icon of adolescence that was impossible to grasp completely because of the way it constantly vacillated between poles — old and young, violent and gentle, smart and goofy, rich and poor, public and private. But Dylan McKay’s was the kind of mythic narrative that could only float along on a dearth of details, the holes filled in by our imaginations. By a pubescent girl, for instance, who thought the “Dylan” in her new class would be something like the Dylan at West Beverly only to find he was as acne-ridden and awkward as she was because he was an actual teenager. As opposed to Luke Perry, a 24-year-old actor whose biography was so elusive that Dylan McKay stood in for him, turning both of them into this perennial abstract symbol of romantic teenage-hood.