The Longreads Blog

Apocalypse Now? Now? How About Now?

Getty Images

For Harper’s Magazine, Lauren Groff attends Prepper Camp hoping to learn useful, climate-friendly survival tactics to help herself and others weather the aftermath of hurricane season in Florida. While she encounters the expected — gun enthusiasts, fear of the “other,” and a fair bit of snake oil — she also realizes that she has a lot more in common with her “prepper brethren” than she first thought.

Perhaps I should have expected to feel wildly out of place at Prepper Camp. I am a vegetarian agnostic feminist in a creative field who sits to the left of most American socialists: I want immediate and radical action to halt climate change; Medicare and free public higher education for all; abortion pills offered for pennies in pharmacies and gas stations; the eradication of billionaires; the destruction of capitalism; and the rocketing of all the planet’s firearms into the sun.

And yet I am also, in the darkest corners of my heart, a doomsday prepper myself.

It should not have been a surprise to me—though it was—how rarely actual facts were invoked at Prepper Camp: instead I had heard a great deal of fear mitigated by practical-seeming ideas, lots of baseless venom spat in the direction of imagined liberals, but almost no science to give weight to any assertions, no analysis of the larger state of the world, no evidence of statistical knowledge. Survivalists had revealed themselves to be romantics. Prepper Camp was a castle built on emotion: fear of the inchoate other was so great that the survivalists felt justified in being prepared to kill other humans to protect their material goods.

But then I saw, to my horror, an uglier truth: that I was no better than my prepper brethren. And that because of my hypocrisy, I was probably even worse.

Perhaps doomsday libertarians do secretly long for a chance to rid the earth of people who threaten their supremacy; but there is something equally anarchic in me that longs for society to break so that we can rebuild it to be kinder, more generous, more equitable. Deep down, perhaps I am a prepper because I believe that the only way we are going to pry the world’s wealth out of the greedy, grasping hands of the billionaires who are willfully killing the environment is through a total collapse of the status quo. Perhaps I am a prepper because I have had enough: I am goddamn ready for the guillotines.

Read the story

‘What’s this guy doing loose in Malheur County?’

Image by Curtis Perry via Flickr (CC BY-NC-SA 2.0)

Anthony Montwheeler spent 20 years in an Oregon mental health facility after being found not guilty of kidnapping his ex-wife by reason of insanity. He was released after claiming that he’d been faking the whole time, then immediately kidnapped another ex-wife, eventually stabbing her and killing another person during an ensuing car chase, all in full view of witnesses. And yes, he’s going to plead insanity again. How did he get the “not guilty” verdict 20 years ago? How did he get out? Is he mentally ill? What even is “mental illness” in the criminal justice context? In Rolling Stone, Rob Fischer walks us through Montwheeler’s case and the many blurry lines and troubling policies around the insanity defense in the U.S.

The hearing lasted more than two hours, but Montwheeler testified for only eight and a half minutes. When a state official asked if he ever had trouble sleeping, Montwheeler said, “No. I’ve always been able to sleep at night.” Had he ever been depressed, or felt that life is not worth living? “I’ve always been happy,” Montwheeler said. “I mean, I’ve never been depressed.” So then, the official pressed, you’ve never had any trouble getting out of bed and going about your activities? “No,” Montwheeler replied. “I’ve always showed up for work. I’ve always been Johnny on the spot.”

After a brief recess, the review board found Montwheeler was “no longer affected by a qualifying mental disease or defect,” which meant the state was legally required to discharge him. Offenders who are discharged from the state hospital, even those, like Montwheeler, released before the completion of their full term, are not diverted into penitentiaries. They are set free without additional oversight or guaranteed access to state mental health care.

The board’s chair, Kate Lieber, a Portland-based attorney, was clearly upset. “I don’t even know where to start,” she said. While maintaining a lie for 20 years, she noted, Montwheeler had avoided prison, lived rent-free, and received expensive care from trained professionals. “I mean, that is troubling on all sorts of levels,” Lieber said. “I’m assuming somebody in the system might do a forensic look at this and figure out what the hell happened. But as of now, you’re discharged.” Before Montwheeler walked out the door, she added, “My hope is that you’ll do the right thing. I am sincerely worried that you won’t.”

Read the story

“What Do I Know To Be True?”: Emma Copley Eisenberg on Truth in Nonfiction, Writing Trauma, and The Dead Girl Newsroom

Sylvie Rosokoff / Hachette Books

Jacqueline Alnes | Longreads | February 2020 | 21 minutes (5,966 words)

 
“Am I a journalist?” I found myself asking Emma Copley Eisenberg. On a sunny day in mid-October, Eisenberg sat adjacent to me at the dining room table in her West Philadelphia home, a spread of sliced tomatoes, chicken, and perfectly steamed asparagus she had prepared arranged on a plate between us. I am certainly not a journalist in any meaningful sense of the word — outside of an MFA in creative nonfiction, during which I learned to conduct research, I have no formal schooling or training — but Emma and I are both infatuated with the boundaries between subject and writer, research and lived experience, and how we classify it all. How do who we are and our lived experiences affect the types of research we reach for? Is there such a thing as objectivity, or do we land closer to the truth if we expose our own flaws and biases and complicated histories on the page? And what is truth, after all?

Eisenberg, in her debut book, The Third Rainbow Girl, wrestles meaningfully with these questions and many others. Though her book is marketed as true crime, and though a major thread within the narrative is the murder of Vicki Durian and Nancy Santomero, two women on their way to a festival known as the Rainbow Gathering, Eisenberg undermines many features of the subgenre by centering place as a major subject. Her descriptions of Pocahontas County, both in memoir sections, in which Eisenberg relays her time living in Appalachia, and in reported sections, in which Eisenberg offers insight into the ways in which the murders of Durian and Santomero brought to the surface harmful stereotypes perpetuated against the region, complicate perceptions rather than flatten them into any packageable or easy narrative. In prose that brims with empathy, and through research that illuminates narratives that have long been hidden by problematic representation, Eisenberg exposes the kinds of fictions we tell ourselves often enough that we believe them to be true.

During the course of our sprawling conversation, one punctuated only by friendly interruptions from a gray house cat named Gabriel, Eisenberg and I talked about what it means to seek truth in nonfiction, and how writing the personal can allow for more complicated realities to emerge; how undermining conventions of genre can impact the way a book is both marketed and read; and what it means to find clarity — or at least community — while writing into murky, often traumatizing subject matter. Read more…

Can Mickey Mouse Coexist with Bears, Panthers, and Alligators?

Further reading: How pet reptiles are flourishing in Florida by preying on bird populations.

In Florida, 12 acres are developed each hour. In the land of snow birds, theme parks, golf courses, and ever-expanding terra cotta tract housing, is there a way for wetlands and wild animals to not just survive, but thrive alongside man’s ravenous appetite for development?

At The Bitter Southerner, Will Wellman follows a small team and their documentary crew through forests and swamps as they study the potential to create a wildlife corridor connecting the last remaining wild places in the Sunshine State.

Even in my own Floridian imagination, my home state has shifted from wild green to lifeless gray. Florida is no longer “dotted” with development, but with wilderness. Come to Florida, the advertisements say, there are gators, tropical flowers, wide open oceans. BUT DON’T WORRY, they can’t reach you from your air-conditioned hotel room, restaurant, Disney vacation. Come and look! You definitely don’t have to touch.

Joe continued his research on a small bear population in Glades and Highlands counties, attempting to understand how these bears managed to live in an area so heavily affected by human development. One of the bears Joe was tracking, a male given the colorful name M34, went on a journey of nearly 500 miles — wandering from Lake Placid through the Everglades Headwaters, then toward Celebration, a planned community outside of Disney World. M34 bumped up against I-4 many times but was never able to cross; he eventually made his way back south to the ranches and natural land of the Lake Wales Ridge area.

M34’s problem is a common issue for animals throughout the state of Florida. Growing development and infrastructure across the state means isolated habitats, and there are scant pathways connecting these wild areas.

The swamp along Reedy Creek is relatively dry. The trunks of trees throughout the swamp bear the marks of both seasonal flooding and drought. In a month, when the summer rains begin, the waters will quickly rise to the higher water lines. For now, though, the ground is a mucky labyrinth of dead vegetation, fallen trees, and downed branches. The humidity here is palpable; it presses against you, as does the heat.

This is no place for claustrophobics. But of all the landscapes I’ve had the good fortune to explore, none makes me feel as alive as a swamp does. I don’t mean exuberance or joy. It is a sense of life fed by ever-present danger. Swamps are marked by death — all the rotting organic matter that mars its floor and gives it life — and by risk — every nook and cranny could hide snakes, gators, and more. A swamp jars you from default, autopilot amble and into an alertness of a dark, living world around you. Rilke’s words reverberate as a mantra for this wooden morass: “Let everything happen to you: beauty and terror.”

Read the story

Don’t Pretend Like You Don’t Love Wikipedia

Wikipedia: it’s the best trash fire on the internet. It’s sexist. There’s rampant trolling. You can lose hours of your life falling down a rabbit hole looking for information on that one guy from that movie. And you can learn about anything at all in recorded history, and it’ll probably be mostly true. At Wired, Richard Cooke penned a loving paean to the playground for pedants that’s the closest thing the internet has to a public square.

The site’s innovations have always been cultural rather than computational. It was created using existing technology. This remains the single most underestimated and misunderstood aspect of the project: its emotional architecture. Wikipedia is built on the personal interests and idiosyncrasies of its contributors; in fact, without getting gooey, you could even say it is built on love. Editors’ passions can drive the site deep into inconsequential territory—exhaustive detailing of dozens of different kinds of embroidery software, lists dedicated to bespectacled baseball players, a brief but moving biographical sketch of Khanzir, the only pig in Afghanistan. No knowledge is truly useless, but at its best, Wikipedia weds this ranging interest to the kind of pertinence where Larry David’s “Pretty, pretty good!” is given as an example of rhetorical epizeuxis. At these moments, it can feel like one of the few parts of the internet that is improving.

And because I know it’s the thing you seized on while reading, here is Khanzir’s entry, and here is Khanzir:

In this photograph taken on July 12, 2016, an Afghan veterinarian administers an injection to a pig inside an enclosure at Kabul Zoo in Kabul. (WAKIL KOHSAR/AFP via Getty Images)

You’re welcome.

Read the story

I Have No Idea What You Corporate People Are Talking About

Ted S. Warren / AP Photo

Corporate lingo is all about obfuscation, group-think, and creating unnecessary work rather than clarity. For New York Magazine, Molly Young examines corporate jargon like “futureproofing” and “level-setting” to try and understand where it came from, why corporate employees opt-in (ha) to this group linguistic delusion, and what such gibberish does and does not do for people. Take, for example, the term “parallel-path,” which more simply means to do two things at once. Office workers already did multiple things at once constantly. Why did anyone need a confusing term for language that was already clear? “It was,” Young says, “in its fakery and puffery and lack of a reason to exist, the perfect corporate neologism.” Young calls all such lingo “garbage language,” a term borrowed from Anna Wiener, author of the new tech life memoir, Uncanny Valley. “The meaningful threat of garbage language,” Young writes, “is that it confirms delusion as an asset in the workplace.”

Another thing this language has in common with garbage is that we can’t stop generating it. Garbage language isn’t unique to start-ups; it’s endemic to business itself, and the form it takes tends to reflect the operating economic metaphors of its day. A 1911 book by Frederick Winslow Taylor called The Principles of Scientific Management borrows its language from manufacturing; men, like machines, are useful for their output and productive capacity. The conglomeration of companies in the 1950s and ’60s required organizations to address alienated employees who felt faceless amid a sea of identical gray-suited toilers, and managers were encouraged to create a climate conducive to human growth and to focus on the self-actualization needs of their employees. In the 1980s, garbage language smelled strongly of Wall Street: leverage, stakeholder, value-add. The rise of big tech brought us computing and gaming metaphors: bandwidth, hack, the concept of double-clicking on something, the concept of talking off-line, the concept of leveling up.

One of the most influential business books of the 1990s was Clayton Christensen’s The Innovator’s Dilemma. Christensen is responsible for the popularity of the word disruptive. (The term has since been diluted and tortured, but his initial definition was narrow: Disruption happens when a small company, such as a start-up, targets a limited segment of an incumbent’s audience and then uses that foothold to attract a bigger segment, by which point it’s too late for the incumbent to catch up.) The metaphors in that book had a militaristic strain: Firms won or lost battles. Business units were killed. A disk drive was revolutionary. The market was a radar screen. The missilelike attack of the desktop computer wounded minicomputer-makers. Over the next decade and a half, the language fully migrated from combative to New Agey: “I am now a true believer in bringing our whole selves to work,” wrote Sheryl Sandberg in Lean In, urging readers to seek their truth and find personal fulfillment. In Delivering Happiness, Zappos CEO Tony Hsieh described making conscious choices and evolving organically. In The Lean Startup, Eric Ries pitched his method as a movement to unlock a vast storehouse of human potential. You can always track the assimilation of garbage language by its shedding of scare quotes; in 1911, “initiative” and “incentive” were still cloaked in speculative punctuation.

Read the story

Novelist Charles Portis Was a True Original

True Grit, poster, John Wayne, Kim Darby, Glen Campbell, 1969. (Photo by LMPC via Getty Images)

For many people, Charles Portis will forever be remembered as the author of the 1968 book that became the 1969 film adaptation with John Wayne as Rooster Cogburn and then the Coen Brothers’ 2010 version. True Grit is a masterpiece. I mean that. It’s a perfect book. I feel the same about his first novel Norwood, which is a hilarious, weird road trip story. Portis’s third novel, The Dog of the South, is almost as good. I rarely say anything is perfect, but Portis’s first two novels strike me as completely satisfying, self-contained worlds that reveal greater wonders on repeat readings and are beyond improvement. I also rarely reread books, but when I’ve reread both of these, their facets only sparkle more brightly and reveal greater finesse. Portis published only five novels in his lifetime, but by only five, I mean “only.” His legacy lies not in his total output but in his pages. These novels are dense with wit, a distinctive voice, and a warped comic vision of the world, with plots driven by bumbling protagonists on long journeys that reward readers with constant laughs and endless surprises.

Portis died on February 17, 2020, at age 86. For The New Yorker, writer Wells Tower examines the author’s literary achievements, paints a brief portrait of a person who revealed little about himself, and celebrates a writer he believes was not merely a comic but a philosopher. Every Portis fan has their favorite passages, but part of his legacy is a tone that Tower calls “a shrug of quiet amusement.” His privacy also shaped his legacy. Portis avoided publicity. He dodged interviewers and kept to himself. Tower writes:

It’s hard to know whether Portis’s work ushered much comfort into his own life. My sense is that he was lonely. I imagine he had a fair bit in common with Jimmy Burns, described in “Gringos” as a “hard worker,” “solitary as a snake,” and, yes, “punctual.” Portis never married and had no children. He never published another novel after “Gringos,” from 1991. The closest he gets to self-portraiture comes in his short memoir “Combinations of Jacksons,” the essay published in The Atlantic. Toward the essay’s close, the author spots an “apparition” of his future self in the form of a geezer idling his station wagon alongside Portis at a traffic light in Little Rock. He wore “the gloat of a miser,” Portis writes. “Stiff gray hairs straggled out of the little relief hole at the back of his cap. . . . While not an ornament of our race, neither was he, I thought, the most depraved member of the gang.”

In his vision of himself at the wheel of the phantom station wagon, Portis goes on to write what feel like fitting instructions for how we ought to cope with this great and overlooked writer’s exit from the scene: “I could see myself all too clearly in that old butterscotch Pontiac, roaring flat out across the Mexican desert and laying down a streamer of smoke like a crop duster, with a goatherd to note my passing and (I flatter myself) to watch me until I was utterly gone, over a distant hill, and only then would he turn again with his stick to the straying flock. So be it.”

After reading Norwood, I fell in love with his narrative voice and wanted to know more about the person who created it. Information was scant.

Portis started his writing life as a journalist, eventually working beside future novelist Tom Wolfe. By the time Portis published Norwood in 1966, he’d left the newsroom for what turned out to be forever. True Grit’s 1969 screen adaptation won John Wayne the only Oscar of his career, and generated so much money – $14.25 million at the box office – that Portis could lead a simple, quiet life in Little Rock, Arkansas, writing and frequenting local watering holes, where he was just another regular who smoked cigarettes and wet the four corners of his napkins so they didn’t stick to the bottom of his beer glass and make him look like an idiot. That’s the kind of detail Portis would have included in his books had he not been living it.

His love of beer joints made him sound accessible, so I tried to contact him back in April 2010.

Before Portis’s nonfiction miscellany Escape Velocity was published, I dug up every piece of his short nonfiction and fiction that I could in old issues of magazines like The Atlantic and Oxford American. They provided a biography, but they also generated more questions. I started piecing it all together in an essay about him and his work, where I tried to understand how his masterpieces existed in a biographical information vacuum, generating questions and speculation, what I called “a string of maybes.” His was just such a striking career turn: a lowly journalist sells his first novel to Hollywood and makes huge money, then takes increasing numbers of years to write each subsequent novel, before quitting publishing altogether. Whatever his feelings about this transition from journalism to fiction, he seemed to have shared none of them with his fellow reporters. As Tom Wolfe says in The New Journalism, “One day [Portis] suddenly quit as London correspondent for the Herald Tribune. That was generally regarded as a very choice job in the newspaper business. Portis quit cold one day, just like that, without a warning.” And, after writing his first two novels, Portis “actually went on to live out the fantasy,” Wolfe says. “Portis did it in a way that was so much like the way it happens in the dream, it was unbelievable. …He sold both books to the movies…He made a fortune…A fishing shack! In Arkansas! It was too goddamned perfect to be true, and yet there it was. Which is to say that the old dream, The Novel, has never died.”

Knowing Portis refused most interviews, I decided to increase my chances of a response by asking the most pressing question I had: why, after six years as a reporter, did he decide to try writing novels for a living? I was curious about what factors went into his decision to write fiction, what his hopes were, his career concerns or frustrations with reporting, and what effect, if any, that era of literary publishing (at the dawn of the “new journalism”) had on his thinking. The most detailed treatment of the subject appeared in a rare Q&A Portis gave to the University of Arkansas in 2001. In it, he makes his decision seem simple: “As I say, the Tribune people had always treated me very well, but I wanted to try my hand at fiction, so I gave notice and went home.” He just decided to try his hand and went? Just like that? No way, I thought, rereading that; nothing is that simple.

Three months later, the literary agency kindly sent me Portis’s response to my question. It read: “I simply wanted to try my hand at fiction, and if it hadn’t worked out I would have gone back to journalism.”

I laughed out loud reading that: “try my hand at fiction.” He’d used nearly the exact same phrase in that 2001 interview. It was the phrase I was trying to get away from by emailing him. Oh well. Like everything he wrote, even his one-line email amused me. His mystery remained intact.

Read the story

The Misogyny Is the Point

Does this photo not make much sense? Neither does the actual Miss America Pageant in this post-#metoo year of our lord twenty and twenty. (Miss America 2019 Nia Franklin poses with Chewbacca on September 12, 2018 in New York City. Photo by Astrid Stawiarz/Getty Images)

Are you not already a fan of Lyz Lenz? MISTAKE. Exhibit A is this Tucker Carlson profile from 2018, but close on its heels is Exhibit B, her recent Jezebel essay about the flagging Miss America pageant. I’ve never wanted to attend Miss America in person, but I would gladly go if I got to do it in the company of Lyz Lenz. To wit:

The Miss America pageant wasn’t supposed to be at the Mohegan Sun Casino, but it makes sense that the pageant would end up at a place that’s both a triumph of capitalism and an absolute hellscape. The casino is divided into two main areas: earth and sky. But once inside, both real earth and real sky immediately recede. It is simultaneously soothing and disorienting. Everything anyone could possibly need is right here, especially if need consists of a Sephora and Bobby Flay’s Bar Americain.

(Here, we try not to think about how many people’s needs might actually be fully met by a Sephora and a Bobby Flay restaurant.)

After breakfast, I go to meet Miss Iowa, because that’s where I live and it’s like going to meet a state representative who you didn’t vote for, but somehow is supposed to embody something about your state that you cannot really define. I don’t end up meeting her—“pageant day,” of course. But her mother is there. Miss Iowa’s mom is confused about her daughter’s success. Not that she doesn’t think her daughter shouldn’t win, just that Emily Tinsman just started pageants in college and now here they are, in a casino in Connecticut right before Christmas. What a world.

(Here, we are all Miss Iowa’s mom.)

But whatever else they are, they are still defined by their bodies. Each contestant has to sign a contract saying they’ve never been pregnant and never had children. They can’t be older than 25 years old. They also have to be single. Translation: No abortions. Bodies: Pure. Even if they aren’t donning swimsuits and strutting on stage, their viability in the pageant is about the sanctity of their bodies. Just like Margaret Gorman, the childlike, innocent first winner, they must remain pure objects of desire—tight, poised, flesh vessels for our values.

(Ouch. Awful. And so America.)

If you’re not yet convinced, there’s also an Exhibit C: her name, which is cooler than my name and your name put together, and I say that with confidence despite not knowing what your name is. It’s a sharp name for a sharp writer.

Read this essay, is my point. It’s time well-spent.

Read the story

Wait, What?

Chung Sung-Jun / Getty, Photo illustration by Katie Kosma

Soraya Roberts | Longreads | February 2020 | 9 minutes (2,335 words)

I used to think I was the only one who dealt with this particular existential crisis. It’s the one where every choice you make coincides with the torture of knowing that you didn’t choose something else. And that something else, by virtue of not being chosen, has infinite potential for being the right choice. It’s a fallacy, of course. Because usually there is no right or wrong decision, just a decision. And when that decision is made, it’s not as final as all that. It’s one option in a series of options your life is made up of, some of which have bigger consequences, most of which have smaller ones. But that fallacy is what we bring to any prize or award or, you know, any competition that culminates in a reward of some kind. It makes sense, because it’s binary — you get it or you don’t — but the consequences usually aren’t. It certainly feels like your life will fundamentally change if you win, but more often than not that’s not the case. The choice is made, everyone goes ballistic, and pretty soon after everything goes back to how it was.

A South Korean movie with subtitles was not supposed to win four Oscars, an 18-year-old girl who makes music in her brother’s bedroom wasn’t supposed to take home five Grammys, and a foul-mouthed British woman shouldn’t have bagged three Emmys. There’s a cognitive dissonance to all of this, because, by now, we expect our institutions — Hollywood or otherwise — to make the wrong choices, which we expect because these institutions are populated by people who don’t actually reflect the world, only its most privileged citizens. And what’s a greater distillation of an out-of-touch industry’s allegiances and exclusions than the awards it bestows? The Emmys are The Big Bang Theory, the Grammys are “Shape of You,” the Oscars are Green Book. Filmmaker Bong Joon-ho, the one who took home those four statuettes for Parasite, could have been speaking about any number of ceremonies when he infamously said last year of the Oscars, “They’re very local.” Which I took to mean that the Academy tends to reward not only Americans, but work that expresses the white capitalist values that form American society (and Hollywood within it). When Parasite won, the dissonance didn’t just suddenly resolve itself, because we knew underneath that win that Hollywood itself hadn’t actually changed. So we burdened what should have been a moment of unadulterated joy with analysis — about the work, about the winner, about the voters, about the audience, about cinema. In Parasite terms, we covered it in peach fuzz.

* * *

It’s weird when deserving people win. It’s like a mindfuck. That’s what I thought (and tweeted) after Bong Joon-ho won the final Oscar of the year. What else do you say? It’s like being in the middle of a verbal sparring match with someone and they suddenly spit out something reasonable. You’re struck dumb. The Oscars almost never get it right, and when they get it wrong, it’s wrong (remember Crash?). This year, seeing the stage full of artists who are usually shut out of the ceremony — non-Americans, people of color, people with actual talent — accepting “Hollywood’s biggest honor” infected us all with such a severe case of cognitive dissonance I could hear our brains collectively short-circuit. And because of the way cognitive dissonance works, because it means we do everything we can to reconfigure the situation to align with what we believe to be true — in this case, that the Academy is “local” — Parasite’s Best Picture win was encumbered by mental acrobatics. It was as though no one wanted to get too intoxicated because they had experienced the sobering return to the status quo so many times before. The award became a spoil of war over identity politics, doubly here, because not only is Bong South Korean, but Parasite is also in Korean. That meant no one could just enjoy its triumphs outside the context of its ethnic dynamics.

It was barely more than a month ago that Issa Rae deadpanned, “Congratulations to those men,” while announcing the all-male Oscar nominees for Best Director. In the all-white-but-one category, the best we could hope for was a win by the Asian genius, who, as luck would have it, had also made the best film (enough about The Irishman). And when Bong’s film was announced after a suitably dramatic pause by Jane Fonda, it all went so smoothly, it was like it was meant to be. This wasn’t the Moonlight fiasco, that embarrassing stutter in 2017 where the ceremony juddered with a, yeah, no, the better one, the black one, that’s the one that won, sorry, where’s the trophy? But that historic faux pas is still so fresh that its shadow is still cast across the Academy’s stage. It’s a not-so-distant reminder that stories like those continue to be interlopers, and one that partially but inevitably eclipses wins like Bong’s, which, all things being fair, should not have to answer for it. But he does. Per Adam Nayman at The Ringer, “a skeptic might wonder about the enthusiasm of any filmmaker — even such an obviously wry, self-styled subversive — desiring membership to a club that’s not always open or accommodating.” It’s true, but it is also true that this is a wonder that does not tend to greet the likes of Martin Scorsese or Quentin Tarantino. Because nothing they do, nothing they or their films represent, really clashes with this particular gentlemen’s club. They are white men presenting films focused on white men to a group of white men. There is no dissonance there to correct.

Unless you’re Joaquin Phoenix, who briefly shouldered the dissonance plaguing his marginalized peers. Prior to his Oscar win, the Joker star was extolled on social media for his self-flagellating speech at the diversity-blind BAFTAs. “I think that we send a very clear message to people of color that you’re not welcome here,” he said, reportedly to some uncomfortable silence. “This is not a self-righteous condemnation because I’m ashamed to say that I’m part of the problem.” While Phoenix initially walked off the BAFTAs stage leaving his trophy behind, picking up the Oscar so soon after that implied a tacit acceptance of Hollywood’s problematic politics, if not Britain’s. Engaging in the awards ceremony, being bowled over by a win of any kind, implies that on some level you respect the institution, you believe in it. The only way around this, really, is full-out rejection.

Several actors have avoided any hint of hypocrisy by extricating themselves from awards proceedings entirely. Marlon Brando infamously sent an Indigenous woman to reject his Oscar on the grounds of the film industry’s mistreatment of the Indigenous community, while George C. Scott preceded him by refusing to participate in 1970 in what he called a “two-hour meat parade, a public display with contrived suspense for economic reasons.” (That he did engage later somewhat undercuts his stance.) This has bled outside the Academy, to other industries where awards act as the ultimate expression of their ideals: Julie Andrews snubbed the Tonys for snubbing the rest of her team, for one, while knighthood after knighthood has been declined over the years in protest of the enduring monarchy. After declining the Nobel Prize for Literature, Jean-Paul Sartre outlined how an award is inextricable from its awarding body and that body’s history. “The writer who accepts an honor of this kind involves as well as himself the association or institution which has honored him,” he wrote. “The writer must therefore refuse to let himself be transformed into an institution, even if this occurs under the most honorable circumstances, as in the present case.”

Increasingly aware that awards doled out by older institutions are misrepresentative of the culture and, in the case of the Grammys at least, so committed to misconduct that they will essentially fire even the CEO for confronting their sexism, artists have turned to smaller events for direction. Free of institutionalized myopia, these events move more fluidly with the times. Before the Nobel committee announced it was awarding genocide denier Peter Handke the literature prize, for instance, The New York Times published a conversation among critics in which the Booker Prize (big in the industry, less so outside of it) was floated as more indicative of the literary world’s proclivities; two women, Margaret Atwood and Bernardine Evaristo, shared the award the same year Handke won the Nobel. Meanwhile, the Independent Spirit Awards have openly owned their status as the official alternative, riffing this year — “we recognize female directors — all two of them!” — on the gaping lacunae the Oscar nominations left behind. Lulu Wang’s The Farewell won the top prize, while Adam Sandler secured a long-awaited win for his frenetic, lived-in performance in Uncut Gems. On the podium, the Sandman directly confronted the Academy he had only poked fun at on social media. He compared the situation to being passed over in high school for most good looking — in favor of a “feather-haired douchebag” — and winning best personality instead. “So let all of those feather-haired douchebag motherfuckers get their Oscars tomorrow night,” he said. “Their handsome good looks will fade in time, while our independent personalities will shine on forever.”

Oscar winner Bong does happen to have feathered hair, but cognitive dissonance still accompanied his victory as a corrective for how unexpected it was. Parasite won four awards, yes, but why no acting prizes? Racism, obviously. The wider skeptical responses to what appeared to be attempts by the Academy to be a little “woker” further unmasked them as shallow performance, sometimes literally. The opening Janelle Monáe–led musical number? “Diversity,” a number of critics of color deadpanned. Natalie Portman’s cape festooned with the cursive names of overlooked female filmmakers? Hypocrisy. Her production company has worked mostly with men. Meanwhile, Renée Zellweger’s win was just a reminder of Judy Garland’s lack of wins, and Joaquin Phoenix’s speech was more like an ad for PETA. The complaints had varying levels of validity, but why the impulse to make them so expediently? There seemed to be this overarching need to expose the flaws in what appeared to be a precarious night based on a set of arbitrary choices — to cast aside these momentary remedies to reveal the foundational faults that cannot in the long run support them. 

This is the drive to push for deeper systemic change where we can, to protest where there is nothing apparent to protest, to miss no chances. To revel in a win is to fleetingly ignore everything that’s wrong, and there’s no time left for that. A symbol of progress like Parasite thus becomes shackled by its own symbolism, dragging along the wider sociocultural implications with its artistry. It then becomes not only a perfectly executed piece of filmmaking, but the Oscar anomaly, the one which bolsters our expectations of the Academy, the foreign film which secures a wider theatrical run post-win, the popular nonwhite release standing in for all the nonwhite releases.

* * *

“Cognitive dissonance is a motivating state of affairs,” wrote social psychologist Leon Festinger, who coined the term. “Just as hunger impels a person to eat, so does dissonance impel a person to change his opinions or his behavior.” Bong didn’t expect to win at the Oscars. The dissonance he felt was clear in the way he admired his trophy on stage, the way he proceeded to lead a standing ovation for fellow nominee Scorsese, whom he quoted — “The most personal is the most creative” — and praised along with the remaining nominees: Tarantino, Todd Phillips, and Sam Mendes. “If the Academy allows,” he concluded, “I would like to get a Texas chainsaw, split the Oscar trophy into five and share it with all of you.” That the director from South Korea who made a quintessentially South Korean film felt the need to create a feeling of inclusivity on a quintessentially American stage says something about where America, if not the Oscars, is right now. That is to say, that marginalized communities, while protesting their historical treatment, can also recognize the merits of the institutions that have neglected them, deferring to aspects of their legacies despite their lack of diversity.

But the opposite is rarely true. The institutions and the people who represent them should be deferring to the populations that they have overlooked for so long. But they don’t; just look at Tarantino’s refusal at Cannes to even engage in a question about gender politics with respect to Once Upon a Time … in Hollywood. Which is why Phoenix’s words at the BAFTAs were so powerful: he was admitting that in some sense it is a zero-sum game, that his chance denied someone else’s, that he was complicit in this denial. It was groundbreaking when it really shouldn’t have been, when for nonwhite filmmakers like Bong this level of discourse is expected.

Generally, it’s up to the outsiders to help other outsiders. On the Oscars red carpet, Bong made sure to mention Lulu Wang’s The Farewell, which had been overlooked, despite taking Best Picture at the Independent Spirit Awards. Insiders seem to miss this heightened urgency around inclusivity because it is not urgent for them. Critics clamored to determine what Parasite’s win could mean for American cinema, but that question was beside the point. The unexpected win by an international artist on domestic soil says less about the cracks in Hollywood’s traditions than it does about the world, which almost imperceptibly but certainly is changing both despite us and because of us, both for the worse and for the better, with marginalized populations leading the biggest changes of all. As always, Bong was already aware of this communal dissonance before everyone else. As he said at the Lumière Festival in October: “When I made Parasite, it was like trying to witness our world through a microscope. The film talks about two opposing families, about the rich versus the poor, and that is a universal theme, because we all live in the same country now: that of capitalism.”  

* * *

Soraya Roberts is a culture columnist at Longreads.

Soli/dairy/ty

The Image Bank / Getty Images Plus, Luis Villasmil / Unsplash, Photo illustration by Katie Kosma

Liza Monroy | Longreads | February 2020 | 15 minutes (3,637 words)

On the verge of turning 40, I felt all my habits were ingrained. So I was surprised when, late last February, I became vegan one morning, following an intuitive stab out of the ether. It made no sense, not yet, and Joaquin Phoenix’s viral Oscar speech was still a year into the future, but I’d promised myself to always follow my instincts after that little voice within had, 10 years prior, attempted to warn me to hide my laptop before leaving my apartment. Perplexed by the absurdity of this non-thought, I’d ignored it, only to return to find the laptop submerged in the bathtub, fallen victim to a vengeful ex-boyfriend’s rage. Life had since quieted and so had the little voice, until it resurfaced whispering, be vegan for the month of March.

As a 20-year ovo-lacto vegetarian-with-a-sushi-exemption, I found the hunch puzzling. Still, the voice had spoken, so I didn’t question it, though I did start searching for reasons. As a second-time mother to an infant, then seven months old, I felt lacking in structure, focus, and goals, and veganism gave me a way to try to put some version of that back into my life. Or perhaps, like a culinary Oulipian, I’d find that further constraints would spike creativity, breaking my egg-and-cheese-bagel, salmon-nigiri routine with more colorful vegetables. What I definitely wasn’t thinking: dairy cows, other than to joke that, hooked up to my mechanical breast pump, I felt like one.

Though I couldn’t pinpoint a rationale for my non-choice, I knew what I wasn’t and would never become: one of those unpleasant extremists who espouse “radical vegan propaganda,” who harass you with pamphlets depicting the horrifying conditions of factory farms.

And then I went to VegFest. The pamphlet was lying on a table with others containing recipe ideas and shopping lists. But this one, about the practices of the dairy industry, caught my nursing-mama attention in a new way: “A cow must regularly give birth to produce profitable amounts of milk,” it read. Though I was against killing animals, I’d believed dairy was only a matter of taking something that was already there. I’d operated under the assumption that milking a cow was taking a nutritionally beneficial substance that would otherwise go to waste, as if all dairy cows were overproducers like me, milk running in streams. I’d never encountered this simple information about their pregnancy. “Similar to humans,” the pamphlet continued, “a cow’s gestation period is about nine months. In that time she develops a strong desire to nurture her baby calf — a calf that will be taken from her hours or days after birth. Cows can live more than 20 years, however they’re usually slaughtered once lactation decreases at about 5 years of age.”

At first it was the babies being taken away that got me. Motherhood had instilled in me an understanding of the deep, cellular-level, biological attachment to the calf. It must not be entirely true, I insisted to myself. This pamphlet was the dreaded “militant vegan propaganda.” I went online in search of contradictory information, but even meat-industry trade publications indicated this process is a simple matter of fact, nothing to get worked up about.

An article by rancher Heather Smith Thomas in Beef Magazine states that “there’s a complex hormone system involved in causing birth and initiating lactation.” Pregnancy and birth for a cow entail a physiological process nearly identical to humans’. The mother’s body produces oxytocin during labor, bonding her to her calf and bringing on a strong desire to nurse. Exactly like the pamphlet said. Exactly like my own experience.

Suddenly, I felt a little, well, militant in spite of myself. The timing of having recently become a small-scale milk producer again made it obvious in retrospect: milk wasn’t just there, in mammals’ mammary glands. You had to have a baby to get it there. I didn’t just happen to have milk in my udders either — I had to get pregnant and give birth before it came and turned my breasts into hot, painful footballs only my baby or a horrible breast pump could relieve. I’d had no idea my beloved ice cream and pizza were the cause of suffering. But dairy cows with lower production rates are not economically viable. They are sent sooner to slaughter.

Sailesh Rao, a Stanford PhD and former systems engineer who founded Climate Healers, a nonprofit fighting climate change, told me: “During a visit to the Kumbalgarh Wildlife sanctuary in India I observed how the forest was being destroyed by cows eating anything new growing out of the ground while old-growth trees were being cut down. I realized it was even better to eat some beef to finish off the cows after I had exploited them for milk. I resolved to go vegan on the spot.”

Environmental reasons were obvious, but on the compassion front, for years I’d taken imagery on dairy-milk cartons literally: peaceful cows standing in fields beside gentle farmers seated on stools, red barn in the background under a vast open sky. Was that the real propaganda? In YouTube videos of the routine dairy-farm practice of taking newborn calves from their mothers, the distress cries sound chillingly like daycare drop-off, except the afternoon reunion will never come.

I grabbed a couple of magnets and affixed the pamphlet to the fridge.