
Falling Stars: On Taking Down Our Celebrity Icons

Illustration by Homestead

Soraya Roberts | Longreads | May 2019 | 7 minutes (1,868 words)

The shorthand iconography of the star has been the iconography of excess — furs, gold, pearls, diamonds, stacks of cash, lots of lights, lots of people. It’s luxury personified, the human being at its apex, the kind of intermediary between gods and humans that the ancient Egyptians didn’t just dress with jewels, but buried with them, transcending mortality. And who doesn’t want to be immortal? Especially these days, when we are very much the opposite: when aspiration has been replaced with desperation and extinction is the inevitable end, or maybe hell, but definitely not heaven. The old accoutrements of success, the ones that defined celebrity — wealth, power, decadence — are going extinct too. And anyone who continues to buy into them is either performing satire (see Billy Porter in city-spanning golden wings) — or is, well, Drake.

The “God’s Plan” singer, who upon last estimation was worth around $90 million, unveiled his own private Boeing 767 cargo plane, Air Drake, in an Instagram video last week, a pair of praying hands on the tail fin speaking for us all. “No rental, no timeshare, no co-owners,” he said. No reality check either, apparently. While Drake framed it as his way of supporting a homegrown business (Ontario’s Cargojet), his very own “Heat of the Moment” lyrics — “All the niggas we don’t need anymore / And all the cops are still hangin’ out at the doughnut shops / Talkin ’bout how the weather’s changin’ / The ice is meltin’ as if the world is endin’” — caused a number of people to point out his hypocrisy. (He captioned the video, “Nothing was the same for real,” which I don’t believe is a reference to the planet’s demise, but maybe he was being meta.) It had been only seven months since Kanye and Kim Kardashian West were vilified for flying aboard a 660-seater Boeing. Basically alone. “No big deal,” Kardashian West said on Instagram. “Just like a chill room. This is, like, endless.” No, there’s an end. Their chill trip happened less than two months after the end days climate report came out.

At one point these stars were icons of the kind of success we aspired to. But having seen how the old capitalist system they symbolize has destroyed the world, the movement to destabilize it has also become a movement to destabilize them as its avatars. This includes idols of technology like Mark Zuckerberg, the once-envied wunderkind who is now someone who should be held “accountable”; business giants like Disney CEO Bob Iger, whose compensation is “insane” according to one member of the family dynasty; and political stars like Pete Buttigieg and Beto O’Rourke, both of whom were called out for their campaigns’ big donors. In our culture today, the guy who makes music out of his closet has the No. 1 song on the Billboard Hot 100 chart and the revolutionaries are schoolchildren. “The star is meant to epitomize the potential of everyone in American society,” writes P. David Marshall in Celebrity and Power: Fame in Contemporary Culture. “The dialectical reality is that the star is part of a system of false promise in the system of capital.”

* * *

The debate over whether success should be defined by wealth goes as far back as civilization itself. I asked my brother, a philosophy professor specializing in the ancients (I know), when it first turned up in the literature, and he told me it was “the base note” through most of Plato. Then there was Socrates, who thought knowledge, not wealth, should be the marker of success, versus Aristotle, who thought wealth was essential to the good life. Regardless of their differences, greed, my brother said, was almost always considered pathological. But then along came capitalism, a term popularized (peut-être) by the French socialist Louis Blanc, who, in Organisation du Travail, defined it as “the appropriation of capital by some to the exclusion of others.” Within capitalism, greed became associated with productivity, which was correlated with a successful economy, and so greed was good (you try not to quote Gordon Gekko!). Along with it, those who were greedy were accepted, even admired, under certain conditions. A 2015 study had a bunch of U.K. teenagers excusing Bill Gates’s extreme wealth (more than $100 billion) as merit-based, the necessary evil of a capitalist system in which a hard-working individual can triumph the way they would like to one day.

The celebrity is the ultimate symbol of success, which, under capitalism, becomes the ultimate symbol of greed. “Celebrities reinforce the conception that there are no barriers in contemporary culture that the individual cannot overcome,” writes Marshall. And though Julius Caesar ended up on a coin, dating the monetization of fame back to ancient Rome, you can blame the French Revolution for a modern star like James Charles, who launched a YouTube channel of makeup tutorials at age 16 and within four years had more than 1.7 billion views. After the monarchy was overthrown, power and fame no longer required inheritance, which is why celebrity is sometimes (erroneously) associated with rebellion. But while the common man was ascending, so was individualism, along with mass media and the industrial revolution. The lord and serf were replaced by the businessman and employee and bourgeois culture expanded at the expense of its working-class analog. The icon of this new capitalist society, which had been weaned on the Romantic Era’s cult of personality, was the commodified individual who reinforced consumption: the celebrity. As Milly Williamson explains in Celebrity: Capitalism and the Making of Fame, “Celebrity offers images of inclusion and plenty in a society shaped by exclusion and structured in want.”

Is anyone playing the Kim Kardashian: Hollywood game anymore? The object was to use anything you had access to, whether material, money, or people, to advance. It was clearly a meta-tongue-in-cheek bit of cutesy puff, but it also wasn’t. Kim Kardashian West is you in the game and you in real life. Consumerism isn’t just consumption, it’s emulation. We consume to improve ourselves as individuals — to make ourselves more like Kardashian West, who is presented as the pinnacle of success — as though our self-actualization were directly associated with our purchasing power. And the same way we have commodity selves (I am Coke, not Pepsi; Dell, not Mac) we have celebrity selves. For instance, I’m a Winona Ryder person, not a Gwyneth Paltrow person (is anyone?). So my identity could very well be solidified based on whether I can find that Tom Waits shirt she always wears. And in these days of faces of brands, shaping yourself around Kim Kardashian West can actually mean shaping yourself around a $15,000 dress. “It is pointless to ask what Kim Kardashian does to earn her living: her role is to exist in our minds,” writes George Monbiot in The Guardian. “By playing our virtual neighbour, she induces a click of recognition on behalf of whatever grey monolith sits behind her this week.”

So who cares, right? So what if I want to be a $5,000 Louis Vuitton bag slung over Michelle Williams’s shoulder? It’s a little limiting, I guess, but fine (maybe?) — if we can trust the world to run fairly around us. According to a 2007 study in the International Journal of Cultural Studies, Brits who closely followed celebrity gossip over other types of news were half as likely to volunteer, less politically engaged, and the least likely to vote or protest. “It’s the capacity of these public figures to embody the collective in the individual,” writes Marshall, “which identifies their cultural signs as powerful.” It also identifies them as inert proxies for real community action. There is a veneer of democracy to consumerism, in that we are free to choose what we buy. But we are exercising our freedom only through buying (never mind that the options aren’t infinite); we are not defined as citizens, but as consumers. That the consumer has eclipsed the citizen explains in part why the appeals around climate change have been increasingly directed at the individual, pointing out how they will personally suffer if the world around them does — in a sea of individuals, the planet’s distress was not impetus enough. “The most important democratic achievements have been the result of working-class struggle and collective movements,” writes Williamson. “What is really extraordinary about working-class identity is not the potential celebrity in each of us, but precisely the solidarity and collectivity that is largely hidden from media representations of ordinary people.”

* * *

When Time released its list of the 100 most influential people in the world last month, I noticed that under the Icons category one of the images was a silhouette. Among all of those colourful portraits of famous faces, Mirian G. was an individual erased. I initially thought it was a power move, that this woman had chosen to trade in her identity for a larger cause. It turned out she was a Honduran asylum seeker, part of a class-action suit filed by the ACLU on behalf of families separated at the border, and that she had to be anonymous to protect herself. “In 2018, over 2,700 children were separated from their parents at the U.S.-Mexico border,” wrote Kumail Nanjiani. “Since that number is so unfathomably large, I think it is helpful to focus on one woman’s story.” In essence, the magazine found a way around the individual-as-icon, turning a spot for one into representation for many. It was a timely move.

It’s not that fame has become defunct — one study found that a number of millennials would literally trade their family for it — but celebrity isn’t the opiate it once was. Younger generations side-eye star endorsements, while online influencers, who affect the tone of friendly advice, have acquired monumental cachet. (Though James Charles recently lost millions of YouTube subscribers following a very public fallout with fellow beauty vlogger Tati Westbrook, he still has more than 13 million.) It comes with a catch, though: Millennials will actually pay more for brands that are socially responsible. This aligns with the growing number of young activists, not to mention the U.S.’s youth voter turnout in 2018, the highest in a midterm election since 1982. As Williamson concludes, “celebrity culture presents the human in commodity form, but it also consists of its opposite — the human can never be fully contained by the self-as-commodity, and the persistence of humanity is, in all circumstances, a cause for hope.”

While the citizen and consumer were once conflated, they now coexist, a separation that sometimes leads them to be at odds. The celebrity, the symbol of the latter, can in the same way clash with the former. In a context like this, Alyssa Milano’s ill-conceived sex strike, the latest case of a celebrity ham-fistedly endorsing feminist activism, is no longer simply swallowed in good faith. There is no good faith left, not even for our stars. They are symbols of an economy that consumes everything in its path, and struggling with them is part of a collective struggle with the inequitable, exploited world we live in, one in which each callout will hopefully add up to some semblance of change.

* * *

Soraya Roberts is a culture columnist at Longreads.

High Expectations: LSD, T.C. Boyle’s Women, and Me

Illustration by Homestead

Christine Ro | Longreads | May 2019 | 16 minutes (4,208 words)

I’m sweaty, exhausted, and red-faced when I finally emerge from my final acid trip. My apartment is a mess of objects my friends and I have tried feeling, smelling, or otherwise experiencing: loose dry pasta, drinks of every kind, hairbrushes, blankets. My voice is hoarse from talking or shouting all night. I’ve had more emotional cycles in the past 12 hours than in the last several months combined.

What made me want to drop acid wasn’t a friend or a festival, but a book. Specifically, T.C. Boyle’s new novel Outside Looking In. The book has its problems, but one thing it gets right is the intensely social experience of LSD. Even taken alone, even as a tool for introspective reflection, it rejigs attitudes towards other people. This can be a gift, or it can be a weapon. And as a woman, I’m especially aware of the potential for the latter. Read more…

Why the Moon Is Suddenly a Hot Commodity

FeatureChina via AP Images

Since astronauts last walked on the moon in 1972, no person has visited this cold lunar body, but a renewed interest in the moon as an economic and scientific resource has launched a new space race. For The New Yorker, Rivka Galchen explores why many countries and private interests, from Boeing to Jeff Bezos, are developing technology and plans to put people back on the moon to mine it, inhabit it, and use it to launch other spacecraft.

“Now, you will ask me what in the world we went up on the Moon for,” Qfwfq, the narrator of Italo Calvino’s “Cosmicomics,” says. “We went to collect the milk, with a big spoon and a bucket.” In our world, we are going for water. “Water is the oil of space,” George Sowers, a professor of space resources at the Colorado School of Mines, in Golden, told me. On the windowsill of Sowers’s office is a bumper sticker that reads “My other vehicle explored Pluto.” This is because his other vehicle did explore Pluto. Sowers served as the chief systems engineer of the rocket that, in 2006, launched NASA’s New Horizons spacecraft, which has flown by Pluto and continued on to Ultima Thule, a snowman-shaped, nineteen-mile-long rock that is the most distant object a spacecraft has ever reached. “I only got into space resources in the past two years,” he said. His laboratory at the School of Mines designs, among other things, small vehicles that could one day be controlled by artificial intelligence and used to mine lunar water.

Water in space is valuable for drinking, of course, and as a source of oxygen. Sowers told me that it can also be transformed into rocket fuel. “The moon could be a gas station,” he said. That sounded terrible to me, but not to most of the scientists I spoke to. “It could be used to refuel rockets on the way to Mars”—a trip that would take about nine months—“or considerably beyond, at a fraction of the cost of launching them from Earth,” Sowers said. He explained that launching fuel from the moon rather than from Earth is like climbing the Empire State Building rather than Mt. Everest. Fuel accounts for around ninety per cent of the weight of a rocket, and every kilogram of weight brought from Earth to the moon costs roughly thirty-five thousand dollars; if you don’t have to bring fuel from Earth, it becomes much cheaper to send a probe to Jupiter.

Experts predict a host of benefits and problems with this renewed lunar interest, from environmental damage to political tensions involving the 1967 Outer Space Treaty. Some, like the planetary scientist Philip Metzger, see the moon as more of a solution than a problem.

“There’s the argument that we’ve destroyed the Earth and now we’re going to destroy the moon. But I don’t see it that way,” Metzger said. “The resources in space are billions of times greater than on Earth. Space pretty much erases everything we do. If you crush an asteroid to dust, the solar wind will blow it away. We can’t really mess up the solar system.”

Read the story

The Women Characters Rarely End Up Free: Remembering Rachel Ingalls

Gaia Banks / New Directions Publishing

Ruby Brunton | Longreads | April 2019 | 10 minutes (2,674 words)

Rachel Ingalls, who passed away earlier this year at the age of 78, was a writer who did not seek out the spotlight, but found it not at all unpleasant when at last it came. Beyond a small circle of loyal friends and regular visits to Virginia to see her family, Ingalls lived a fairly reclusive existence after her move from the U.S. to the U.K. in 1965. “I’m not exactly a hermit,” she said, “but I’m really no good at meeting lots of strangers and I’d resent being set up as the new arrival in the zoo. It’s just that that whole clubby thing sort of gives me the creeps.”

A writer of fantastical yet slight works of fiction, with a back catalog numbering 11 titles in total, Ingalls flew more or less under the literary radar until recent years, when the newfound interest that followed the 2017 re-issue of her best-known book, Mrs. Caliban, finally allowed her readers to learn about her processes and motivations; the attention slowly brought her into the public eye. Reviews across the board revered the oddly taciturn novella, in which mythic elements and extraordinary happenings are introduced into the lives of otherwise normal people in prose remarkable for its clarity and quickness. “Ingalls writes fables whose unadorned sentences belie their irreducible strangeness,” wrote Lidija Haas in The New Yorker; in the same piece she described Ingalls as “unjustly neglected.” (Mrs. Caliban was also lightheartedly celebrated as a venerable addition to popular culture’s mysterious year of fish sex stories, a fittingly strange introduction of her work to a broader readership.) Read more…

A Woman’s Work: The Inside Story

All artwork by Carolita Johnson.

Carolita Johnson | Longreads | April 2019 | 23 minutes (5,178 words)

The subject of my pre-doctoral studies was medieval nuns and their relationship to their menstrual cycles. Long story short: my theory was that this relationship was determined by the very real divide between the early Christians who favored the Old Testament and those who favored the New Testament on the question of the inherent “sinfulness” (or absence thereof) of the human body. The traditional, Old Testament attitude that menstruation made women “unclean” somehow prevailed. Fancy that. Call me crazy, but I had to believe that the way the Church, the Patriarchy, and all of society saw women’s bodily functions had an effect on women’s relationships with their bodies.

Stories of menstruating women ruining mirrors they looked into, causing soufflés to fall, farm animals to miscarry, mayonnaise to “not take,” etc., and of menstrual blood used as an ingredient in cures for leprosy or in magic potions, were common. But even if such stories were all but forgotten by modern times, they merge easily into my being taught, in the 1980s, to call my period “The Curse.”

I’d noticed, in many hagiographies, that one of the first signs a woman might be a saint, besides experiencing ecstatic “visions,” was that she’d barely, if at all, need to eat or drink anymore, and her various bodily secretions would cease. I wondered if nuns might be using herbs, self-starvation, and/or physical exertion to put an end to their secretions, their periods among them.

Compare this to how, in modern times, many women, including myself, would use The Pill without the classic 7-day pause in dosage to skip an inconveniently timed period. This pause was designed to give women on the Pill a “period” that was more symbolic than functional, almost more of a superstition, and totally unnecessary, medically speaking. Recent years have even seen the introduction of contraceptive pills actually designed to limit a woman to 0-4 periods a year — hormonally inducing amenorrhea, or absence of menstruation. There are times when women want to avoid having their periods, for example, during vacations, sports events (with the notable exception of Kiran Gandhi), honeymoons; in other words, times when we want to be at our best and free of physical impairments or, let’s be frank: free from the anxiety of being discovered menstruating. Some of us opt to be free from that anxiety year-round now. I think medieval nuns would have loved to have that option.

Read more…

When Did Pop Culture Become Homework?

Kevin Winter / Getty, Collage by Homestead

Soraya Roberts | Longreads | April 2019 | 6 minutes (1,674 words)

I didn’t do my homework last weekend. Here was the assignment: Beyoncé’s Homecoming — a concert movie with a live album tie-in — the biggest thing in culture that week, which I knew I was supposed to watch, not just as a critic, but as a human being. But I didn’t. Just like I didn’t watch the premiere of Game of Thrones the week before, or immediately listen to Lizzo’s Cuz I Love You. Instead, I watched something I wanted to: RuPaul’s Drag Race. What worse place is there to hide from the demands of pop culture than a show about drag queens, a set of performance artists whose vocabulary is almost entirely populated by celebrity references? In the third episode of the latest season, Vietnamese contestant Plastique Tiara is dragged for her uneven performance in a skit about Mariah Carey, and her response shocks the judges. “I only found out about pop culture about, like, three years ago,” she says. To a comically sober audience, she then drops the biggest bomb of all: “I found out about Beyoncé legit four years ago.” I think Michelle Visage’s jaw might still be on the floor.

“This is where you all could have worked together as a group to educate each other,” RuPaul explains. It is the perfect framing of popular culture right now — as a rolling curriculum for the general populace, one that determines whether you make the grade as an informed citizen or not. It is reminiscent of an actual educational philosophy from the 1930s, essentialism, which was later adopted by E.D. Hirsch, the man who coined the term “cultural literacy,” defined as “the network of information that all competent readers possess.” Essentialist education emphasizes standardized common knowledge for the entire population, which privileges the larger culture over individual creativity. Essentialist pop culture does the same thing, flattening our imaginations until we are all tied together by little more than the same vocabulary.

* * *

The year 1987 was when Aretha Franklin became the first woman inducted into the Rock and Roll Hall of Fame, the Simpson family arrived on television (via The Tracey Ullman Show), and Mega Man was released on Nintendo. It was also the year Hirsch published Cultural Literacy: What Every American Needs to Know. None of those three pieces of history were in it (though People published a list for the pop-culturally literate in response). At the back of Hirsch’s book, hundreds of words and quotes delineated the things Americans need to know — “Mary Had a Little Lamb (text),” for instance — which would be expanded 15 years later into a sort of CliffsNotes version of an encyclopedia for literacy signaling. “Only by piling up specific, communally shared information can children learn to participate in complex cooperative activities with other members of their community,” Hirsch wrote. He believed that allowing kids to bathe in their “ephemeral” and “confined” knowledge about The Simpsons, for instance, would result in some sort of modern Tower of Babel situation in which no one could talk to anyone about anything (other than, I guess, Krusty the Clown). This is where Hirsch becomes a bit of a cultural fascist. “Although nationalism may be regrettable in some of its worldwide political effects, a mastery of national culture is essential to mastery of the standard language in every modern nation,” he explained, later adding, “Although everyone is literate in some local, regional, or ethnic culture, the connection between mainstream culture and the national written language justifies calling mainstream culture the basic culture of the nation.”

Because I am not very well-read, the first thing I thought of when I found Hirsch’s book was that scene in Peter Weir’s 1989 coming-of-age drama Dead Poets Society. You know the one I mean, where the prep school teacher played by Robin Williams instructs his class to tear the entire introduction to Understanding Poetry (by the fictional author J. Evans Pritchard) out of their textbooks. “Excrement,” he calls it. “We’re not laying pipe, we’re talking about poetry.” As an alternative, he expects this class of teenagers to think for themselves. “Medicine, law, business, engineering, these are all noble pursuits, and necessary to sustain life,” he tells them. “But poetry, beauty, romance, love, these are what we stay alive for.” Neither Pritchard nor Hirsch appears to have subscribed to this sort of sentiment. And their approach to high culture has of late seeped into low culture. What was once a privileging of certain aspects of high taste has expanded into a privileging of certain “low” taste. Pop culture, traditionally maligned, now overcompensates, essentializing certain pieces of popular art as additional indicators of the new cultural literacy.

I’m not saying there are a bunch of professors at lecterns telling us to watch Game of Thrones, but there are a bunch of networks and streaming services that are doing that, and viewers and critics following suit, constantly telling us what we “have to” watch or “must” listen to or “should” read. Some people who are more optimistic than me have framed this prescriptive approach as a last-ditch effort to preserve shared cultural experiences. “Divided by class, politics and identity, we can at least come together to watch Game of Thrones — which averaged 32.8 million legal viewers in season seven,” wrote Judy Berman in Time. “If fantasy buffs, academics, TV critics, proponents of Strong Female Characters, the Gay of Thrones crew, Black Twitter, Barack Obama, J. Lo, Tom Brady and Beyoncé are all losing their minds over the same thing at the same time, the demise of that collective obsession is worth lamenting — or so the argument goes.” That may sound a little extreme, but then presidential hopeful Elizabeth Warren blogs about Game of Thrones and you wonder.

Essentializing any form of art limits it, setting parameters on not only what we are supposed to receive, but how. As Wesley Morris wrote of our increasingly moralistic approach to culture, this “robs us of what is messy and tense and chaotic and extrajudicial about art.” Now, instead of approaching everything with a sense of curiosity, we approach with a set of guidelines. It’s like when you walk around a gallery with one of those audio tours held up to your ear, which is supposed to make you appreciate the art more fully, but instead tends to supplant any sort of discovery with one-size-fits-all analysis. With pop culture, the goal isn’t even that lofty. You get a bunch of white guys on Reddit dismantling the structure of a Star Wars trailer, for instance, reducing the conversation around it to mere mechanics. Or you get an exhausting number of takes on Arya Stark’s alpha female sex scene in Game of Thrones. One of the most prestige-branded shows in recent memory, the latter in particular often occupies more web space than its storytelling deserves precisely because that is what it’s designed to do. As Berman wrote, “Game of Thrones has flourished largely because it was set up to flourish — because the people who bankroll prestige television decided before the first season even went into production that this story of battles, bastards and butts was worth an episodic budget three times as large as that of the typical cable series.” In this way, HBO — and the critics and viewers who stan HBO — have turned this show into one of the essentials even if it’s not often clear why.

Creating art to dominate this discursive landscape turns that art into a chore — in other words, cultural homework. This is where people start saying things like, “Do I HAVE to watch Captain Marvel?” and “feeling a lot of pressure to read sally rooney!” and “do i have to listen to the yeehaw album?” This kind of coercion has been known to cause an extreme side effect — reactance, a psychological phenomenon in which a person who feels their freedom being constricted adopts a combative stance, turning a piece of art we might otherwise be neutral about into an object of derision. The Guardian’s Oliver Burkeman called it “cultural cantankerousness” and used another psychological concept, optimal distinctiveness theory, to further explain it. That term describes how people try to balance feeling included and feeling distinct within a social group. Burkeman, however, framed his reactance as a form of self-protective FOMO avoidance. “My irritation at the plaudits heaped on any given book, film or play is a way of reasserting control,” he wrote. “Instead of worrying about whether I should be reading Ferrante, I’m defiantly resolving that I won’t.” (This was written in 2016; if it were written now, I’m sure he would’ve used Rooney).

* * *

Shortly after Beyoncé dropped Homecoming, her previous album, Lemonade, became available on streaming services. That one I have heard — a year after it came out. I didn’t write about it. I barely talked about it. No one wants to read why Beyoncé doesn’t mean much to me when there are a number of better critics who are writing about what she does mean to them and so many others (the same way there are smart, interested parties analyzing Lizzo and Game of Thrones and Avengers: Endgame and Rooney). I am not telling those people not to watch or listen to or read or find meaning there; I understand people have different tastes, that certain things are popular because they speak to us in a way other things haven’t. At the same time, I expect not to be told what to watch or listen to or read, because from what I see and hear around me, from what I read and who I talk to, I can define for myself what I need. After Lemonade came out, in a post titled “Actually,” Gawker’s Rich Juzwiak wrote, “It’s easier to explicate what something means than to illustrate what it does. If you want to know what it does, watch it or listen to it. It’s at your fingertips. … Right is right and wrong is wrong, but art at its purest defies those binaries.” In the same way, there is no art you have to experience, just as there is no art you have to not experience. There is only art — increasingly ubiquitous — and there is only you, and what happens between both of you is not for me to assign.

* * *

Soraya Roberts is a culture columnist at Longreads.


Just a Spoonful of Siouxsie

Illustration by Mark Wang

Alison Fields | Longreads | April 2019 | 14 minutes (3,609 words)

She showed up on an overcast Friday afternoon in January. She barreled into the driveway in an old mustard-gold Buick with a black vinyl top, its back dash decorated with plastic bats, novelty skulls, and dried flowers. She was wrapped in black sweaters, black tights, black boots. She wore clunky bracelets, loads of them on the outside of her sleeves. Her hair was long and henna red. She carried an Army surplus satchel pinned with old rhinestone brooches and Cure buttons. She was 19 years old. When I opened the front door and she smiled at me, I thought she was the most perfect person I’d ever seen.

“I’m Gwen,” she said. “I’m here to interview for the nanny job.”

That’s when I noticed the nose ring and I blubbered something incoherent, then apologized because I was both overwhelmed and mortified that someone this cool was going to come into my stupid house.

Read more…

‘Midwesterners Have Seen Themselves As Being in the Center of Everything.’

Cincinnati, Ohio, early 20th century. Kraemer's/Cincinnati Museum Center/Getty.

Bridey Heing | Longreads | April 2019 | 10 minutes (2,589 words)


The American Midwest is hard to define. Even which states can be considered “Midwestern” depends on whom you ask: is it what lies between Ohio and Iowa? Or does the Midwest stretch further west across the Great Plains; north into Wisconsin, Minnesota, and the Dakotas; or east into parts of Pennsylvania and New York state? Perhaps part of the confusion over the term is rooted in the idea that the Midwest represents far more than a geographic space — it represents a vision of the country as a whole, and is a stand-in for nostalgia, despite the fact that the reality of the nation, and the Midwest along with it, has always been far messier than any myth.

In her new book, The Heartland: An American History, University of Illinois professor Kristin L. Hoganson tells the story of the region through its links to the rest of the world. Arguing that the Midwest, centered here on Illinois, has long been misunderstood as far more provincial and isolated than it actually is, Hoganson lays out the ways in which international relationships have shaped the economy and identity of the region. She also examines part of the region’s complicated history with race, and the way some stories have been obscured, giving everyone — outsiders and locals alike — a warped idea of who has a claim to the most all-American of places. Read more…

The Top 5 Longreads of the Week

A U.S. soldier looks towards the military prison known as 'Gitmo.' (Photo by John Moore/Getty Images)

This week, we’re sharing stories from Ben Taub, Paige Blankenbuehler, Alex Horton, Victoria Gannon, and Gustavo Arellano.

Sign up to receive this list free every Friday in your inbox. Read more…

MFA vs. NYC: A Reading List

42nd Street with Chrysler Building during Manhattanhenge in 2018, captured in Manhattan, NYC. (Getty Images)

Near the end of my MFA, someone asked what my plans were after graduation. Before allowing me to answer, he said, somewhat wistfully, that he thought I should move to New York City and “live a little” before writing anything else. In the moment, I probably nodded politely and smiled, as I’m prone to doing, but his suggestion frustrated me. How, after living for two years on a barely sufficient stipend, did he expect that I’d be able — or want — to fling myself across the country to a city with exorbitant rent prices where I had no job, no insurance, and no community? And what did he mean by living? Had I not been living during the two years of my MFA, during which I moved to an unfamiliar-to-me city, taught classes at the university for the first time, learned to edit a journal, found my way into a community of writers, and struggled in draft after draft to improve my own prose?

Instead of moving to New York City, I did what might be considered the opposite: I started a PhD in creative writing in the middle of Oklahoma, which I’m finishing up now. During my years here, I’ve certainly grown as a writer and a teacher, and had the opportunity to build lasting relationships with people who have supported me in innumerable ways. But I have also remained aware of the problems within academia: there is, for example, a food pantry for graduate students in the room across from my office; a lack of diversity within my program and many others; and a job market that dwindles every year. Sometimes I think back to that person telling me to move to NYC, and I wonder who I might be now — as a writer, as a person, as a professional — had I “lived life” rather than pursuing another degree. I’ve probably thought about his offhand comment more than I should, but it also seems to encapsulate some of the larger conversations about the function of MFA and PhD creative writing programs and the various pros and cons of making a life as a writer within or outside of academia.

More interesting to me than prescribing one way of life over another, however, is to examine the challenges and sources of nourishment in each, and to wonder about the possibilities that exist beyond a reductive dichotomy. The essays curated in this reading list illuminate problems that exist within MFA and PhD creative writing programs, explore the idea of mentorship both within and outside of the academy, and offer insight on how to live a fruitful writing life without the support and constraints of a formal program.

1. MFA vs. NYC (Chad Harbach, November 26, 2010, Slate)

In this viral essay, Chad Harbach theorizes about how MFA programs are influencing both the craft and the professional development of fiction writers, as well as reshaping the landscape of publishing.

It’s time to do away with this distinction between the MFAs and the non-MFAs, the unfree and the free, the caged and the wild. Once we do, perhaps we can venture a new, less normative distinction, based not on the writer’s educational background but on the system within which she earns (or aspires to earn) her living: MFA or NYC.

Related reads: Which Creates Better Writers: An MFA Program or New York City? (Leslie Jamison, February 27, 2014, The New Republic) and “MFA vs NYC”: Both, Probably (Andrew Martin, March 28, 2014, The New Yorker)

2. Going Hungry at The Most Prestigious MFA in America (Katie Prout, Lit Hub)

The idea of writers living without substantial income is one that’s sometimes romanticized, as Katie Prout notes while listening to an audiobook of A Moveable Feast, in which Hemingway says that “he and Pound agreed that the best way to be a writer is to live poorly.” One month away from turning 30, Prout writes about the realities — which include food banks and multiple jobs — of living with very little money while pursuing her MFA at Iowa.

I’m an instructor at the university where I attend the best nonfiction writing program in the country, and I make approximately $18,000 a year before taxes. When I was denied a second teaching assistantship at the university this summer for the upcoming school year even though I already had signed a contract with the offering department, my director explained that it was in the school’s best interests to look after my best interests, and my best interest was to make sure that I had time [to] write my thesis.

3. Every Day is a Writing Day, With or Without an MFA (Emily O’Neill, November 27, 2018, Catapult)

The requirement to relocate and the insufficiency of fully funded spots are just two of many reasons why MFA degrees are out of reach for many people, as Emily O’Neill explains in this essay about how she nurtures a writing life outside of the academy.

I don’t have an MFA. It often makes me feel like the man on that mortifying date to admit this to writers I don’t know well. So many people who write are academics or at least aspiring to an MFA or PhD, and mentioning I don’t feel specifically drawn to the demands of graduate school is often seen as a sin against literature.

4. Woman of Color in Wide Open Spaces (Minda Honey, March 2017, Longreads)

After two years, Minda Honey longs to escape from the whiteness of her MFA program, and plans a trip to four national parks, not realizing that “80% of National Parks visitors and employees are white.” Weaving together moments from her travels and memories from her writing program, Honey lays bare the lack of diversity in both spaces.

When I’d first started my MFA program, I thought it would be an escape from the oppressive whiteness of Corporate America. I thought without suits to button my body into, I would be free to exist. But Academia proved to be just as oppressive.

5. How Applying to Grad School Becomes a Display of Trauma for People of Color (Deena ElGenaidi, April 17, 2018, Electric Lit)

When Deena ElGenaidi consults with people about how to apply to PhD programs, her advisor tells her to play up her minority status in her personal statement. ElGenaidi explores the problematic and pervasive nature of this advice, while also discussing what it means that minority students and people of color are encouraged to use their trauma in order to be admitted into academic programs.

The experience taught me that society, white America specifically, regularly asks minorities and people of color to tokenize and exploit themselves, talking about their cultural backgrounds in a marketable way in order to gain acceptance into programs and institutions we are otherwise barred from.

6. The Mentor Series: Allie Rowbottom and Maggie Nelson (Allie Rowbottom, ed. Monet Patrice Thomas, March 25, 2019, The Rumpus)

How do writers balance the challenge of seeking publication in a difficult fast-paced market while nurturing their craft? And what role do mentors play in a writer’s development? In the inaugural installment of “The Mentor Series,” a series of interviews between mentors and students curated by Monet Patrice Thomas, Allie Rowbottom and Maggie Nelson ruminate on these questions and more.

Allie Rowbottom: I remember once, after I finished my MFA thesis, you advised I take my time and sit on the project. You said something about not publishing too young, or rushing out of the gate, and I’ve thought about that a lot now that I have published—one of my biggest challenges (or strengths?) as a writer is that I push myself. Now that my first book is out in the world, I feel an urgency to produce more, at the same time I worry that rushing never makes for solid work.

* * *

Jacqueline Alnes is working on a memoir about running and neurological illness. You can find her on Instagram and Twitter @jacquelinealnes.