Critics: Endgame

Illustration by Homestead

Soraya Roberts | Longreads | May 2019 | 9 minutes (2,309 words)

It’s a strange feeling being a cultural critic at this point in history. It’s like standing on the deck of the Titanic, feeling it sink into the sea, hearing the orchestra play as they go down — then reviewing the show. Yes, it feels that stupid. And useless. And beside the point. But what if, I don’t know, embedded in that review, is a dissection of class hierarchy, of the fact that the players are playing because what else are you supposed to do when you come from the bottom deck? And what if the people left behind with them are galvanized by this knowledge? And what if, I don’t know, one of them does something about it, like stowing away their kids on a rich person’s boat? And what if someone is saved who might otherwise not have been? If art can save your soul, can’t writing about it do something similar?

The climate report, that metaphorical iceberg, hit in October. You know, the one that said we will all be royally screwed by 2040 unless we reduce carbon emissions to nothing. And then came news story after news story, like a stream of crime scene photos — submerged villages, starving animals, bleached reefs — again and again, wave after wave. It all coalesced into the moment David Attenborough — the man famous for narrating documentaries on the wonders of nature — started narrating the earth’s destruction. I heard about that scene in Our Planet, the one where the walruses start falling off the cliffs because there is no ice left to support them, and I couldn’t bring myself to watch it. Just like I couldn’t bring myself to read about the whales failing to reproduce and the millions of people being displaced. As a human being I didn’t know what to do, and as a cultural critic I was just as lost. So when Columbia Journalism Review and The Nation launched “Covering Climate Change: A New Playbook for a 1.5-Degree World,” along with a piece on how to get newsrooms to prioritize the environment, I got excited. Here is the answer, I thought. Finally.

But there was no answer for critics. I had to come up with one myself.

* * *

Four years ago, William S. Smith, soon to be the editor of Art in America, attended the Minneapolis-based conference “Superscript: Arts Journalism and Criticism in a Digital Age” and noticed the same strange feeling I mentioned. “The rousing moments when it appeared that artists could be tasked with emergency management and that critics could take on vested interests were, however, offset by a weird — and I would say mistaken — indulgence of powerlessness,” he wrote, recalling one speaker describing “criticism as the ‘appendix’ of the art world; it could easily be removed without damaging the overall system.” According to CJR, arts criticism has been expiring at a faster rate than newspapers themselves (is that even possible?). And when your job is devalued so steadily by the industry, it’s hard not to internalize. In these precarious circumstances, exercising any power, let alone taking it on, starts to feel Herculean.

Last week’s bloody battle — not that one — was only the latest reminder of critics’ growing insignificance. In response to several celebrities questioning their profession, beleaguered critics who might have proven they still matter by addressing larger, more urgent issues, instead made their critics’ point by making it all about themselves. First there was Saturday Night Live writer Michael Che denigrating Uproxx writer Steven Hyden on Instagram for critiquing Che’s Weekend Update partner Colin Jost. Then there was Lizzo tweeting that music reviewers should be “unemployed” after a mixed Pitchfork review. And finally, Ariana Grande calling out “all them blogs” after an E! host criticized Justin Bieber’s performance during her show. Various wounded critics responded in kind, complaining that people with so much more clout were using it to devalue them even more than they already have been. “It’s doubtful, for instance, that Lizzo or Grande would have received such blowback if they hadn’t invoked the specter of joblessness in a rapidly deteriorating industry,” wrote Alison Herman at The Ringer, adding, “They’re channeling a deeply troubling trend in how the public exaggerates media members’ power, just as that power — such as it is — has never been less secure.” 

That was the refrain of the weeklong collective wound-lick: “We’re just doing our jobs.” But it all came to a head when Olivia Munn attacked Go Fug Yourself, the fashion criti-comic blog she misconstrued as objectifying snark. “Red carpet fashion is a big business and an art form like any other, and as such there is room to critique it,” site owners Heather Cocks and Jessica Morgan responded, while a number of other critics seized the moment to redefine their own jobs, invoking the anti-media stance of the current administration to convey the gravity of misinterpreting their real function, which they idealized beyond reproach. At Vanity Fair, chief critic Richard Lawson wrote of his ilk offering “a vital counterbalance in whatever kind of cultural discourse we’re still able to have.” The Ringer’s Herman added that criticism includes “advocacy and the provision of context in addition to straightforward pans,” while Caroline Framke at Variety simply said, “Real critics want to move a conversation forward.” Wow, it almost makes you want to be one.

I understand the impulse to lean into idolatry in order to underscore the importance of criticism. Though it dates back as far as art itself, the modern conception of the critic finds its roots in 18th-century Europe, in underground socially aware critiques of newly arrived public art. U.K. artist James Bridle summed up this modern approach at “Superscript,” when he argued that the job of art is “to disrupt and complicate” society, adding, “I don’t see how criticism can function without making the same level of demands and responding to the same challenges as art itself — in a form of solidarity, but also for its own survival.” Despite this unifying objective, it’s important to be honest about what in actual practice passes for criticism these days (and not only in light of the time wasted by critics defending themselves). A lot of it — a lot — kowtows to fandom. And not just within individual reviews, but in terms of what is covered; “criticism” has largely become a publicity-fueled shill of the most high-profile popular culture. The positivity is so pervasive that the odd evisceration of a Bret Easton Ellis novel, for instance, becomes cause for communal rejoicing. An element of much of this polarized approach is an auteur-style analysis that treats each subject like a hermetically sealed objet d’art that has little interaction with the world.

The rare disruption these days tends to come from — you guessed it — writers of color, from K. Austin Collins turning a Green Book review into a meditation on the erasure of black history to Doreen St. Felix's deconstruction of a National Geographic cover story as the erasure of a black future. This is criticism which does not just wrestle with the work, but also wrestles with the work within the world, parsing the way it reflects, feeds, fights — or none of the above — the various intersections of our circumstances. "For bold and original reviews that strove to put stage dramas within a real-world cultural context, particularly the shifting landscape of gender, sexuality and race," the Pulitzer committee announced in awarding New Yorker theatre critic Hilton Als in 2017. A year later the prize for feature writing went to Rachel Kaadzi Ghansah, the one freelancer among the nominated staffers, for a GQ feature on Dylann Roof. Profiling everyone from Dave Chappelle to Missy Elliott, Ghansah situates popular culture within the present, the past, the personal, the political — everywhere, really. And this is what the best cultural criticism does. It takes the art and everything around it, and it reckons with all of that together.

But the discourse around art has not often included climate change, barring work which specifically addresses it. Following recent movements that have awoken the general populace to various systemic inequities, we have been slowly shifting toward an awareness of how those inequities inform contemporary popular culture. This has manifested in criticism with varying levels of success, from clunky references to Trump to more considered analyses of how historic disparity is reflected in the stories that are currently told. And while there has been an expansion in representation in the arts as a result, the underlying reality of these systemic shifts is that they don't fundamentally affect the bottom line of those in power. There is a social acceptability to these adaptations, one which does not ask the 1 Percent to confront its very existence and instead ends up subsumed under it. A more threatening prospect would be reconsidering climate change, which would also involve reconsidering the economy — and the people who benefit from it the most.

We are increasingly viewing extreme wealth not as success but as inequity — Disney's billion-dollar opening weekend with Avengers: Endgame was undercut not only by critics who questioned lauding a company that is cannibalizing the entertainment industry, but by Bernie Sanders: "What would be truly heroic is if Disney used its profits from Avengers to pay all of its workers a middle class wage, instead of paying its CEO Bob Iger $65.6 million — over 1,400 times as much as the average worker at Disney makes." More pertinent, however, is how environmentally sustainable these increasingly elaborate productions are. I am referring to not only literal productions, involving sets and shoots, but everything that goes into making and distributing any kind of art. (That includes publicity — what do you think the carbon footprint of BTS is?) In 2006, a report conducted by UCLA found that the film and television industries contributed more to air pollution in the region than almost all of the five other sectors studied. "From the environmental impact estimates, greenhouse gas emissions are clearly an area where the motion picture industry can be considered a significant contributor," it stated, concluding, "it is clear that very few people in the industry are actively engaged with greenhouse gas emission reduction, or even with discussions of the issue."

The same way identity politics has taken root in the critic's psyche, informing the writing we do, so too must climate change. Establishing a sort of cultural carbon footprint will perhaps encourage outlets not to waste time hiring fans to write outdated consumer reviews, which drive no traffic in the age of Rotten Tomatoes. Instead of distracting readers with generic takes, they might shift their focus to the specifics of, for instance, an environmental narrative, such as the one in the lame 2004 disaster movie The Day After Tomorrow, which has since proven itself to be (if nothing else) a useful illustration of how climate change can blow cold as well as hot. While Game of Thrones also claimed a climate-driven plot, one wonders whether, like the aforementioned Jake Gyllenhaal blockbuster, the production planted $200,000 worth of trees to offset the several thousand tons of carbon dioxide it emitted. If the planet is on our minds, perhaps we will also feature Greta Thunberg in glossy magazines instead of Bari Weiss or Kellyanne Conway. Last year, The New York Times' chief film critic, A.O. Scott, who devoted an entire book to criticism, wrote, "No reader will agree with a critic all the time, and no critic requires obedience or assent from readers. What we do hope for is trust. We try to earn it through the quality of our writing and the clarity of our thought, and by telling the truth." And the most salient truth of all right now is that there is no art if the world doesn't exist.

* * *

I am aware that I’m on one of the upper decks of this sinking ship. I have a contract with Longreads, which puts me somewhere in the lower middle class (that may sound unimpressive, but writers have a low bar). Perhaps even better than that, I work for a publication for which page views are not the driving force, so I can write to importance rather than trends. I am aware, also, that a number of writers do not have this luxury, but misrepresenting themselves as the vanguards of criticism not only does them a disservice but also discredits the remaining thoughtful discourse around art. A number of critics, however, are positioned better than me. Yet they personalize the existential question into one that is merely about criticism when the real question is wider: It’s about criticism in the world.

I am not saying that climate change must be shoehorned into every article — though even a non sequitur would be better than nothing — but I am saying that just as identity politics is now a consideration when we write, our planet should be too. What I am asking for is simply a widening of perspective, beyond economics, beyond race, beyond all things human, toward a cultural carbon footprint, one which becomes part of the DNA of our critiques and determines what we choose to talk about and what we say when we do. After more than 60 years of doing virtually the same thing, even nonagenarian David Attenborough knew he had to change tacks; it wasn't enough just to show the loss of natural beauty, he had to point out how it affects us directly. As he told the International Monetary Fund last month: "We are in terrible, terrible trouble and the longer we wait to do something about it the worse it is going to get." In Our Planet, Attenborough reminds us over and over that our survival depends on the earth's. For criticism to survive, it must remind us just as readily.

* * *

Soraya Roberts is a culture columnist at Longreads.

The Top 5 Longreads of the Week

California raisins. (Photo by George Rose/Getty Images)

This week, we’re sharing stories from Jonah Engel Bromwich, Ryan Goldberg, Meghan Daum, Alison Osius, and Joel Mowdy.

Sign up to receive this list free every Friday in your inbox. Read more…

The Growing Power of Prosecutors

Rex Wholster / Getty

Hope Reese | Longreads | May 2019 | 16 minutes (4,345 words)

In our current criminal justice system, there is one person who has the power to determine someone's fate: the American prosecutor. While other players are important — police officers, judges, juries — the most essential link in the system is the prosecutor, who is critical in determining charges, setting bail, and negotiating plea bargains — and whose influence often falls under the radar.

Journalist Emily Bazelon's new book, Charged: The New Movement to Transform American Prosecution and End Mass Incarceration, brings to light some of the invisible consequences of our current judicial system — one in which prosecutors have "breathtaking power" that she argues is out of balance.

In Charged, a deeply-reported work of narrative nonfiction, Bazelon tells the parallel stories of Kevin, charged with possession of a weapon in Brooklyn, New York, and Noura, who was charged with killing her mother in Memphis, Tennessee, to illustrate the immense authority that prosecutors currently hold, how deeply consequential their decisions are for defendants, and how different approaches to prosecution yield different outcomes. Between these stories, she weaves in the recent push for prosecutorial reform, which gained momentum in the 2018 local midterm elections, and the movement away from mass incarceration. Read more…

Prince of the Midwest

AP Photo/Phil Sandlin

Michael Perry | Under Purple Skies | Belt Publishing | May 2019 | 10 minutes (1,861 words)

 

You’d never dream it looking at me, all doughy, bald, and crumpling in my 50s, but I owe the sublimated bulk of my aesthetic construct to Prince Rogers Nelson, circa Purple Rain. The film and album were released the summer after my fresh-off-the farm freshman year in college. I sat solo through the movie a minimum of four times, wore the hubs off the soundtrack cassette, draped my bedroom with purple scarves, stocked the dresser top with fat candles, and Scotch-taped fishnet to the drywall above the bed. Intended to create seductive shadows of mystery, it wound up a pointless cobweb.

Read more…

Did One Young Scientist Discover the Paleontology Pot of Gold?

AP Photo/Mark Lennihan

Sixty-six million years ago, an asteroid impact seems to have eliminated nearly 99.9 percent of life on Earth, but scientists still have questions about the conditions leading up to this mass extinction. A dark layer of ash and debris known as the KT boundary marks the dividing line: above it is the Tertiary Period, below it is the Cretaceous Period, or the time of dinosaurs. Until 2013, scientists had found very few dinosaur remains in the nine feet of material immediately below the boundary, and there was no agreement about the dinosaurs' decline leading up to the asteroid impact. Then a paleontology student named Robert DePalma made a monumental discovery in North Dakota.

For The New Yorker, Douglas Preston tells DePalma’s and the asteroid’s story, as he spends time with him at the secretive site of what might be the greatest scientific discovery of the century. As DePalma is not yet a known entity and his discovery is just coming to light, some people in the scientific community find him unreliable and doubt his interpretation of the fossil evidence.

The following day, DePalma noticed a small disturbance preserved in the sediment. About three inches in diameter, it appeared to be a crater formed by an object that had fallen from the sky and plunked down in mud. Similar formations, caused by hailstones hitting a muddy surface, had been found before in the fossil record. As DePalma shaved back the layers to make a cross-section of the crater, he found the thing itself—not a hailstone but a small white sphere—at the bottom of the crater. It was a tektite, about three millimetres in diameter—the fallout from an ancient asteroid impact. As he continued excavating, he found another crater with a tektite at the bottom, and another, and another. Glass turns to clay over millions of years, and these tektites were now clay, but some still had glassy cores. The microtektites he had found earlier might have been carried there by water, but these had been trapped where they fell—on what, DePalma believed, must have been the very day of the disaster.

“When I saw that, I knew this wasn’t just any flood deposit,” DePalma said. “We weren’t just near the KT boundary—this whole site is the KT boundary!” From surveying and mapping the layers, DePalma hypothesized that a massive inland surge of water flooded a river valley and filled the low-lying area where we now stood, perhaps as a result of the KT-impact tsunami, which had roared across the proto-Gulf and up the Western Interior Seaway. As the water slowed and became slack, it deposited everything that had been caught up in its travels—the heaviest material first, up to whatever was floating on the surface. All of it was quickly entombed and preserved in the muck: dying and dead creatures, both marine and freshwater; plants, seeds, tree trunks, roots, cones, pine needles, flowers, and pollen; shells, bones, teeth, and eggs; tektites, shocked minerals, tiny diamonds, iridium-laden dust, ash, charcoal, and amber-smeared wood. As the sediments settled, blobs of glass rained into the mud, the largest first, then finer and finer bits, until grains sifted down like snow.

Read the story

The Enduring Myth of a Lost Live Iggy and the Stooges Album

Iggy and the Stooges performing at the Academy of Music, New York City, December 31, 1973. Photo by Ronnie Hoffman.

Aaron Gilbreath | Longreads | April 2019 | 48 minutes (8,041 words)

 

In 1973, East Coast rock promoter Howard Stein assembled a special New Year's Eve concert at New York City's Academy of Music. It was a four-band bill. Blue Öyster Cult headlined. Iggy and the Stooges played third, though the venue's marquee only listed Iggy Pop, because Columbia Records had only signed Iggy, not the band. A New York glam band named Teenage Lust played second, and a new local band named KISS opened. It was KISS's first show; the band had changed its name from Wicked Lester earlier that year. According to Paul Trynka's Iggy Pop biography, Open Up and Bleed, Columbia Records recorded the Stooges' show "with the idea of releasing it as a live album, but in January they'd decided it wasn't worthy of release and that Iggy's contract would not be renewed." When I first read that sentence a few years ago, my heart skipped the proverbial beat and I scribbled on the page: Unreleased live show??? I was a devoted enough Stooges fan to know that if this was true, this shelved live album would be the only known full multitrack recording ever made of a vintage Stooges concert.

The Stooges existed from late 1967 to early 1974. They released three studio albums during their brief first life, wrote enough songs for a fourth, paved the way for metal and punk rock, influenced musicians from David Bowie to the Sex Pistols, popularized stage diving and crowd-surfing, and were so generally ahead of their time that they disbanded before the world finally came to appreciate their music. Their incendiary live shows were legendary. Iggy taunted listeners. He cut himself, danced, posed, got fondled and punched, and by dissolving the barrier between audience and performer, changed rock 'n' roll.

Read more…

The Top 5 Longreads of the Week

Wedding rings in a rose flower (Photo by Jared Sislin Photography/Getty Images)

This week, we’re sharing stories from Taffy Brodesser-Akner, Anna Merlan, Sara Tatyana Bernstein, Connie Pertuz-Meza, and Emma Beddington.

Read more…

The Women Characters Rarely End Up Free: Remembering Rachel Ingalls

Gaia Banks / New Directions Publishing

Ruby Brunton | Longreads | April 2019 | 10 minutes (2,674 words)

Rachel Ingalls, who passed away earlier this year at the age of 78, was a writer who did not seek out the spotlight, but found it not at all unpleasant when at last it came. Beyond a small circle of loyal friends and regular visits to Virginia to see her family, Ingalls lived a fairly reclusive existence after her move from the U.S. to the U.K. in 1965. “I’m not exactly a hermit,” she said, “but I’m really no good at meeting lots of strangers and I’d resent being set up as the new arrival in the zoo. It’s just that that whole clubby thing sort of gives me the creeps.”

A writer of fantastical yet slight works of fiction, with a back catalog numbering 11 titles in total, Ingalls flew more or less under the literary radar until recent years, when the newfound interest that followed the 2017 re-issue of her best-known book, Mrs. Caliban, also finally allowed her readers to learn about her processes and motivations; the attention slowly brought her into the public eye. Reviews across the board revered the oddly taciturn novella, in which mythic elements and extraordinary happenings are introduced into the lives of otherwise normal people by a prose remarkable for its clarity and quickness. "Ingalls writes fables whose unadorned sentences belie their irreducible strangeness," wrote Lidija Haas in The New Yorker; in the same piece she described Ingalls as "unjustly neglected." (Mrs. Caliban was also lightheartedly celebrated as a venerable addition to popular culture's mysterious year of fish sex stories, a fittingly strange introduction of her work to a broader readership.) Read more…

When Did Pop Culture Become Homework?

Kevin Winter / Getty, Collage by Homestead

Soraya Roberts | Longreads | April 2019 | 6 minutes (1,674 words)

I didn’t do my homework last weekend. Here was the assignment: Beyoncé’s Homecoming — a concert movie with a live album tie-in — the biggest thing in culture that week, which I knew I was supposed to watch, not just as a critic, but as a human being. But I didn’t. Just like I didn’t watch the premiere of Game of Thrones the week before, or immediately listen to Lizzo’s Cuz I Love You. Instead, I watched something I wanted to: RuPaul’s Drag Race. What worse place is there to hide from the demands of pop culture than a show about drag queens, a set of performance artists whose vocabulary is almost entirely populated by celebrity references? In the third episode of the latest season, Vietnamese contestant Plastique Tiara is dragged for her uneven performance in a skit about Mariah Carey, and her response shocks the judges. “I only found out about pop culture about, like, three years ago,” she says. To a comically sober audience, she then drops the biggest bomb of all: “I found out about Beyoncé legit four years ago.” I think Michelle Visage’s jaw might still be on the floor.

“This is where you all could have worked together as a group to educate each other,” RuPaul explains. It is the perfect framing of popular culture right now — as a rolling curriculum for the general populace which determines whether you make the grade as an informed citizen or not. It is reminiscent of an actual educational philosophy from the 1930s, essentialism, which was later adopted by E.D. Hirsch, the man who coined the term “cultural literacy” as “the network of information that all competent readers possess.” Essentialist education emphasizes standardized common knowledge for the entire population, which privileges the larger culture over individual creativity. Essentialist pop culture does the same thing, flattening our imaginations until we are all tied together by little more than the same vocabulary.

***

The year 1987 was when Aretha Franklin became the first woman inducted into the Rock and Roll Hall of Fame, the Simpson family arrived on television (via The Tracey Ullman Show), and Mega Man was released on Nintendo. It was also the year Hirsch published Cultural Literacy: What Every American Needs to Know. None of those three pieces of history were in it (though People published a list for the pop-culturally literate in response). At the back of Hirsch's book, hundreds of words and quotes delineated the things Americans need to know — "Mary Had a Little Lamb (text)," for instance — which would be expanded 15 years later into a sort of CliffsNotes version of an encyclopedia for literacy signaling. "Only by piling up specific, communally shared information can children learn to participate in complex cooperative activities with other members of their community," Hirsch wrote. He believed that allowing kids to bathe in their "ephemeral" and "confined" knowledge about The Simpsons, for instance, would result in some sort of modern Tower of Babel situation in which no one could talk to anyone about anything (other than, I guess, Krusty the Clown). This is where Hirsch becomes a bit of a cultural fascist. "Although nationalism may be regrettable in some of its worldwide political effects, a mastery of national culture is essential to mastery of the standard language in every modern nation," he explained, later adding, "Although everyone is literate in some local, regional, or ethnic culture, the connection between mainstream culture and the national written language justifies calling mainstream culture the basic culture of the nation."

Because I am not very well-read, the first thing I thought of when I found Hirsch's book was that scene in Peter Weir's 1989 coming-of-age drama Dead Poets Society. You know the one I mean, where the prep school teacher played by Robin Williams instructs his class to tear the entire introduction to Understanding Poetry (by the fictional author J. Evans Pritchard) out of their textbooks. "Excrement," he calls it. "We're not laying pipe, we're talking about poetry." As an alternative, he expects this class of teenagers to think for themselves. "Medicine, law, business, engineering, these are all noble pursuits, and necessary to sustain life," he tells them. "But poetry, beauty, romance, love, these are what we stay alive for." Neither Pritchard nor Hirsch appears to have subscribed to this sort of sentiment. And their approach to high culture has of late seeped into low culture. What was once a privileging of certain aspects of high taste has expanded into a privileging of certain "low" taste. Pop culture, traditionally maligned, now overcompensates, essentializing certain pieces of popular art as additional indicators of the new cultural literacy.

I’m not saying there are a bunch of professors at lecterns telling us to watch Game of Thrones, but there are a bunch of networks and streaming services that are doing that, and viewers and critics following suit, constantly telling us what we “have to” watch or “must” listen to or “should” read. Some people who are more optimistic than me have framed this prescriptive approach as a last-ditch effort to preserve shared cultural experiences. “Divided by class, politics and identity, we can at least come together to watch Game of Thrones — which averaged 32.8 million legal viewers in season seven,” wrote Judy Berman in Time. “If fantasy buffs, academics, TV critics, proponents of Strong Female Characters, the Gay of Thrones crew, Black Twitter, Barack Obama, J. Lo, Tom Brady and Beyoncé are all losing their minds over the same thing at the same time, the demise of that collective obsession is worth lamenting — or so the argument goes.” That may sound a little extreme, but then presidential-hopeful Elizabeth Warren blogs about Game of Thrones and you wonder.

Essentializing any form of art limits it, setting parameters on not only what we are supposed to receive, but how. As Wesley Morris wrote of our increasingly moralistic approach to culture, this “robs us of what is messy and tense and chaotic and extrajudicial about art.” Now, instead of approaching everything with a sense of curiosity, we approach with a set of guidelines. It’s like when you walk around a gallery with one of those audio tours held up to your ear, which is supposed to make you appreciate the art more fully, but instead tends to supplant any sort of discovery with one-size-fits-all analysis. With pop culture, the goal isn’t even that lofty. You get a bunch of white guys on Reddit dismantling the structure of a Star Wars trailer, for instance, reducing the conversation around it to mere mechanics. Or you get an exhaustive number of takes on Arya Stark’s alpha female sex scene in Game of Thrones. One of the most prestige-branded shows in recent memory, the latter in particular often occupies more web space than its storytelling deserves precisely because that is what it’s designed to do. As Berman wrote, “Game of Thrones has flourished largely because it was set up to flourish — because the people who bankroll prestige television decided before the first season even went into production that this story of battles, bastards and butts was worth an episodic budget three times as large as that of the typical cable series.” In this way, HBO — and the critics and viewers who stan HBO — have turned this show into one of the essentials even if it’s not often clear why.

Creating art to dominate this discursive landscape turns that art into a chore — in other words, cultural homework. This is where people start saying things like, "Do I HAVE to watch Captain Marvel?" and "feeling a lot of pressure to read sally rooney!" and "do i have to listen to the yeehaw album?" This kind of coercion has been known to cause an extreme side effect — reactance, a psychological phenomenon in which a person who feels their freedom being constricted adopts a combative stance, turning a piece of art we might otherwise be neutral about into an object of derision. The Guardian's Oliver Burkeman called it "cultural cantankerousness" and used another psychological concept, optimal distinctiveness theory, to further explain it. That term describes how people try to balance feeling included and feeling distinct within a social group. Burkeman, however, framed his own reactance as a form of self-protective FOMO avoidance. "My irritation at the plaudits heaped on any given book, film or play is a way of reasserting control," he wrote. "Instead of worrying about whether I should be reading Ferrante, I'm defiantly resolving that I won't." (This was written in 2016; if it were written now, I'm sure he would've used Rooney).

***

Shortly after Beyoncé dropped Homecoming, her previous album, Lemonade, became available on streaming services. That one I have heard — a year after it came out. I didn't write about it. I barely talked about it. No one wants to read why Beyoncé doesn't mean much to me when there are a number of better critics who are writing about what she does mean to them and so many others (the same way there are smart, interested parties analyzing Lizzo and Game of Thrones and Avengers: Endgame and Rooney). I am not telling those people not to watch or listen to or read or find meaning there; I understand people have different tastes, that certain things are popular because they speak to us in a way other things haven't. At the same time, I expect not to be told what to watch or listen to or read, because from what I see and hear around me, from what I read and who I talk to, I can define for myself what I need. After Lemonade came out, in a post titled "Actually," Gawker's Rich Juzwiak wrote, "It's easier to explicate what something means than to illustrate what it does. If you want to know what it does, watch it or listen to it. It's at your fingertips. … Right is right and wrong is wrong, but art at its purest defies those binaries." In the same way, there is no art you have to experience, just as there is no art you have to not experience. There is only art — increasingly ubiquitous — and there is only you, and what happens between both of you is not for me to assign.

* * *

Soraya Roberts is a culture columnist at Longreads.

 

Just a Spoonful of Siouxsie

Illustration by Mark Wang

Alison Fields | Longreads | April 2019 | 14 minutes (3,609 words)

She showed up on an overcast Friday afternoon in January. She barreled into the driveway in an old mustard-gold Buick with a black vinyl top, its back dash decorated with plastic bats, novelty skulls, and dried flowers. She was wrapped in black sweaters, black tights, black boots. She wore clunky bracelets, loads of them on the outside of her sleeves. Her hair was long and henna red. She carried an Army surplus satchel pinned with old rhinestone brooches and Cure buttons. She was 19 years old. When I opened the front door and she smiled at me, I thought she was the most perfect person I'd ever seen.

“I’m Gwen,” she said. “I’m here to interview for the nanny job.”

That’s when I noticed the nose ring and I blubbered something incoherent, then apologized because I was both overwhelmed and mortified that someone this cool was going to come into my stupid house.

Read more…