But Who Tells Them What To Sing?

Adrian Daub | Longreads | September 2021 | 21 minutes (5,894 words)

When a new trailer for the Marvel film Black Widow dropped in April of this year — after the movie had been repeatedly pushed back due to the pandemic — the producers seemed intent on reminding people why they’d been excited about the movie before the lockdowns started. They did so by closing the promo with a new version of the theme from The Avengers, probably to call viewers back to a different, less socially distanced time. How could you tell this was a new version of the motif? It was choral, but that was a well Marvel had gone to before. This time it had lyrics. As best I can tell, for the first time.

As fans welcomed the callback in online comments, I was brought back to a question that I’d had when Game of Thrones did something similar at the end of its fourth season and again at the very end of the show. It’s something of a trend these days to take a highly recognizable instrumental theme and make it choral. And I get why: The gesture is big and bold and epic. But my question concerned something comparatively pedestrian: Who decides what the lyrics are? What language are they even in? And who writes them? I decided to find out.

Those of us who listen to soundtracks obsessively do so knowing that that’s not how soundtracks are intended to work on us. Whoever mixed in a chorus for a few seconds of the Black Widow trailer was going for an emotional reaction, not some new layer of meaning to be disentangled. “When I do a film score,” the late James Horner said in a TED talk in 2005, “I am nothing more than a fancy pencil” executing the vision of a filmmaker. You’re not meant to listen to a soundtrack in isolation from the image. It is music in service of the moment.

You’re not meant to listen to a soundtrack in isolation from the image. It is music in service of the moment.

But one place where this fancy pencil has more autonomy is when it comes to the text that a chorus sings. Perhaps it’s better to say that the pencil is condemned to freedom. When the composer John Ottman was hired to score the 2008 Tom Cruise film Valkyrie, he realized that he needed a break in the texture of the soundtrack at the very end of the film. That’s because in the final scenes of the movie basically all of the even remotely redeemable characters get executed. After they had all died and the credits rolled, Ottman decided he wanted a “sense of release, because there had to be a different feeling as the audience walks out of the theater.” So he hit upon the idea of a self-contained choral piece. “The problem was though, what on earth would they be saying?”




What on earth indeed? It’s a moment where blockbuster filmmaking — always so anxiously in control of its meanings — seems to be at a bit of a loss. And it’s a moment where we as an audience suddenly get a sense for how films make meaning, and how it isn’t always the meaning they intend to make.

So who decided what the lyrics to the theme from The Avengers were? The short answer is that I still don’t know. But the long answer to my pedestrian question leads into the high-pressure, highly collaborative world of film scoring. A world in which composers often have just a few weeks to write music that pleases the studio and the director, and potentially even test audiences. And in which they toil with assistants, orchestrators, sound editors, and many, many session musicians to find a sound for a film that is still in the process of evolving. I wanted to find out who among this massive group would be the one to say “hey, let’s add a chorus and have it sung in Sanskrit” or something along those lines.

The answer turns out to be: Pretty much any of them can and sometimes do. What film choruses offer us is a perfect synecdoche for the collective, frenzied, and deeply mercenary magic that creates movies in the first place. It’s as likely that a director had the screenwriter invent specific lyrics early in post-production as that a subcontractor, assistant composer, or orchestrator jotted down some words or went on a Wikipedia deep-dive eight weeks out from release in a desperate late-night quest for a non-copyrighted text to use with a cue that might please a bunch of suits half a world away.

What film choruses offer us is a perfect synecdoche for the collective, frenzied, and deeply mercenary magic that creates movies in the first place.

***

Choruses have been part of film scoring for over a century. People have been singing on screen since the earliest silent reels, and with increasing technical wizardry we could even hear them doing it. But something like the Black Widow trailer is what we call a non-diegetic chorus: These are voices that viewers aren’t supposed to somehow locate within the screen action. In early cinema you had to have musicians physically present, first in the cinema with the viewer, eventually in the scene with the actors. Both of which pretty much ruled out the use of a choir. And, as film music historian Mervyn Cooke points out, once technologies existed that allowed films to have at least a partial soundtrack, filmmakers initially avoided non-diegetic music — precisely because they needed to sell the illusion that the sound was coming “from” the scene.

Non-diegetic music started to become the norm only in the early ’30s. And even then the limitations of recording technology meant that non-diegetic voices were not usually worth the trouble. By the late ’30s this had changed. Snow White and the Seven Dwarfs (1937) had its choir chime in even when it wasn’t for the explicit musical numbers. (Snow White was also the first soundtrack issued as an album, so choruses were part of how film soundtracks traveled semi-independently from their films from the very beginning.)

Alfred Newman had begun relying on wordless “heavenly choirs” going ooo and aaa in the background, in films like Wuthering Heights (1939), How Green Was My Valley (1941), and The Song of Bernadette (1943). As the music historian Donald Greig, who is also an active session singer on many modern scores, has pointed out, in the beginning choruses had to be at least somewhat motivated by theme or screen action — they were there to speak for ghosts, to intimate religious dimensions to the screen action.

And then there was Dimitri Tiomkin’s score for Frank Capra’s Lost Horizon (1937). The film concerns the discovery of Shangri-La in the Himalayas, and when we finally get to the fabled land the soundtrack accompanies the matte-painted wonderland with a chorus singing in … well, in a language that isn’t English and doesn’t seem to be Tibetan either. And thus another Hollywood tradition was born: film choruses belting out perfectly nonsensical prose with utter conviction.

And thus another Hollywood tradition was born: film choruses belting out perfectly nonsensical prose with utter conviction.

Both types of choral performance have never left the Hollywood lexicon. In thinking through how film choruses make meaning, I became obsessed with what the process of recording a soundtrack looks like today and at what point in that process someone actually writes lyrics in fake Tibetan. In the Golden Age, studios kept their own choirs — professional singers would show up at the lot and ooo and aaa for a Miklós Rózsa score one day and belt out a ferocious battle hymn for Erich Wolfgang Korngold the next. Studios also had their house orchestrators (usually several), and while laypeople remember the composers of Hollywood’s Golden Age, there are other figures that probably shaped the way films sound just as much, if not more, all the while just quietly collecting their paychecks.

Speaking with modern singers about their experiences, I was struck by how little their day-to-day job description had changed since Tiomkin’s day. But the world in which they are performing is altogether different. As part of my research for this article I made a massive choir belt out the most menacing rendition of “Mary Had a Little Lamb” ever, and all it cost me was $199 plus tax. The EastWest Symphonic Choirs software allows you to make a virtual choir sing in just about any style imaginable. Want your ooos and aaas to sound like a whisper? More Broadway or more classical? All of that’s in the package.

But there’s more: Thanks to a system called WordBuilder, you can have this choir sing pretty much anything — you can type in text in English, in phonetics, or in a proprietary alphabet called Votox, and the software will assemble it out of a massive databank of vowels and consonants. This is a commercially available product, but there are even bigger sample libraries kept by individual composers: If you’re wondering who’s dropping by to supply a quick “agnus dei” for a Hans Zimmer score, well, that’s almost certainly a proprietary sample owned by Zimmer’s film score workshop, Remote Control.

All the professional singers I spoke to were keenly aware of products like EastWest Symphonic Choirs and the sample libraries — because more likely than not they’re in them. If you’re in the business of singing on film, these days you won’t always be asked to sing for an actual score; instead you might get booked to record samples. There’s a scary possibility that these artists are slowly eroding the industry’s need for their labor — that the fruits of their one day of paid work will perform for the studios in perpetuity and with no extra residuals. Their disembodied vowels are putting their vocal cords out of business. But that possibility hasn’t been fully realized: Often enough when they arrive in the recording studio, singers will find that there is already a vocal track, but one done by computer. And yet the composer wants a live version. Almost all the singers I spoke to expressed some surprise that Hollywood still bothered.

Their disembodied vowels are putting their vocal cords out of business.

One possibility why they do: Composers simply like working with live humans and consider it part of their job to do so. As Jonathan Beard, who has been composing and orchestrating in Hollywood for over a decade, put it to me, choirs are an easy, effective way to give dimension to a scene — “because you have a human body as one of the instruments, and there’s a power the human voice [has] over us in general.”

Composers are highly trained musicians, and a lot of their training has involved singing. The composer brothers Harry and Rupert Gregson-Williams (Harry composed for films like Kingdom of Heaven, the Narnia films, and most of Denzel Washington’s films of the last 15 years, while Rupert is best known for DC Universe films like Wonder Woman and Aquaman) were both choirboys at St. John’s College in Cambridge — it makes biographical sense that choral textures and their creation would be important to them. And that they might like to think through music with a live chorus rather than a computer. Another surprising preference that speaks to a kind of sweet traditionalism: While vocal tracks sometimes get doubled in recording (meaning what sounds like 16 singers is just eight overlaid onto each other), this seems to be the exception rather than the rule. Clearly someone in the process enjoys working with large groups of people and thinks they give you an aesthetic payoff that engineering wizardry would not.

But there’s a more cynical reason as well, and it’s the reason why automation hasn’t displaced human labor in other fields: The process of booking some freelancers through a fixer, having them record for a day, and then paying them no residuals isn’t actually much of an expense. That’s how London became a preferred place for Hollywood to record: a large population of well-trained musicians, whose union doesn’t insist on residuals. Several London-based singers I spoke with suggested that the reason Hollywood doesn’t record in, say, Germany as often is that singers in continental Europe have steadier income and are less dependent on session work. And once a producer decides that even London-based musicians are too demanding — well, then there’s always Prague or Budapest. The gorgeous voices you heard in a John Ford Western were the sound of unions and full-time employment; in a Hollywood score today they are monuments to the globalizing power of the gig economy.

***

So that is the world from which these vocals emerge. Imagine you are a classically trained singer in, say, London who has done some previous work on soundtracks. You get a call from a fixer, who is assembling a chorus, or soloists, for a production company. You book the gig, and you show up for the recording session knowing which film you’re singing for, probably knowing the composer you’re recording for, but nothing else. Most recording sessions take place at the famous Abbey Road Studios, which are expensive, so you’re usually booked for no more than a certain number of union-approved hours.

Importantly, by the time you show up for the recording session, the film is pretty much “in post post production,” as one session singer put it to me. The film is basically finished, the wrangling over what the score is supposed to sound like is over. By the time you record, whatever orchestral parts you are supposed to accompany are fully assembled — you usually have them in your headphones as you sing. When you get there, you are handed a large stack of notes to sing and, according to all the singers I spoke with, you get through some portion of them in the next few hours — never through all of them. Some cues you sing will never be in the finished film, some cues you might do 10 versions of. And then the studio time the composer booked is over, you hand over your stack of notes, sign statements agreeing not to divulge anything about what you just sang, and you are on your way.

As the soprano Catherine Bott said: “You enter a studio and you open the score and off you go. You sing what you’re told, and it’s all about versatility, just being able to adapt to the right approach, whatever that may be for that conductor or that composer.” And part of that, singers told me, was singing the words — whatever they may be. As Donald Greig pointed out to me, a lot of these singers have training in classics; they certainly know their way around a Requiem or a Stabat Mater. And yet often enough when they step into Abbey Road they’re being asked to sing perfectly nonsensical phrases in pseudo-Latin — but the studio is booked, the clock is ticking, and as Bott put it, “that’s not the time to put up your hand and, you know, correct the Latin.”

Or the English: Bott sang on the soundtrack for the 1986 animated feature An American Tail. For a cue where the little immigrant mouse Fievel first lays eyes on New York harbor, composer James Horner had the choir intone the famous Emma Lazarus poem inscribed at the base of the Statue of Liberty. As she was singing through the cue — “Give me your tired, your poor” — Bott realized that whoever had put together the score had written down “your huddled masses yearning to be free” rather than “breathe free.” She was pretty sure she knew better, as did some colleagues, but out of English reserve, deference to the Americans, or professionalism, no one felt it was their place to say anything. The misquote stayed in the picture and you can buy it on CD today.

Perhaps part of what made me look for the meaning behind the lyrics on some of my favorite soundtracks was exactly this professionalism. A good singer sells the emotion and the conviction, to the point that a listener sort of has to believe that it all means something. Interestingly enough, early in this long tradition of made-up languages, Hollywood felt the need to pretend that it did mean something. When Lost Horizon was released in 1937, Columbia Pictures claimed in its publicity material that Dimitri Tiomkin’s score “includes authentic folk songs of Tibet.” The same press sheet noted that the Hall Johnson Choir, a popular gospel choir, “will sing the folk song arrangements in the native Tibetan language.”

Film music historians agree that this is hogwash. There is no evidence Tiomkin researched Tibetan folk songs for his score — what the ad men were selling as “authentic folk songs” were almost certainly newly written pieces in a made-up language. Tiomkin had started out as a concert pianist and relied on a small army of orchestrators to turn his melodies into actual playable scores. Someone in that group put a pen to paper and wrote these pieces, and either that same person or someone else seems to have made up some fake Tibetan text to distribute to the singers.

But for whatever reason Columbia Pictures’ publicity department didn’t want to frame the vocals in this manner. Perhaps extradiegetic voices were still sufficiently new that they wanted to tell an audience what these voices were doing on the soundtrack. Or it had nothing to do with the soundtrack itself, and was just another way of selling the broader spectacle of filmmaking: Look at the lengths we went to.

At the same time, lyrics have a pesky way of clarifying the intended audience. After all, it is not altogether difficult to imagine why Tiomkin and company wouldn’t have bothered with actual folk songs and actual language. Lost Horizon is one of those movies that stars noted non-Asian persons H.B. Warner as “Chang” and Sam Jaffe as “the High Lama of Shangri-La.” The broad and bogus claims to authenticity are also making a point of who the movie is for. The fact that the Hall Johnson Choir was an African American group best known for singing spirituals amplifies the sense that Lost Horizon turns non-white people’s authenticity into charming window-dressing for white audiences. Like Shangri-La for its white visitors, even when its lyrics were incomprehensible film music was still “for” white English speakers.

At other times when Hollywood filmmaking relied on choruses, the point was the opposite of exoticism: hyper-comprehensibility. Decades later Tiomkin wrote a rousing score for John Wayne’s jingoistic epic The Alamo (1960). At the end of the movie, with the siege over and one lone survivor and her little daughter leaving the ruined fort, a chorus drifts faintly onto the soundtrack, almost as though the singers were standing somewhere far away in the field of battle. Over the movie’s final shots, the choir takes over the soundtrack, singing a version of what would eventually spend some weeks on the pop charts as “The Ballad of the Alamo.” The first lines a viewer is able to clearly hear are: “Let the old men tell the story / let the legend grow and grow. / Of the thirteen days of glory / at the siege of Alamo.”

This music explicitly tells us why it needs to turn to human voices singing in a language the viewer is supposed to understand. The “Ballad” tells us what to do with the story we have just heard: Pass it on, let the legend “grow and grow.” Also — since this was made by John Wayne in the ’60s — the message is probably also don’t be a communist. But note how the movie has to treat three things as essentially the same: the singing has to be audible for the casual moviegoer, over people getting out of their seats early or finishing off their popcorn; the words have to be comprehensible on a purely linguistic level to an audience that has been taught to tune out the music on some level for the last two hours; and the reason why these words were included in the movie has to be clear.

Also — since this was made by John Wayne in the ’60s — the message is probably also don’t be a communist.

The fact that these three factors are separate can be easy to forget for an English-speaking audience reared on American pop culture. I grew up on Hollywood films in dubbed versions — though those didn’t typically dub the music. Meaning, as a kid who didn’t speak English, I became pretty used to following a plot in German, then the music would swell and I’d sort of tune out for a few minutes as the soundtrack, and the English language, washed over me. I’d get the basic idea of course — the characters were happy, or sad, or patriotic — but I had no idea what they were saying, and I was okay with that.

That’s sort of how most of us feel when we listen to the theme to the 21st-century version of Battlestar Galactica — unless we happen to be familiar with the mantras of the Rig Veda. Still, it’s a culturally specific experience. These days we can’t watch fantasy or science fiction without being sung at in Sanskrit, Old Norse, Dwarvish, Elvish, Uruk-hai, Klingon, and so on. When composer John Williams returned to the Star Wars universe for 1999’s The Phantom Menace, he composed an amped-up piece for the final duel — and over its churning ostinatos he overlaid a chorus belting out a … Sanskrit translation of a Welsh poem. And apparently the syllables of the Sanskrit text were rearranged to the point of incomprehensibility. Clearly, these shows and movies are not addressing us as potential speakers of Klingon or Sanskrit or even Welsh — they’re interested in the feel and sound of a language rather than its meaning. At one recording session, Donald Greig told me, “they spent ages telling us how to pronounce the Russian and then we realized, ‘well this doesn’t actually mean anything.’” This turns out to be both a pretty new and pretty old way of listening to music.

When composer John Williams returned to the Star Wars universe for 1999’s The Phantom Menace, he composed an amped-up piece for the final duel — and over its churning ostinatos he overlaid a chorus belting out a … Sanskrit translation of a Welsh poem.

***

Hollywood scores come in waves. The film industry isn’t known for being particularly fond of risk-taking, and film scores in particular often build on previous scores. The director will often cut the film to a temp track consisting of existing pieces, and it’s easy to imagine that the filmmakers would eventually want something that sounds like their temp track to accompany the finished film. Choirs have never really left Hollywood, but there are certainly moments when producers and directors seem to have almost reflexively sought them out and others when they have avoided them. The Omen (1976) with its massive Latinate choral opener, “Ave Satani,” kicked off one such wave. Peter Jackson’s The Lord of the Rings trilogy kicked off another.

This new chapter in the way films sounded started in the Town Hall, a storied concert venue in Wellington, New Zealand. That’s where composer Howard Shore recorded the earliest parts of his soundtrack for The Fellowship of the Ring (the rest would be recorded in London). The recording involved a full orchestra on ground level and rotating choirs in the balcony. It wasn’t lost on the composer that the scene was weirdly traditional: “The orchestra,” Shore explained, “was set up very much the way a pit orchestra was set up in an opera.” The collaborative process around the composition, too, felt like something Mozart and his librettist Lorenzo da Ponte might have recognized. The screenwriters wrote the texts the choir would be expected to sing, an on-site translator would render them into Tolkien’s languages, and Shore would then set the Dwarvish or Elvish text.

Somewhat counterintuitively it’s not actually choral music with incomprehensible lyrics that is novel and needs explaining, it is choral music with comprehensible ones. For a long time, and for far longer than instrumental music, choral music in the West belonged to the church, to the mass, and that meant to Latin. A language as native to Christian religious life as it was foreign to most Christians. The Lutheran Reformation did a lot to hand church services over to language the congregants could actually understand, but throughout Europe the experience of being talked, and in particular sung, at in Latin persisted. That’s of course not to say that people didn’t sing in their vernacular languages — just that the experience of singing words you don’t, or don’t fully, understand would have been very normal to these people.

For a long time, and for far longer than instrumental music, choral music in the West belonged to the church, to the mass, and that meant to Latin. A language as native to Christian religious life as it was foreign to most Christians.

For the German philosopher Arthur Schopenhauer choral music was meaningful only insofar as the words were not the point. In his The World as Will and Representation, which first appeared in 1819, was republished in 1844, and strongly influenced composers like Richard Wagner, Schopenhauer claimed that music was the purest expression of reality because it didn’t linger with “representations” — words and the things they represent — but tapped automatically into something deeper. Choral music would seem to fall short of that standard — being pretty centrally concerned with words and the things they denote — but Schopenhauer didn’t think so. After all, you shouldn’t listen to sung music primarily for the words, and often you may not even know the words. And Schopenhauer thought this was for the better.

Latin still works that way for most modern audiences: You might argue that there isn’t much of an expectation on the part of an American film composer circa 1989 (or on the part of the filmmakers who hired him) that the audience should be able to follow along with the Latin lyrics — in fact, it might well be distracting if they did. What text is included, both singers and composers confirmed to me, has far more to do with the flow of phonemes and how it interacts with the raw sound of the vocals. The words are simply yet another instrument in the repertoire the composer has at their disposal. But it’s an instrument that comes freighted with all the complications that inevitably arise when our loquacious species uses language.

The words are simply yet another instrument in the repertoire the composer has at their disposal. But it’s an instrument that comes freighted with all the complications that inevitably arise when our loquacious species uses language.

After all, unlike a humming chorus, a Latin chorus does create extra levels of meaning for those who want to listen more carefully. Composer Jerry Goldsmith wrote “Ave Satani” for The Omen as a deliberate transposition of various Catholic masses. While the individual Latin words may have been hard to pick up on (and weren’t entirely correct to boot), listeners who were Catholic likely would have recognized what was being inverted here, given that they’d spent most Sundays around the actual Latin texts. It’s not clear how seriously Goldsmith (or the choirmaster who jotted down the Latin lyrics for the composer) grappled with that dimension of the score — for one thing, the very title of the piece messes up the declension of Satan. But that dimension was there nonetheless — The Omen was part of a kind of religious revival in Hollywood, and though it plays as camp today it was taken far more seriously then.

James Horner’s score for the 1989 film Glory relies heavily on a Latin chorus, and in the film’s climactic moment that chorus sings recognizably in Latin. Glory tells the story of the 54th Massachusetts Infantry regiment, an all-Black unit during the American Civil War, and the film ends with most of the unit being mowed down by Confederate soldiers while assaulting Fort Wagner in South Carolina. The piece in question relies on a text drawn from a Latin mass, frequently incorporated into the classical canon in various requiems from Mozart to Verdi. But, as so often, Horner (or his orchestrator) doesn’t stick to the actual text, but rather seems to create a mashup of snippets from the traditional requiem mass.

So is Horner just using the text of the requiem mass the way layout professionals use the phrase “Lorem ipsum”? Hard to imagine. After all, it makes a lot of sense to have a requiem text being sung as your characters are dying one by one. But more importantly, precisely because the text is so garbled, certain words stick out all the more: “Recordare,” Latin for “recall,” “stricte” (severely), and “judex” (judge). These pieces are largely taken from the Dies Irae, the part of the requiem mass that tells of the end of the world and God’s judgment, albeit with admixtures from just about every other part. The text, though hard to parse, is remarkably consonant-heavy for a Hollywood soundtrack, and a lot of that seems to be due (and I hope I’m hearing this right, as I wasn’t able to track down an actual text for the piece) to the text’s overreliance on the future active participle, which ends in “-urus”: just in terms of pure grammar, the threatening hissing in the text is literally about what is to come.

So is Horner just using the text of the requiem mass the way layout professionals use the phrase “Lorem ipsum”? Hard to imagine.

So maybe the text, and the fact that it’s in Latin, isn’t about pretentiousness on the part of the filmmakers at all. It’s a mass for the dead and a tale of divine wrath, and it seems to make — over the heads of most of the film’s audience, admittedly — a point about retribution. It is remarkable how sophistic (white) Americans, who are frequently so proud to deal in moral absolutes, get when it comes to their Civil War. Horner’s grammatically challenged remix of the “Dies Irae,” I think, makes a point that is stark and simple and remarkably rare in American depictions of the country’s bloodiest conflict: The Confederacy is evil, those who kill on its behalf are committing a sin, and they are bringing God’s wrath (and future judgment) upon themselves. There is, then, in this particular instance something to be gleaned from a text that otherwise we’re not meant to pick up on.

Which gets at an interesting disconnect — namely, that different constituencies will experience the same song differently. The choir members know what they’re saying, even if they have no clue as to what any of it means. And the composer, director, sound designer, etc., although they live with a soundtrack far longer than either the performers or even the most devoted audience, don’t tend to get to the words that go with the music until fairly late in the game. They often have to rely on orchestrators and assistants, or a helpful choirmaster who claims he really knows Latin. Their budget, and thus their time, is not tailored to their needs, but to the dictates of the director and the studio. The prose simply appears, like a ghost in this immense machine. And — in spite of the fact that most parties involved seem to be content to have it not mean very much — it winds up signifying something.

One example: An “exotic” text can only be understood by very specific listeners. But, very much to the point, they are not therefore the intended listeners. Lost Horizon wasn’t banking on a particular reception in the Tibetan community — rather the opposite: Dimitri Tiomkin and his collaborators seem to have counted on not having any actual speakers of Tibetan in the audience.

This gets a lot more troubling in the case of the phrase “Nants ingonyama bagithi baba,” likely one of the most repeated, parodied, and bowdlerized lines of text in any soundtrack. It’s clear that it isn’t addressing the average viewer with the intention of being understood. The very fact that it is in Zulu, but the story of The Lion King appears to take place in the Serengeti, thousands of miles to the north, suggests that the language is here to signal one thing and one thing only: African-ness.

For contrast, look at the way composer Michael Abels’ score for Jordan Peele’s Get Out features Swahili voices: Outside of the considerable number of Swahili speakers in the world, most people watching Get Out won’t know what the singers are saying. But what they’re saying does matter, in a way: Literally “listen to your ancestors,” but as a saying it means something like “you’re about to be in danger.” The viewer who doesn’t understand this line is missing an important warning about what is to come in the film. As is, of course, the film’s African American protagonist, who cannot listen to (or at least understand) his ancestors. Peele and Abels manage to wring from this small decision a whole range of subtle points.

***

But as with all exoticism, there’s a strange tug of war between condescension and appreciation in these kinds of borrowings. When Ottman decided to use a choral piece at the end of the 2008 film Valkyrie, he clearly needed a German text, and I suspect any German text would have sufficed. But he didn’t pick any German text. The film stars Tom Cruise as Claus Graf Schenk von Stauffenberg, a historic figure who led the only attempt by members of the Nazi state to get rid of Adolf Hitler. The text is “Wandrers Nachtlied,” one of Johann Wolfgang von Goethe’s most memorable, well-known texts, and if it’s a little bit treacly by the great poet’s standards, it’s hard to deny it’s a deeply appropriate choice for this moment. Not overtly about politics, it is nevertheless about history, about reflection, about resignation. And about a different use of the German language than one is used to in Hollywood films.

For any German person it’s weird to hear bad guys so consistently speak (and butcher) your language. I’m not complaining, mind you; it makes perfect sense. But what’s remarkable about Valkyrie is that it seems unusually careful for a Hollywood film in how it deals with the German language. Earlier in the film, Cruise’s character says that “people need to know we were not all like him,” and this final poem seems to do something similar for the German language — the filmmakers close their movie by pointing out that this language is capable of beauty and deep humanity. The poet Paul Celan — himself a Holocaust survivor — pointed to the strangeness of writing in a language that was both “my mother’s tongue” (Muttersprache) and “the murderer’s tongue” (Mördersprache). Ottman seems to want to recover the former after showing plenty of the murderers.

The strange thing is: I am pretty sure Goethe’s “Nachtlied” is the first utterance in actual German in this film about Germany. Cruise sort of tries a German accent every other scene; the largely British supporting cast doesn’t even bother. And no one speaks any German, the way Sean Connery does with Russian at certain moments in The Hunt for Red October, or Alan Rickman in Die Hard. The cast is also stacked with Germans who belt out accented English throughout. It almost feels like the film bends over backwards a little too much to remind us what beauty and thoughtfulness this language is capable of — even though it never shows us the barbarity, which the film renders in English.

I suppose it’s moments like that one that made me obsess over what choirs sing in movies, and who decides what they sing. It’s a moment when blockbuster film and TV — increasingly created for the greatest possible global audience, focus-grouped and test-audienced within an inch of its life — manages to speak far more directly, more improvisationally, to a much smaller audience. All of us are sometimes in that smaller audience, sometimes not. But we’re aware it’s there. When cinema is literally speaking in tongues, how could we not be? And to be the person who hears a call the object of fascination never knew it was putting out there — what better definition could there be of what a fan really is?

* * *

Adrian Daub is professor of Comparative Literature and German Studies at Stanford University. He is the author of four books on German thought and culture in the nineteenth century, as well as (with Charles Kronengold) “The James Bond Songs: Pop Anthems of Late Capitalism” (related story here). He tweets @adriandaub.

* * *

Editor: Krista Stevens
Fact checker: Julie Schwietert Collazo

My Seat at the Table

Getty Images

Bernice L. McFadden | Longreads | August 2021 | 15 minutes (4,049 words)

I discovered through DNA testing that my first maternal ancestor in the United States came from the country in Africa now known as Cameroon. This Cameroonian ancestor was a member of the Bamileke tribe — an ethnic group which originated in Egypt.

The table and the chair were invented in Egypt around 2500 B.C. Egypt is a country located in Northeast Africa and not in the Middle East as people have been misled to believe. Do you find it ironic that gaining a seat at the table has become a metaphor for the advancement into spaces that are historically and predominantly white and male and generally resistant to Black and female representation?

Recently, Black people and women have been crashing those homogenized parties, bringing with them their own chairs or filling vacant ones at those proverbial tables.

Some of the gatekeepers feign acceptance of the racial modifications of these platforms, while others have no qualms conveying their disdain or outright outrage at the presence of a Black person at said table. For example, on Jan. 25, 2012, Jan Brewer, the former governor of Arizona, stood on the airport tarmac and chastised, like a child, one Barack Hussein Obama — a Black man who was, at the time, the sitting president of the United States of America. Moments later, when Brewer was asked about the incident she said, “He was a little disturbed about my book.”

Other gatekeepers are covert with their contempt, preferring to close their arms around unwelcomed Black people in an insincere embrace as they sink a blade into their backs.

I have a longtime friend. She and I are BFFs and are as close as sisters. She is white and Filipino, and we have been friends since 1979, when we first met at our mostly white boarding school in the rural Pennsylvania town of Danville.

We are both the eldest of four children, both raised in two-parent households.

For most of our relationship, race was not a topic of discussion. However, that changed in the early 2000s when she came to New York to spend a weeklong holiday with me. She’d spent the day in Manhattan, catching up with friends and taking in theater. Over dinner that evening, she shared that she’d had an extra ticket for the play she’d seen but hadn’t considered inviting me because she assumed I wouldn’t be interested in a staged production that did not have Black characters.

That statement stalled me. I asked if she thought that, because I was Black, my interest lay only in Black-centered entertainment.

She said yes.

I was stunned by her misconception of me and Black people on the whole. I asked if she, a biracial woman living in America, was only interested in European and/or Filipino art. She confessed that her interests were indeed diverse but couldn’t explain why she presumed it did not hold true for me or others who looked like me.

I explained that contrary to what she’d been told, Black people are not a monolith. I told her that we are diverse in every conceivable way.

This was the conversation that set us off on a journey about the myth of race, systemic racism, and what it’s really like to be Black in America.

At our school I was just one of a handful of Black students. On Saturdays, we girls, Black, white, and other, would walk from school into town, to lunch at the Arthur Treacher’s or the Hoagie Shop. Oftentimes, we would go to the local Woolworth’s to buy books, candy, and millinery supplies for sewing class. Even though I knew my white classmates were secretly slipping nail polish and lip gloss into their pockets and backpacks, it was me and the other Black girls that the store employees followed and hawk-eyed.


Sometimes I spent weekends in the homes of my white classmates, those day students who lived in and around the town. It was always a treat to get away from campus, to sleep in a cozy bed and eat a home-cooked meal.

At the time, my family and I lived in a crowded two-bedroom apartment. The kitchen was tiny, leaving little space for a dining table large enough to accommodate a family of six. So, we children ate our meals in the kitchen while my parents ate in the living room, on the couch, plates in their laps.

My father believed that children should be seen and not heard, especially at the dining table, so talking was not permitted during meals. In contrast, the parents of my white friends encouraged and participated in mealtime discussions.

It was at one of those family dinners that my BFF’s father, a tall, slim, kind man with glasses, responded aloud to a question that I had not heard posed:

“Of course, the white race is the superior race.”

To this day, I do not know who asked the question or if in fact a question was actually asked. Perhaps, this man, who had always been nothing but kind and welcoming to me, found it necessary to remind me that even though I was in his Victorian home, sitting at his dinner table, eating the food that had been lovingly prepared by his Filipino wife — I was inferior to him.

I cannot recall if my friend and her siblings fell silent, or if my friend, her siblings, or her mother looked at me for a reaction or in consolation. I remember that I kept my eyes lowered to my plate, that the grip on my fork tightened, and the leisurely pace of my heart launched into a sprint. I was 15 years old and the situation my family had warned and prepped me for as a Black person living in white America had arrived yet again.

Before that incident, another incident took place in Brooklyn in the waning days of autumn when I was on my way home from middle school. On that day, I exited the subway on the south side of Prospect Park, in a neighborhood where very few Black people lived at the time. There, I was followed by two white teenage boys who pelted me with rocks, shouting, “Nigger, go back to Africa!”

A year or two before, my younger brother and I were walking down Rockaway Boulevard in South Ozone Park, Queens, a neighborhood that in the ‘70s was still majority Italian. As we made our way to our grandparents’ home, a group of white teenage boys and girls stalked us for blocks, hurling soda cans, bottles, and racial slurs.

The fact that my BFF’s father chose that moment to express his deepest held beliefs about his racial superiority is not lost on me. Indeed, my presence at his table was conditional — permitted only because I made his daughter happy and he enjoyed seeing his daughter happy because his love for her was unconditional.

Do I believe his declaration was meant to wound and degrade me?

Yes, I do.

Not only was I hurt, but being an empath, I also absorbed the humiliation on behalf of his Filipino wife who had not batted an eye at the insult.

Do I think that my friend’s mother believed that she, a Filipino person of color, was less than her husband because he was white, and she was not?

Yes, I do.

Mohandas Karamchand Gandhi, the Indian anti-colonial nationalist and spiritual leader, believed that Europeans were the most civilized of the races, that Indians were almost as civilized as Europeans, and that Africans were wholly uncivilized.

Perhaps my friend’s mother held similar beliefs.

Nevertheless, I would return to that house and eat at that table again and again, without further incident. But I would never forget the shot fired because the wound it left would not allow me to forget. The memory is lodged in me like the bullet it was intended to be.

***

Some years after that dinner, my friend and her family traveled to the Philippines to visit her maternal family. Not too long after her return to the United States, she and I met for dinner at a Manhattan restaurant. I sat across the table from her and listened, enthralled, as she recounted her trip in vivid detail. Near the end of her monologue she mentioned that when she ventured out for a walk, or for an excursion to one of the many marketplaces, without her Filipino mother or another Filipino family member, she was baffled about why strangers addressed her in Tagalog, perhaps the most widely spoken language in the Philippines.

I frowned, asking, “Why was that so confusing?”

“Well,” she said, “because I don’t think I look Filipino.”

“What do you think you look like?”

“American.”

I am keenly aware that people who look like me — people born Black, without “the complexion for the protection” as comedian Paul Mooney described it — understand that when people say American, that means white. Those of us born in America who are not white are hyphenated to stress that we are not real Americans, but hybrids — like broccoflowers and limequats.

My BFF is tall, beige-complexioned with almond-shaped eyes, and long straight black hair. To me she looks Asian, but I admit, she could also pass for Native American. The one thing she cannot pass for is white, which is how she saw herself.

I smiled, reached for the wine glass, and asked, “Well, friend, if you look American, then what do I look like?”

I watched the epiphany rise in her eyes like the morning sun.

***

In his 1997 essay, “Deconstructing the Ideology of White Aesthetics,” John M. Kang wrote:

Like male chauvinism, the ideology of White aesthetics assumes that the politically dominant group, White people, are inherently superior to a weaker group, people of color. The ideology of White aesthetics holds that people of color, by virtue of their aesthetic inferiority to White people, deserve to remain subordinated.

Kang’s observation was validated during the 2014 National Book Awards, a major literary event that honors the best and brightest writers.

In 1953, just three years after the award was conceived, Ralph Ellison would win for his novel, Invisible Man. Ellison was the first Black writer to win a National Book Award. Two decades would pass before another Black writer would be so honored. In 1975, Virginia Hamilton received the award for her children’s book, M. C. Higgins, The Great.

In 1983, both Alice Walker and Gloria Naylor received National Book Awards for their novels: The Color Purple and The Women of Brewster Place. So if you’re counting, only four Black authors were awarded National Book Awards over a 30-year period.

The 2014 National Book Awards dinner was held at the ritzy Cipriani Wall Street restaurant located in NYC’s financial district. The nominees, their guests, and ticket holders, all dressed in their finest threads, sat at tables covered in white linen cloth. Before the awards were given, the attendees were treated to a sumptuous meal complete with wine and cocktails.

That year, Jacqueline Woodson, a Black woman, received the award in the Young People’s Literature category for her novel, Brown Girl Dreaming. After Woodson gave her acceptance speech, host Daniel Handler — aka Lemony Snicket, a white man best known for his children’s books, A Series of Unfortunate Events and All the Wrong Questions — returned to the stage and gleefully bellowed:

“I told you! I told Jackie she was going to win. And I said that if she won, I would tell all of you something I learned this summer, which is that Jackie Woodson is allergic to watermelon. Just let that sink in your mind. And I said you have to put that in a book. And she said, you put that in a book.”

Handler continued: “And I said I am only writing a book about a Black girl who is allergic to watermelon if I get a blurb from you, Cornel West, Toni Morrison, and Barack Obama saying, ‘this guy’s OK! This guy’s fine!’”

“Alright,” he chuckled when he realized the crowd was uncomfortable. “Alright, we’ll talk about it later.”

***

The Laugh Factory in Los Angeles is a well-known comedy club that has hosted many legendary comics of all backgrounds, creeds, ethnicities, and genders. The audience sits in chairs that are arranged in the form of a C around the stage.

Back in 2006, Michael Richards, former star of the popular syndicated television show Seinfeld, was performing at the Laugh Factory when he became enraged because Black audience members were heckling him during his standup routine.

The infuriated Richards took the opportunity to remind the Black audience members: “Fifty years ago we’d have you upside down with a fucking fork up your ass.” Richards continued, “You can talk, you can talk, you’re brave now motherfucker!”

He demanded that the Black people be removed from the club, barking, “Throw his ass out. He’s a nigger! He’s a nigger! He’s a nigger! A nigger, look, there’s a nigger!”

***

If the lunch counter is the heir to the table, then the chair is the progeny of the stool. For decades, Black people, those offspring of enslaved Africans, were barred from service at lunch counters in the Jim Crow south.

On Feb. 1, 1960, the Greensboro Four, who were students at North Carolina Agricultural and Technical College — Ezell Blair Jr. (who later took the name Jibreel Khazan), David Richmond, Franklin McCain, and Joseph McNeil — walked into the Woolworth’s department store in Greensboro, North Carolina, sat down at the lunch counter, and ordered coffee and sandwiches.

Soon, their mission to disrupt and dissolve the segregationist edicts that supported Whites Only counters was adopted by Black people and their white allies in other segregated Southern states, and the “Sit In” movement was born.

The “Sit In” crusade was an act of non-violent, civil disobedience that was frequently met with violence.

Activists were spat on, had milk poured over their heads and smoke blown into their faces — in some cases they were punched, slapped, and brutally removed from the lunch counters.

***

A news desk is similar to a luncheonette counter. Journalists sit at these desks to report the news. Guests are often invited to sit at news desks to enlighten viewers on a topic on which they may or may not have expertise. Sometimes, multiple guests are summoned to debate an issue.

On April 7, 2010, AWB (Afrikaner Resistance Movement) secretary-general Andre Visagie, a white South African man, appeared with political analyst Lebohang Pheko, a Black South African woman on e.tv’s current affairs show Africa 360, to discuss race relations in the wake of Eugène Ney Terre’Blanche’s murder.

Terre’Blanche was a white supremacist and Afrikaner nationalist who founded the AWB. According to Wikipedia, Terre’Blanche swore to use violence to preserve minority rule. In 1997, Terre’Blanche was convicted and sentenced to six years in Rooigrond Prison for assaulting a gas station attendant and for the attempted murder of a Black security guard. He served three years before being released. Terre’Blanche was murdered on his farm on April 3, 2010.

During the TV show exchange, Andre Visagie became enraged when Pheko continuously interrupted him. In the video, Visagie rips off his microphone and springs from his chair. The incensed Visagie aims his finger at Pheko, declaring: “You won’t dare interrupt me!”

Chris Maroleng, the Black South African host of the show, planted himself between Pheko and the irate Visagie. For a millisecond, it seems as though the two men might come to blows until finally, Visagie addresses Pheko again, warning, “I am not finished with you.”

Andre Visagie was born and raised under an apartheid system dissolved in 1994. In 2010, he was a silver-haired old man living in a country where Black people were no longer required to be subservient to the white minority.

As I watched the exchange between the white Visagie and the Black and female Pheko, I could sense the radiating fury of Visagie as he tried to grapple with the fact that a Black woman was asserting herself, holding her ground, and speaking her mind as if she was his racial equal.

Only the fact that the world was watching kept Visagie from pummeling Pheko to death.

***

In some academic institutions, students sit on furniture known as a combo school desk, which is a chair with a small table attached.

In October 2015, a 16-year-old Black girl was seated in a combo school desk in her math class at Spring Valley High School in Columbia, South Carolina.

In South Carolina, the school system remained partially segregated until 1970. In February of 1970, the United States Court of Appeals for the Fourth Circuit ordered that a school desegregation directive be issued in Lamar, a town just one hour from Columbia.

Nearly 200 angry white parents, irate that their children would be taught alongside Black children, armed themselves with guns, chains, bricks, and axe handles and descended on buses carrying elementary- and high-school-aged students from Lamar. The mob overturned two school buses and clashed with law enforcement before they were finally subdued with tear gas. During the melee, six Black students were injured.

The young lady in the math class at Spring Valley High School was on her cell phone, which is against the rules, but not a crime. When asked to put her phone away, she took her sweet time doing so. This infuriated her white teacher, who asked her to leave the class. When she refused, the vice principal was called in. He too asked her to leave the class. Still, she refused to leave.

Senior Deputy Ben Fields, a white school resource officer, was called in to handle the situation.

According to the LA Times, Fields “… wrapped his arm around her neck and tried to pull her from her desk, which flipped backward to the floor. He dragged her out of the desk, threw her across the floor, and arrested her for disturbing the classroom.”

***

One of the games I remember playing in grade school was musical chairs. The teacher would arrange a circle of chairs that equaled one less chair than the number of players. For example, if there were 10 students, there would be nine chairs.

The teacher would play a song on the record player and we children would march around the circle of chairs. When the teacher stopped the music, we would all scramble to secure a seat. The student left standing — because he or she failed to capture a chair — was the loser.

Afterward, the teacher removed a chair, turned on the music, and the game continued until there were only two students and one chair left.

As the number of chairs decreased, the anxiety among the players heightened. Oftentimes the game turned violent. Students would push and shove their fellow classmates to keep them from stealing the chair away from them.

The point of musical chairs is to teach children fair play and sportsmanship.

***

In May of 2019, my high school friend married the love of her life in a lovely church ceremony in Pennsylvania. The intimate wedding reception, attended by close friends and family, was held at a rustic, stylish restaurant.

The bride, her groom, and all 60 of her guests sat at a long wooden table. Good wine and delectable food were served.

I was the only Black person in attendance. I was aware of my Blackness but not uncomfortable with it.

Across the table from my friend and her new husband, I sat sandwiched between my BFF’s youngest brother and a woman who was filled with so much joy that her laughter sounded like sleigh bells.

Seated next to the happy couple were the bride’s middle brother and his wife. The teenage children of both brothers filled out the remaining seats at the west end of the table.

From the corner of my eye, I saw the wife of the second brother stealing long, probing glances at me. When I suddenly turned to meet her inquisitive eyes, her face brightened with embarrassment.

We gazed at each other until, flustered, she asked, “So, how do you like living in New Orleans?”

I told her that I liked it just fine, to which she nodded, looked away, and wondered aloud to no one in particular how the family cat was getting on in her absence.

Afterward, I returned my attention to the woman with the jingle-bell laughter.

There were several conversations happening at once around the table. Everyone spoke at an even decibel — just loud enough to be heard by the person they were speaking to, but not so loud that their exchange could be heard by guests seated two or three seats away.

The woman I was conversing with said something funny, and I chuckled into my palm, stifling my usual, open-mouthed guffaw, because I was aware that more often than not, white people find Black joy invasive.

I was conscious of this even before August 2015, when the Black women members of the Sistahs on the Reading Edge Book Club were kicked off of a Napa Valley wine train in California because white passengers found their laughter “offensive.”

I had wiped a tear from my eye with one hand and was reaching for my water glass with the other, when one of the teenagers asked a question, loud enough for the entire table to hear:

What’s the name of that song by NWA?

I brought the water glass to my lips and even though I kept my eyes trained on the woman who’d made me laugh until my eyes were wet, I could no longer hear the words tumbling out of her mouth, for my ears were tuned for the response to the question. Heat crept through me and I realized that my anxiety had escalated from low-risk stage green to warning-risk stage yellow.

The question was repeated — this time a decibel above the initial inquiry.

What’s the name of that song by NWA?

To me the question sounded like the clearing of a throat, a tap on my shoulder, a nudge in my side — which is to say it yearned for my attention.

The question had been posed twice — by two of the grandchildren of the man who wounded me decades earlier. He had been dead for years, leaving his progeny to continue his legacy.

I believe his grandchildren wanted me to turn around so they could see the fire that they’d lit in my eyes. Perhaps too, they wanted to witness, firsthand, the infamous angry Black woman that is lore in white imaginations.

But I did not give them the satisfaction of seeing my anger and my pain and the leaking wound their words had reopened. Instead, I maintained my position — head turned, back to them — enduring the mental and emotional weathering — the erosion those words inflicted on me.

The microaggression veiled as an innocent question about a group whose name is an acronym for Niggaz Wit’ Attitude was asked a third time, this time by the mother who had abruptly ended her short conversation with me to wonder about her cat.

“No,” she giggled, “I don’t remember the name of that song by N … W … A.”

She dragged out the letters for effect.

Nigger was the trigger to which I was expected to react. And even though the foul word itself had not been uttered, its implication was as clear as the crystal wine glasses on the table.

I understood that this word play was my verbal reminder that my seat at that table was untenable. I understood that my presence was tolerated but not welcomed and that if they had to deal with my company because the bride loved me and they loved the bride, well then, their lenience would come with a side of cruelty.

***

The table and the chair were invented in Egypt. Egypt is a country located in Northeast Africa and not in the Middle East as people have been misled to believe. I am a descendant of the Bamileke tribe — an ethnic group which originated in Egypt.

Egypt is in Africa.

Egypt is in Africa.

* * *

Bernice L. McFadden is the author of 15 novels and the recipient of the 2017 American Book Award as well as the NAACP Image Award for Outstanding Literature for her novel, The Book of Harlan. She is a Professor of Practice at Tulane University.

* * *

Editor: Krista Stevens
Fact checker: Julie Schwietert Collazo

Judge a Book Not By its Gender

Illustration by Carolyn Wells

Lisa Whittington-Hill | Longreads | May 2021 | 29 minutes (7,916 words)

I blame Drew Barrymore for two things: the amount of money I have spent on celebrity memoirs and an unfortunate attempt to dye my hair platinum blonde in 1993, inspired by Drew’s locks in a Seventeen magazine Guess Jeans ad.

Little Girl Lost, Barrymore’s 1990 account of growing up as a child star in Hollywood, was my first celebrity autobiography. It ignited my love of celebrity memoirs, especially those by women. My dog-eared copy has survived numerous book purges and cross-country moves. I am not alone in my appreciation for it. The coming-of-age tale was a New York Times bestseller and although the book is now out of print, it has achieved cult-like status. It was even the subject of a 2018 New York Times Magazine Letter of Recommendation.

Barrymore was just 11 months old when she got her start in a television commercial for Puppy Chow. At 7 she starred as Gertie in Steven Spielberg’s blockbuster 1982 film E.T. and that same year became the youngest person ever to host Saturday Night Live. Barrymore’s drug and alcohol use began shortly after E.T. phoned home. The first time she got drunk she was 9. Barrymore started smoking weed at 10 and by 12 had moved on to cocaine. The actress entered rehab at 13; during her second stint in rehab she completed Little Girl Lost, which was published when she was just 16.

Gossip and juicy stories about nightclubbing with Jack Nicholson definitely make for a good read, but what initially drew me to the book was that Barrymore wrote it to counter stories about herself in the National Enquirer. “[I]magining the godawful headlines — ‘Drew Barrymore Cocaine Addict at Twelve Years Old’ or ‘Barrymore Burns Out in Teens’ — and the impression people would get of me was all my worst possible fears come true. I would’ve been the last person on Earth to deny my problems, but I wanted to have the option of confessing them,” Barrymore writes in Little Girl Lost. She wanted to come clean on her own terms. Barrymore’s desire to control her own life story compelled me to read the book and has made me return to it over the years.

Barrymore wanted to redirect her life’s narrative and that’s a popular reason why celebrities embrace the genre, but it is not the only reason. Some stars write their book to revive a stalled career and return to the limelight. For others, memoirs extend their 15 minutes of fame. This is a popular motivation for reality show stars. (Will you accept this rose and this six-figure book deal?) Memoirs also settle old scores. In André Leon Talley’s The Chiffon Trenches: A Memoir, the fashion journalist and former Vogue creative director works through his issues with Vogue editor Anna Wintour. Memoirs can also promote the brand a star has built around their celebrity. Reese Witherspoon’s Whiskey in a Teacup, which markets the star’s Southern Lifestyle to y’all, or any book from one of Queer Eye’s Fab Five are great examples.

For readers, the appeal of celebrity memoirs lies in the juicy gossip and name-dropping, and the chance to peek inside and live, if only for 500 pages, the glamorous lifestyles of the rich and famous. Social media, reality television, celebrity gossip blogs, and the popularity of TMZ-style tabloid journalism have created an insatiable desire to know more about our favorite celebrities. Celebrity memoirs help fulfill this desire. Sometimes, unfortunately, we learn a little too much about our favorite stars. After reading Carrie Fisher’s The Princess Diarist, her third memoir, I am unable to watch Star Wars without thinking about all the coke Fisher said was consumed on set. I imagine the film’s stars hollowing out lightsabers to use like giant straws to blow rails with. (That’s not how the force works!)

While it’s easy to dismiss celebrity memoirs as guilty pleasure reads or unworthy of serious literary consideration, you cannot deny the genre’s popularity. One of the bestselling celebrity memoirs of all time, former first lady Michelle Obama’s 2018 release, Becoming, is still on The New York Times bestseller list and has sold more than 10 million copies. Recent months have seen new books from everyone from singer Mariah Carey to actor Matthew McConaughey to soccer star Megan Rapinoe. Celebrity memoirs are big business and we have Rolling Stones co-founder and guitarist Keith Richards to thank for that. His bestselling memoir Life was published in October 2010 and more celebrity autobiographies were published in the four years that followed than had been in the previous 15.

Life, for which Richards received a $7 million advance, sold over one million copies in its first year. Following the success of Life, memoirs by male musicians from Duff McKagan to Steven Tyler were all bestsellers, and it is not just men penning the hits. Remember when we all got together and decided women were funny after Bossypants came out? Tina Fey’s 2011 bestselling memoir preceded an onslaught of popular memoirs by funny ladies, including Mindy Kaling’s Is Everyone Hanging Out Without Me? (And Other Concerns) and Amy Poehler’s Yes Please.

***

Since first reading Little Girl Lost at 20, I have devoured memoirs by female celebrities from punk singer Alice Bag’s Violence Girl: East L.A. Rage to Hollywood Stage, A Chicana Punk Story to Jersey Shore star Snooki’s Confessions of a Guidette. I’m interested in how women write their stories, what they leave out, what they focus on, and how much of what they reveal is a reaction to the image of them we have from watching their movies or listening to their music or seeing them stumbling out of nightclubs in Us Weekly.

“How do we edit our life into a decent story? That’s the rub with an autobiography or memoir. What to reveal, what to keep hidden, what to embellish, what to downplay, and what to ignore? How much of the inner and how much of the outer?” says punk icon and Blondie lead singer Debbie Harry in her 2019 memoir, Face It, of a process that is scrutinized and critiqued much more if, like Harry, you’re a woman.

And while there is no shortage of male celebrities spilling their guts all over my poorly constructed Ikea bookshelf, the fact that they share shelf space with celebrity memoirs written by women is about all they have in common. When it comes to celebrity memoirs, there’s a distinct gender bias in everything from how the books are marketed to the types of topics female celebrities are expected to write about and the amount of themselves they are expected to expose to sell books.

The gender bias becomes even more problematic, and downright depressing, when you read the reviews and see how critics and the press receive female celebrity memoirs. Rather than celebrate women and their amazing stories, reviewers revert to stereotypes and tired clichés and, in the process, miss the actual story. Women can spend chapters talking about their accomplishments, their awards, and their accolades, and reviewers will still focus only on the sex, the scandal, and the bombshell reveals that female-penned celebrity memoirs are expected to deliver if they want to actually sell books. From memoir titles to book blurbs, when it comes to celebrity memoirs by women, sadly, we haven’t come a long way, baby.

***

Debbie Harry’s Face It was one of the most anticipated celebrity memoirs of the recent past. In the book, Harry chronicles everything from her adoption at only 3 months old, to her days in the hippie band Wind in the Willows and all-girl group the Stillettos, to forming both Blondie the band and Blondie the persona. For Harry, Blondie was very much a character she played, one inspired by the “Hey, Blondie!” catcalls she received from construction workers after bleaching her hair, as well as the 1930s Blondie comic strip character who was a “dumb blonde who turns out to be smarter than the rest of them.” Marilyn Monroe was also an inspiration; Harry describes Monroe as “the proverbial dumb blonde with the little-girl voice and big-girl body,” who despite her appearance has “a lot of smarts behind the act.”

Face It also covers Harry’s acting in films like Videodrome and Hairspray, her time training as a professional wrestler for a role in the Broadway play Teaneck Tanzi: The Venus Flytrap, as well as her activism and philanthropy work. (Fun fact: She was almost Pris in Blade Runner, but her record company made her turn it down.) There is certainly no shortage of great material for reviewers to discuss. Unfortunately, they responded with the same tired sexist tropes that greet memoirs written by women.

“In her memoir, Debbie Harry proves she’s more than just a pretty blonde in tight pants,” read the headline on The Washington Post’s review of Face It. The headline was later changed to, “In her memoir, Debbie Harry gives an unvarnished look at her life in the punk scene” after social media responded less than kindly to the sexist headline choice. The Washington Post admitted they botched the headline and appreciated the feedback, but the headline was not the review’s only problem.

The review opens with: “Even if Debbie Harry, of the band Blondie, isn’t to your taste—her voice too smooth, her sexiness too blatant, her music too smooth—you can’t dismiss certain truths about her.” While this sentence is a great example of disdain, it is not a great review opening. I read Bruce Springsteen’s 2016 memoir Born to Run at the same time as Harry’s and tried to imagine the Post opening a review of Springsteen’s book in the same way. To be fair, I do find his sexiness far, far too blatant.

So how does the Post open Springsteen’s memoir review? “Why, one might ask, would Bruce Springsteen need to write an autobiography? Haven’t we been listening to it for the past half century? Hasn’t he been telling us his story all along?” says Joe Heim in the review’s first paragraph. Springsteen, a talented songwriter, has already shared so much through his music; what more could he be required to give us? It is okay if you want to sit this one out, Bruce. I have heard “Atlantic City” and do not require any further emoting from you at this time.

The Post’s review of Face It goes from bad to worse, moving from criticism that Harry “sometimes comes across as self-interested” to a focus on the more sensationalist aspects of her story, like sex and drugs. (This is an autobiography, right? I didn’t see them complaining about the 79 chapters in Springsteen’s book.) “She had a hookup with an Andy Warhol protégé in a phone booth in Max’s Kansas City and began what she blithely calls ‘chipping and dipping’ in heroin,” reads the review. The Post points out that “Harry is quite explicit in her descriptions of her drug use and sex life,” which it seems to interpret as permission to exploit the more sensationalistic aspects of her life and make them a focal point of the review.

The review also offers a great example of how the media likes to promote and celebrate the idea of women as trailblazers, praising Harry for being candid about the realities of being a female musician (an “unvarnished look”), while also painfully reinforcing those realities by running a sexist, stereotypical headline that focuses only on Harry’s appearance and sex appeal.

Control is a central theme of Harry’s book, whether it be of her image, her band, or her art. Early in the book Harry recounts a record company promoting Blondie’s first album using posters with an image of her in a see-through blouse, despite early reassurances that the posters would only feature headshots and would include all band members. She was not happy with the marketing decision, saying, “Sex sells, that’s what they say, and I’m not stupid, I know that. But on my terms, not some executive’s.” And while doing things on her own terms is a source of pride for Harry, reviewers have a serious problem with it.

For Harry, control empowers; for memoir reviewers, it threatens. “You can’t control other people’s fantasies or the illusion they’re buying or selling,” says Harry early in Face It, talking about people having posters of her on their bedroom walls. While Harry resigns herself to her lack of control, reviewers never want to relinquish theirs. Harry’s insistence on doing things on her own terms is panned by reviewers who call her guarded and closed off.

Reviewers want to read a book by a female celebrity and have her completely figured out by the last page. “[W]hat’s a memoir for, if not to pull back the curtain and check out the lady who is pushing the buttons?” asks Harry in Face It. But when the curtain doesn’t pull back as much as reviewers want, they become resentful, sullen, and offended, reacting with “how dare you?” to any resistance on the part of the woman to give them everything they want, every piece of her. The Atlantic’s review reads almost like it’s giving Harry permission to tell her story on her own terms, saying “holding back is an understandable maneuver for someone who’s been stared at so much.”

One way or another, the reviewers keep the sexist treatment coming when discussing Face It. The Guardian was also annoyed that Harry did not give enough of herself in the book. “It’s a shame that Harry passes up the chance to dig deeper into her experiences of objectification and the nature of fame, but more disappointing is that we learn so little about her interior life, and how she really thinks and feels.” I guess talking about being raped at knifepoint by a stranger is not enough for the reviewer. What’s with the heart of glass, Debbie? Give us more of your pain! And on page five, not 105!

The headline of Rolling Stone’s piece on Face It highlights how Harry’s book “looks back on what she learned from Andy Warhol and David Bowie.” The media loves to position women in relation to the men in their lives as if the only way we can understand work by women is in the context of the men who orbit them. Despite writing 368 pages about herself, according to Rolling Stone, the only interesting thing about Harry is the famous male company she kept.

The New York Times continues the tired pop-culture gender bias with a review that manages to make it all the way to the fourth paragraph before mentioning Harry’s age. It also talks about the number of memoirs by female rockers being released at the same time as Harry’s book. (“[T]here’s a bit of a pileup of female rockers getting reflective this season.”) I smell a trend. Ladies, they be writing! The review mentions that Harry’s “face is unlined” and describes her “crisp red collared blouse with white polka dots and red leggings.” I think Bruce was wearing the exact same thing when they wrote their piece about him and Born to Run. How embarrassing.

Two weeks after Face It came out another musical icon released a memoir. Me by Elton John covers the singer’s childhood in the London suburb of Pinner, his early musical days in Los Angeles, his songwriting partnership with Bernie Taupin, successful solo career, and marriage and family with husband David Furnish. Keen celebrity memoir readers might also be quick to point out that the title of John’s memoir is the same as that of actress Katharine Hepburn’s. Is there anything men will not just unapologetically lay claim to?

While Rolling Stone’s review of Harry’s book name-checked her famous male friends in the headline, John’s, not surprisingly, did not: “Elton John’s Me Is A Uniquely Revealing Pop Star Autobiography. The long-awaited book covers his hard childhood, struggles with addiction and road to recovery.” The review ends with: “Elton has never been one to hold back difficult truths, and Me — while a little skimpy on revelations about his brilliant, ground breaking music — is essential reading for anyone who wants to know the difficult road that he walked while creating it.”

Entertainment Weekly’s description of Me is also glowing: “While Me is as colorful as you’d expect from an artist famous for his outlandish stage costumes and outsize temper tantrums, it is also so much more than simply a dishy sex, drugs, and rock ‘n’ roll tell-all.” The Entertainment Weekly review shows that when it comes to male celebrity memoirs there may be sex and drugs, but no review should reduce the work to just these scandalous and juicy elements.

Can you feel the love tonight? Not yet? Never fear, here comes The Guardian to continue the praise. Their review opens with, “Choosing one’s favourite Elton John story – like choosing one’s favourite Elton song – can feel like limiting oneself to a mere single grape from the horn of plenty.” Reading reviews of the book, you have to wonder if John is still standing because he is unable to sit down from all the ass-kissing. The Daily Mail calls it “the rock memoir of the decade,” while for The Washington Post it is an “unsparing, extravagantly funny new memoir” and “bracingly honest.” It’s hard to find criticism or scrutiny in the reviews of John’s work because there is not much negativity. John’s book is not better than Harry’s; in fact, I think Harry’s is much stronger. She’s more self-aware and can deconstruct the misconceptions and preconceptions that fans, the media, and other musicians have of her.

“You think you’re being difficult, my little sausage? Have I ever told you about the time I drank eight vodka martinis, took all my clothes off in front of a film crew, and then broke my manager’s nose?” he writes of being a father reacting to his son’s temper tantrums. There are plenty of stories about famous friends like Stevie Wonder, Yoko Ono, John Lennon, Andy Warhol, and Neil Young. The anecdotes leave readers feeling like they never get to peek behind the shiny veneer of the celebrity that is Elton John. At times it’s all surface and that’s fine, but reviewers do not criticize him for it in the same way they would if he were a woman.

John’s book reviews do talk of his well-documented addiction to cocaine (“If you fancy living in a despondent world of unending, delusional bullshit, I really can’t recommend cocaine highly enough,” he writes), but they are quick to follow it up with redemption stories, which is a standard formula in memoirs written both by and about men.

“Now that he’s sober, there’s the more conservatively dressed, happily married elder statesman of British pop, a proper establishment figure,” writes The Guardian. Not only do they give him a redemption arc and treat his addiction very much like a phase, but they also give his addiction issues a free pass, writing “while his extraordinary talent justified his personal excesses, it is his self-awareness that has counterbalanced the narcissism and made him such a likable figure.”

***

Redemption comes up often in male celebrity memoir coverage, but examine the media’s reaction to another celebrity memoir and it becomes painfully clear that this narrative is strictly for the boys.

Actress, producer, and director Demi Moore’s memoir Inside Out was released a few weeks before John’s. Moore and her book were soon all over the media and it was not for her redemption story. Like John, Moore struggled with addiction, but unlike John the media never lets her forget it, along with other parts of her story.

“Demi Moore drops shocking revelations about Ashton Kutcher, sexual assault and sobriety,” reads the headline of an L.A. Times piece about the memoir. The story proceeds to break down Moore’s childhood pain, her miscarriage, Ashton Kutcher cheating on her, and her struggles with alcohol and drugs.

Unlike In Touch Weekly, the piece skipped the “Ashton and Bruce Are in Good Places Too” sidebar, but, as with Debbie Harry, we apparently cannot talk about Moore without mentioning the famous men in her life. More than one review talks about how Willis and Kutcher must feel about Demi airing their dirty laundry. Was Bruce mad? What does Ashton really think? Dude, where’s my sound bite?

Entertainment Weekly’s piece ran with the headline, “Celebrities react to Demi Moore’s revealing memoir Inside Out. From Jon Cryer’s affectionate follow-up to Ashton Kutcher’s cryptic non-response.” They forgot to add “male” in front of “celebrities,” though, as all the celebrities quoted in the piece were men. Also, if one more reviewer mentions how great Moore looks for her age, I will make them watch that awful scene in St. Elmo’s Fire where Rob Lowe’s character passionately details the origin story of St. Elmo’s Fire while performing pyrotechnics with a can of aerosol hairspray and a lighter, on repeat, until they beg me for mercy.

Most of Moore’s memoir coverage focused on the tabloid aspects of it. Read the headlines to see if you can spot a trend and how many you can read before you want to just set shit on fire (you can borrow Rob’s aerosol can).

“7 Biggest Bombshells From Demi Moore’s Explosive Memoir” (accessonline.com)

“Demi Moore: 8 Biggest Bombshells From Her Memoir Inside Out” (popculture.com, also, take that accessonline.com)

“Demi Moore’s raw Inside Out reveals rape, why marriage to Ashton Kutcher crumbled” (USA Today)

“Demi Moore Gets Real About Her Painful Childhood, Drugs, Ashton Kutcher and Other Exes in New Book ‘Inside Out‘” (Stay classy, Us Weekly)

“Why Demi Moore Fulfilled Ashton Kutcher’s Threesome Fantasies” (E! Online)

The unfortunate thing about these headlines, which would be vastly different if they were referencing a man’s memoir, is that, as with Harry, they reduce Moore’s story to only its most scandalous and juicy elements. Moore got her acting start in 1981 as Jackie Templeton on General Hospital (Luke and Laura forever!), the number one show on daytime television at the time. She followed that up with roles in films like the Brat Pack bonanzas St. Elmo’s Fire and About Last Night.

Then she got what many, including Moore, consider to be the turning point of her career. “This could be either an absolute disaster, or it could be amazing,” she writes of reading the script for Ghost, which ended up being a big hit in 1990, grossing over $500 million. It was nominated for five Oscars and four Golden Globes, including a best actress nomination for Moore.

Moore followed the success of Ghost with A Few Good Men, Indecent Proposal, and Striptease, a film for which she was offered over $12 million, an amount no other woman in Hollywood had ever received. Moore became the highest paid actress in Hollywood. “But instead of people seeing my big payday as a step in the right direction for women or calling me an inspiration, they came up with something else to call me: Gimmie Moore.” It is worth noting that at the time her husband Bruce Willis had just been paid $20 million for the third Die Hard movie. (Yippee ki yay indeed!)

“She became a movie star in this time where women didn’t naturally fit into the system,” said Gwyneth Paltrow, a friend of Moore’s, in The New York Times piece on Inside Out. “She was really the first person who fought for pay equality and got it, and really suffered a backlash from it. We all certainly benefited from her.”

And while it pains me greatly to side with someone who talks a lot about vagina steaming, Paltrow’s right. Moore is an inspiration and fighting for equal pay in Hollywood should be one of the things the media focuses on when they talk about Inside Out, but, sadly, it is not. It is unfortunate that when Moore is discussed it is in the context of Ashton Kutcher and threesomes, at the expense of the many other empowering and interesting parts of her life.

Remember her iconic Vanity Fair cover? Shot in 1991 by Annie Leibovitz when Moore was seven months pregnant with her second daughter Scout, it’s considered one of the most influential magazine covers of all time. Legendary Esquire art director George Lois describes it as, “A brave image on the cover of a great magazine — a stunning work of art that conveyed a potent message that challenged a repressed society.” Let’s talk about that!

Or her intense training for her role in G.I. Jane, a 1997 film Moore both starred in and produced. “I was emotionally invested in the story, the message and the provocative questions it raised,” she says of the film. The film was panned by critics and Moore talks at length in Inside Out about her disappointment at the reception to a project that meant so much to her.

The parts of the book where Moore talks about Hollywood’s double standard, whether it be the pay gap or reactions to the age difference between her and Kutcher, are some of the best parts of the book. Unfortunately, they are the parts covered least.

The last line of Inside Out is, “we all suffer, and we all triumph, and we all get to choose how we hold both.” It is a great line for a memoir to end on, but in Moore’s case, while she may get to choose how she holds both, the media will only ever focus on the suffer part.

There is the emphasis on opening up, on fighting, on bravery, on revealing — “Demi Moore Lets Her Guard Down,” reads The New York Times headline. This is the way memoirs by women are positioned, and even if it isn’t explicitly spelled out, it has become the expectation, so much so that when female celebrities don’t expose themselves completely they are resented for it. The reception to Harry’s book Face It offers proof.

***

Jessica Simpson released her memoir Open Book in February 2020. It reached number one on The New York Times bestseller list, but like Moore’s, Simpson’s book soon became tabloid fodder. “Jessica’s Shocking Confessions,” reads the headline on Star’s piece on the book, which focuses on Simpson’s struggles with drug and alcohol abuse and her famous exes from Nick Lachey to John Mayer. Like Moore, Simpson is now sober.

Simpson was signed to Columbia Records in 1997 at 17 as the label’s answer to Britney Spears and Christina Aguilera and went on to release six bestselling records. She also starred in the MTV reality show Newlyweds: Nick and Jessica, which featured Simpson and then husband and 98 Degrees singer Nick Lachey, who at the time was the more successful of the two. If you don’t remember Lachey from MTV you might know him from his recent gig hosting Netflix’s Love is Blind where he greets contestants with “Obviously, I’m Nick Lachey,” which seems to overestimate his place in both pop culture’s canon and our general consciousness.

Newlyweds, a ratings success, aired for two years and while it made the couple a household name, it was Simpson who stole the show with her ditzy, dumb blonde antics. Her confusion over whether Chicken of the Sea was chicken or tuna earned her a place in both reality television and pop culture history. The most interesting parts of Open Book are when Simpson talks about her reality television persona and the identity crisis it led to. “How was I supposed to live a real healthy life filtered through the lens of a reality show? If my personal life was my work, and my work required me to play a certain role, who even was I anymore?” she writes.

Open Book is Simpson’s attempt to distance herself from her Newlyweds role and change perceptions of her, a common reason people write memoirs. Some get it — “You Remember Jessica Simpson, Right? Wrong,” reads the headline on The New York Times piece about her memoir — but, unfortunately, most of the reviewers discussing her book don’t. Simpson has moved beyond her Newlyweds character. She’s built a billion-dollar fashion and licensing business and is a mom to three kids, but the media seem uncomfortable embracing Simpson in her new roles, preferring to keep her forever stuck in 2003, in her UGG boots and pink Juicy Couture tracksuit, confused about tuna.

Simpson talks about the effect this identity crisis had on her, her struggles with her weight and body image, her sexual abuse at age 6, and her addiction to alcohol and pills. She started to rely increasingly on alcohol during her relationship with Mayer in 2006, insecure that she wasn’t smart enough to date him. My heart breaks when I think of Simpson wasting time worrying about being the intellectual equal of the man who gave us the musical depth that is “Your Body is a Wonderland” and later referred to sex with Simpson as “sexual napalm.”

It is also troubling that after talking about how Mayer brought out her insecurities, the media thinks it is a good idea to focus on Mayer’s reaction to Open Book. I know you thought you were never good enough for this guy and that he was always judging you, so let’s get him to judge you some more by asking what he thought of your book!

***

Simpson’s attempts to challenge the dumb blonde perception of her are not the only example of a female celebrity going off script or off brand in their memoir and failing to give the media, and readers, what they want or expect. Singer and songwriter Liz Phair’s Horror Stories says “a memoir” on the front cover, but the book is more a collection of essays and stories by Phair than a straightforward linear memoir. Reviewers did not respond well to Phair’s artistic license with the storytelling form.

“It’s hard to tell the truth about ourselves. It opens us up to being judged and rejected,” Phair writes in Horror Stories and that may be one reason she chose to tell her story the way she did. Through stories about blizzards, blackouts (from lack of electricity, not drinking), marital infidelity, giving birth to her son, and getting dressed up to go to Trader Joe’s, Phair reveals a lot about herself and about identity, insecurity, fame, and regret. “In the stories that make up this book, I am trusting you with my deepest self,” she writes in the book’s prologue. Her deepest self just might be a bit harder to find for those fuck and run readers who are too busy complaining about the book’s nontraditional memoir style to actually read it.

Horror Stories does not talk much about her music, including Phair’s critically acclaimed, influential 1993 album Exile in Guyville. A song-by-song reply to the Rolling Stones’ 1972 album Exile on Main St., it was the number one album in year-end lists from Spin and The Village Voice and was rated the fifth best album of the 1990s by Pitchfork. “At the time, it was a landmark of foul-mouthed, compromised intimacy, a tortured confessional, a workout in female braggadocio, and a wellspring of penetrating self-analysis and audacity,” reads The New Yorker’s piece on the 20th anniversary of Exile in Guyville’s release.

“Frankness is Liz Phair’s brand. Her 1993 breakthrough album, the brilliant and profane Exile in Guyville, chronicled her post-college experiences in Chicago’s male-dominated music scene. Phair’s new memoir Horror Stories makes little mention of the album or her artistic life,” reads The Washington Post’s review. Remember how the Post thought that Bruce Springsteen did not need to write Born to Run because he had already revealed so much in his songs? Why doesn’t Phair get the same consideration?

“Though there are anecdotes about flopping on live television and scrapping a record after learning of a collaborator’s abuse, the absence of concrete stories about Exile in Guyville is palpable,” writes Pitchfork. Just give us the hits, Liz! “Her relationship to music seems to have been the longest and maybe the most demanding love of her life, the one for which she has been willing to get lost, to fail, and to try again over and over for decades. Call me a selfish fan, but I have to say that is one story in all its horror and passion I would love to hear,” reads the review in The New York Times.

Reviewers spend so much time focused on what’s missing from Horror Stories that they miss what’s there. Well, maybe not all of what’s there. In chapter 14 of Horror Stories, called “Hashtag,” Phair writes about waking up one morning to headlines about the rock star who was supposed to produce her next album. Multiple women had come forward to accuse him of sexual harassment and emotional abuse. The FBI was also investigating him for exchanging sexually explicit communications with an underage fan.

Phair never specifically names Ryan Adams, but, in February 2019, seven months before Horror Stories was released, The New York Times broke the story about multiple women, including his ex-wife Mandy Moore, coming forward to accuse Adams of manipulative behavior, sexual misconduct, emotional and verbal abuse, and harassment.

In the chapter, Phair talks about her own experiences with sexual assault, sexual harassment, stalkers, and the sexism she experienced in the music industry. She writes about being instructed by a record label president to let radio programmers “feel her up a little” because it would help boost her career, and about being told that she would never work again if she didn’t go along with sexy photo shoots. But her personal stories are not what the press focused on when she was promoting Horror Stories.

Phair was frequently asked about Adams and her experience working with him. “I don’t want every headline about this book that is so important to me to be about Ryan Adams,” she tells Entertainment Weekly. She becomes understandably annoyed with a male reporter from New York Magazine who asks her several questions about Adams, including one about his process as a producer. (I know when I hear about a man accused of sexual misconduct the first thing I wonder about is his artistic process.) “Out of everything in the book, why is the Ryan Adams thing such an interesting topic?” Phair asks him. “You’re not the only one singling out Ryan Adams as a hot talking point, and it’s sad. It does need to be talked about, but so do the larger issues.”

It’s unfortunate that Phair shares intimate details about herself, including her own experiences with sexual harassment and assault, and the media’s takeaway is that they don’t like the format of her book and would rather talk about the famous man in her life. Congrats on your book, Liz. Did Ryan ever send you inappropriate texts?

***

While Phair is criticized for not talking about what is expected of her in her memoir, men who follow the same course do not hear “how dare you?” The reaction to Acid for the Children, the 2019 memoir by Red Hot Chili Peppers bassist Flea (aka Michael Balzary), proves that.

Acid for the Children details Flea’s childhood growing up in Australia, his relationship with his older sister Karyn, his family’s move to the U.S. when he was 4, his first crush, how Kurt Vonnegut Jr. changed his life, and his love of basketball and the Sony Walkman. He talks about meeting Red Hot Chili Peppers lead singer Anthony Kiedis in 1976 at Fairfax High School, about learning to play bass, about his first band Anthym, about shooting coke and taking speed, his time in the California punk band FEAR, and about acting in the 1983 movie Suburbia. There are also lists of the concerts that changed his life, books that blew his mind, and movies that grew him. Lots of great material, right? You know what’s missing? Anything about the Red Hot Chili Peppers, the bestselling, Grammy-winning, Rock-and-Roll-Hall-of-Fame-inducted band he founded, plays bass in, and is most strongly associated with.

Flea’s book ends just as Tony Flow and the Miraculously Majestic Masters of Mayhem, what would later become the Red Hot Chili Peppers, play their first show at the Grandia Room in Los Angeles to 27 people in February 1983. This performance comes up on page 375 of the 385-page book. There’s no mention of the Red Hot Chili Peppers, his movie roles beyond Suburbia (My Own Private Idaho being one of his most famous), his role as a father of two girls, how he founded the Silverlake Conservatory of Music, or his work with other musicians from Thom Yorke’s Atoms for Peace to Alanis Morissette. (Flea played bass on “You Oughta Know,” her hit single from 1995’s Jagged Little Pill.)

The book is about Flea’s journey to the band, rather than with it. Surely reviewers were as outraged by this omission as they were when Phair failed to talk about Exile in Guyville in Horror Stories. It will not surprise you to know they were not bothered at all. Rather than dwell on what was missing from Acid for the Children, the coverage focuses on what’s there and praises it. Reviews focus on Flea’s gift and skill as a writer and fail to mention that if you want to dream of Californication, you will have to do that somewhere else. Reviewers can see, and appreciate, Flea as something other than just the bassist for the Red Hot Chili Peppers.

There is a very distinct set of rules female celebrities writing their memoirs must follow. The more tell-all, the more trauma, and the more tabloid, the better. They are not free to write about what they want. They must bare it all, page after page. Men like Flea have the freedom to operate by a very different set of rules. He can leave his scar tissue out and reviewers have no problem with it. Book coverage focuses on Flea the writer, rather than Flea the bassist. This same courtesy, and basic level of respect, is never extended to women telling their stories. Female celebrities like Debbie and Demi are never just human beings writing about their lives. Reviewers are unable to abandon their preconceived notions, their ideas of who these women are, their celebrity personas, and just see them as people who should be allowed to tell their stories their way.

“[H]e’s actually a lovely writer, with a particular gift for the free-floating and reverberant. He writes in Beat Generation bursts and epiphanies, lifting toward the kind of virtuosic vulnerability and self-exposure associated with the great jazz players,” reads the review in The Atlantic.

In an interview with Entertainment Weekly, Flea said that his goal with Acid for the Children was that “it could be a book that could live beyond being a celebrity book or a rock star book and just stand on its own as a piece of literature.” I can only imagine the outrage if Debbie Harry’s Face It had ended with, “And then I started this band Blondie. See you later!” Or if Demi Moore’s Inside Out had ended with, “Then I got the part in this movie St. Elmo’s Fire. The end.” Or if Courtney Love wrote her memoir (please do this, Courtney) and the last page read, “And then I met this guy Kurt, but I have to go be the girl with the most cake now. Peace out.” The fact that Love and her accomplishments are forever tied to her husband is a whole other gender bias problem altogether.

Of course, Flea is not the first Red Hot Chili Pepper to give it away in a celebrity memoir. In 2004, lead singer Anthony Kiedis wrote Scar Tissue, a New York Times bestseller about his life, the Red Hot Chili Peppers, and his time in and out of rehab, as well as in and out of various women. If you have ever thought, “I bet Anthony Kiedis does well with the ladies but would really like to get a better sense of his success rate,” then this is the book for you. In his memoir Kiedis gets away with writing about debauchery, depravity, and drug abuse in a way that reads like a Behind the Music episode on steroids. (See any book by a current or past member of Mötley Crüe or Guns N’ Roses for a further look at this style.) A woman would never get away with writing about drugs like Kiedis does.

When women write about their addiction there’s an apologetic, self-aware tone male memoirs don’t have: “I know I am a drug addict, and I keep messing up, but I’m really sorry, and please stick with me cause I am gonna sort this out.” (See How To Murder Your Life by fashion and beauty journalist Cat Marnell and More, Now, Again: A Memoir of Addiction from Prozac Nation author Elizabeth Wurtzel, who passed away in 2020, for great examples of this.) Also, I would like to point out the blurbs on the backs of Scar Tissue by Kiedis and How To Murder Your Life by Marnell in case you still doubt there’s a gender bias when it comes to how celebrity memoirs are received.

“Hot Bukowski” —Rolling Stone on Marnell

“A frank, unsparing, meticulous account of a life lived entirely on impulse, for pleasure, and for kicks” —Time on Kiedis

Oh, and if you’re reading this and in charge of greenlighting Red Hot Chili Peppers memoirs, can you please get John Frusciante working on his? Frusciante is known for talking at length about both his connection to spirits (he might already have a ghostwriter!) and his experiences with different dimensions and worlds. If there’s a book by a band member to be written, this is the one.

It is also impossible to talk about Flea’s book without mentioning the title, which comes from the song by a band called Too Free Stooges. A man can get away with calling his memoir Acid for the Children, while a woman certainly cannot. I would like to see Demi Moore title her memoir Whippets for the Wee Ones and see how far she gets. If I look at memoir titles by women on my bookshelves there is Hunger Makes Me a Modern Girl, by Sleater-Kinney’s Carrie Brownstein, The Girl in the Back by 1970s drummer Laura Davis-Chanin, Girl in a Band by Sonic Youth’s Kim Gordon, and Not That Kind of Girl by actress and Girls creator Lena Dunham.

All the titles mention “girl” as if there is a need to announce that early on and get it out of the way, before the book has even been opened. Let us compare these with titles of the celebrity memoirs by dudes that I own. There’s Life by Keith Richards, Slash by Slash, The Heroin Diaries by Nikki Sixx, and In the Pleasure Groove by John Taylor. I do not know what the pleasure groove is, but I do hope it is also the name of the kick-ass yacht in Duran Duran’s “Rio” video.

***

Acid for the Children is not the only recent celebrity memoir by a man to resist the traditional memoir style and not receive criticism for it, although in the case of singer and songwriter Prince’s The Beautiful Ones, named for the song from Purple Rain, it’s understandable why it lacks the typical style of a life story given that its subject died just one month after the book’s publication was announced.

“He wanted to write the biggest music book in the world, one that would serve as a how-to-guide for creatives, a primer on African American entrepreneurship and a ‘handbook for the brilliant community,’” said Dan Piepenbring, an editor at The Paris Review who was writing the book with Prince. Prince was notoriously private, to the point that reporters were not allowed to record their interviews, so many were surprised he would want to write his life story at all. He wanted his book contract to state he could pull it from shelves if he felt the work no longer reflected him, which just seems like a very Prince thing to do.

Prince had completed just 30 handwritten pages before he died of an accidental fentanyl overdose on April 21, 2016. The pages detailed his childhood and his early days as a musician. Piepenbring returned to Prince’s Paisley Park compound months after the singer’s death to find additional material that could be used in the book. This material includes personal photos, drawings, song lyrics, and a handwritten synopsis of Purple Rain, Prince’s 1984 film that marked his acting debut. The addition of personal artifacts to round out the story means The Beautiful Ones is more scrapbook than memoir. “The Beautiful Ones does not offer a clear-eyed view of who Prince really was — he would have hated that, but it illuminates more than it conceals,” reads The Washington Post’s piece on the memoir.

Reading reviews of The Beautiful Ones, I wondered if the book would even have been finished and released if Prince were a woman, or whether it would have been indefinitely shelved because of the death of its star. Maybe it would have focused on the singer’s drug use, final days, death, and the reaction to his death. The media has a way of making a female celebrity’s story about her death, not her life, and that framing was noticeably absent when the media talked about Prince and The Beautiful Ones. “It’s up to us to take what’s there and make something out of it for ourselves, creating, just as Prince wanted,” said NPR in their piece on the memoir.

Prince’s life ended with respect and a beautiful tribute in book form, and glowing reviews for it. This respect is definitely missing when we pay tribute to female celebrities who have died. Their deaths provide another opportunity for the media to pick them apart and let their scandals overshadow their contributions. Following Prince’s death there were no pieces like the gossip-heavy Vanity Fair piece from 2012 on the late singer and actress Whitney Houston, “The Devils in the Diva,” which “investigates Houston’s final days: the prayers and the parties, the Hollywood con artist on the scene, and the message she left behind.” Nor were there any of the at-times less-than-respectful movies made about female celebrities after their deaths, which focus more on their personal lives and troubles than on their art. Even in death, women like Houston and Amy Winehouse are still expected to bare all, even though they are no longer with us.

This year will give us new memoirs from actresses Sharon Stone, Priyanka Chopra Jonas, and Julianna Margulies, as well as singers Brandi Carlile and Billie Eilish. We are also getting a Stanley Tucci memoir, and I think we can all agree he is the sexiest bald man (sorry, Prince William). Women are not just turning to books, either: recent documentaries from the likes of Paris Hilton and Demi Lovato give female celebrities the opportunity to tell their truths, clear up misconceptions, and control the narratives around their lives. We can only hope the way these stories are received starts to change, and that women can be free to tell their stories the way they want to (embrace your inner Flea, ladies!) without fear of negative or sexist reviews, or questions about Ryan Adams’ artistic process. And please, no one ask John Mayer for his opinion.

***

Lisa Whittington-Hill is the publisher of This Magazine. Her writing about arts, pop culture, feminism, mental health, and why we should all be nicer to Lindsay Lohan has appeared in a variety of magazines.

Editor: Krista Stevens
Fact-checker: Julie Schwietert Collazo
Copy editor: Cheri Lucas Rowlands

The Ugly History of Beautiful Things: Lockets

Illustration by Jacob Stead

Katy Kelleher | Longreads | June 2020 | 19 minutes (4,853 words)

In The Ugly History of Beautiful Things, Katy Kelleher lays bare the dark underbellies of the objects and substances we adorn ourselves with.

Previously: the grisly sides of perfume, angora, pearls, mirrors, and orchids.

* * *

He wasn’t even two years old; a tiny thing, really, hardly even a person. Alfred was the ninth son of King George III and Charlotte of Mecklenburg-Strelitz, their fourteenth child. But his numerous siblings didn’t make Alfred any less beloved. Portraits of the boy show him as rosy-cheeked and handsome, with light eyes, a pronounced Cupid’s bow, and soft folds of neck fat. His royal parents loved him dearly, and when he died on the 20th of August, 1782, Queen Charlotte was said to have “cried vastly.” The king, too, was bereft. Later, when he went mad, he reportedly held conversations with his lost little boy and his brother, Octavius, who’d also died as a child.

Often, upon losing a family member, 18th century mourners would send the dead to their graves only after giving them one last haircut. They would harvest their locks to create elaborate weavings. Sometimes, the hair would be fashioned into floral wreaths. Sometimes, it would be made into jewelry. Frequently, the hair was plaited and pressed into lockets, which were then worn close to the heart. Prince Alfred didn’t have enough hair on his small blonde head for a weaving, but a tress did make it into a locket — a single soft curl. It sits behind glass, in a gold and enamel frame that displays the dates of his birth and death. The other side of the locket, a delicate piece of jewelry shaped like an urn, is decorated with seed pearls and amethysts. It is now part of the Royal Collection Trust. “Due to his age, there was no official mourning period for Alfred,” notes scholar and collector Hayden Peters at The Art of Mourning. “But his death came at a time of the mourning industry being a necessary part of fashion and a self-sustaining one in its own right.”

When it comes to mourning jewelry, there’s no piece quite like the locket. Whether urn, round, oval, heart, or coffin-shaped, it’s an item that telegraphs absence. I love is the message the locket sends. Or perhaps more accurately, I have loved. Even today, we understand that lockets are meant to show allegiance to someone who is not present, whether the loss is through death or just the general isolation of modern life. A grandmother might wear a locket with pictures of her far-away grandchildren. One half of a long-distance couple might keep a locket with a bit of their partner’s hair. I know a woman who wears a locket with a picture of her dead sister; she plays with it sometimes when she’s drifting in thought.

It’s a beautiful piece, but it’s impossible for me to divorce the beauty of the silver pendant from its significance. Once you know someone’s greatest wound, it’s hard to look at them the same way you did before. And once you know an object’s terrible provenance, it’s difficult to covet it without feeling at least a little guilty, a little angry at your own sinful schadenfreude.

Before the ritualization of mourning in the Victorian era, wearable containers were a discreet way to keep an item close, usually something that had significant personal meaning or an intimate purpose. These pendants, brooches, or rings were visible and sometimes highly ornate, but their contents weren’t typically meant for public consumption. As emotions have slowly become more public (and more performative), so too have lockets gone from being highly private objects to functioning as a means of displaying big sentiment in a socially acceptable way. Like generational trauma tap dancing through DNA strands, jewelry transports sentiment from one person to the next. It holds, in its tiny little chains and clasps, evidence of our most devastating emotions, from fear to grief to existential despair. It makes those things small, palatable, pretty. But in the shrinking of emotion, we run the risk of losing touch with the expansive and all-consuming reality of grief. We risk losing the opportunity to come together as a community, to hold not jewelry, but each other.

* * *

For as long as we’ve been aware of our bodies, we’ve adorned them. Adam and Eve donned fig leaves to cover their nakedness, and thus clothing was born. But we just as easily could have covered ourselves with other objects, for other reasons. It’s possible we wore furs to stay warm. It’s also possible we wore them to look cool. (We’ve come a long way, sartorially, from the hides-and-leaves days.)

If this conflates clothing and jewelry, it’s because the line between the two is actually quite thin. Clothing is typically made of fabric, leather, or fur, while jewelry is made of metal. Yet some jewelry is made of leather and fabric, and some clothing is made from iron and gold, so the difference isn’t about materials. It’s about function: Clothing covers and protects the body, jewelry adorns and enhances it. “Jewelry has been a constantly evolving product of its time for centuries, and looking at the styles of a particular age is a great way to discover where people’s heads were,” says jewelry historian Monica McLaughlin. “Over time, jewelry has served as a form of talisman or a personal item of reflection, as a way to support one’s country in a war effort, or as an outlet for people — rich or poor — to memorialize their loved ones or proclaim their latest enthusiasms. It really is a tiny, exquisite little window into history.”

The word locket, most likely derived from the Frankish word loc or the Norse lok, meaning “lock” or “bolt,” first appeared in the 17th century, but the concept of a diminutive, wearable container dates back much further. The earliest examples of container jewelry — a category that includes lockets, rings, bracelets, brooches, and even chatelaines, a kind of metal belt that allowed the wearer to carry keys, scissors, good luck charms, and a variety of small containers attached to one central decorative piece — come from the Middle East and India, though it’s proven difficult to tell exactly when or where the locket was born. Until recently, jewelry wasn’t as rigorously studied as other art forms, says Emily Stoehrer, jewelry curator for the Museum of Fine Arts in Boston. “Maybe it’s the materials,” she muses. Or maybe it has something to do with the newly gendered nature of jewelry (diamonds weren’t always a girl’s best friend, if you get my drift).

The Hathor-headed crystal pendant (Harvard University—Boston Museum of Fine Arts Expedition)

The Museum of Fine Arts has built up a substantial jewelry collection over the past century. One of the MFA’s most popular and most written-about items is the Hathor-headed crystal pendant, a piece that has been dated to 743-712 B.C.E. It’s also the earliest example of container jewelry that I’ve found, though I strongly doubt that it was the first of its kind. Just over two inches tall and an inch-and-a-quarter wide, it consists of a hollow crystal ball topped with a tiny gold sculpture of a serene, long-haired Hathor. The goddess wears a headdress featuring a pair of cow horns and a sun disc. Her face looks composed, kind, and brave — fitting, since she’s the deity of beautification, fertility, and a protector of women. Hathor, according to Geraldine Pinch, author of Egyptian Mythology, was “the golden goddess who helped women to give birth, the dead to be reborn, and the cosmos to be renewed.” Later, during the Greco-Roman period, she became known as a moon deity, and the goddess of “all precious metals, gemstones, and materials that shared the radiant qualities of celestial bodies.”

This pendant was found in the tomb of a queen who lived in Nubia. We don’t know what the crystal originally contained; the MFA website says it “probably contained substances believed to be magical.” Stoehrer doesn’t have much more to add, saying that it is “believed to have had a papyrus scroll inside it with magical writing that would have protected the wearer.” The mystery, she says, is part of the appeal. “People love the story of what might have been in it, what it might have said.”

According to Stoehrer, wearable prayers and early receptacle jewelry were created around the globe, but were particularly popular in “non-western” countries; historians have found evidence that people in ancient India and Tibet carried magical wardings on their bodies, pieces of prayers and words for good luck. Christians eventually began to wear small containers holding devotional objects a bit later, sometime in the Middle Ages. But some devoted followers of Christ weren’t satisfied with writing down a few words of worship and calling it a day. Instead, they hoarded pieces of people, bits of bone and hair and blood.

Relics are one of the grisliest forms of Christian worship. Although the belief in relics, defined by the Metropolitan Museum of Art as the “physical remains of a holy site or holy person, or objects with which they had contact,” has been part of the religion since its beginning, the trade in relics truly began to pick up steam during the reign of Charlemagne. According to historian Trevor Rowley, the body of a saint could act as a stairway to heaven, providing a “spiritual link between life and death, between man and God.” Relics were typically stored in decorative cases called reliquaries. Made from ivory, metal, gemstones, and gold, reliquaries had places of honor in churches, monasteries, cathedrals, and castles. The most revered relics were objects that Jesus or Mary had touched or worn (including purported pieces of the True Cross, his Crown of Thorns, or scraps of woven camel-hair believed to have been worn by Mary as a belt) but there are plenty of relics that belonged to lesser figures, like saints. Many of these aren’t lifeless objects like shoes or hats, but bits of hands and arms and hearts and legs. (There are also secular relics, like three of Galileo’s fingers, on display at the Galileo Museum in Florence, or the alleged 13-inch-long pickled penis of Rasputin housed at the Museum of Erotica in St. Petersburg, though these objects aren’t worshiped in quite the same way.) Since there are thousands of recognized saints in Christianity and it’s hard to tell one disembodied leg or desiccated kidney from another, there are a lot of possible relics out there to be unearthed, sold, and displayed.

Fascinating as these grim objects may be, they’re still less strange than the reliquaries once worn by medieval Christians. It’s one thing to inter a body in a church and allow visitors to pray over it on a Sunday, and quite another to take a fragment of finger bone, stick it in a tiny silver case, and wear it around your neck, but that’s exactly what people did. One personal reliquary housed at the British Museum, dated to 1340, is made from gold, amethyst, rock crystal, and enamel. Inside the colorful locket nestles a single long thorn believed to come from the holy crown. Many reliquaries held splinters of bone, though later analysis often found that the bone was unlikely to be from a saint (and sometimes wasn’t even from a human). Merchants sold reliquary pendants stuffed with teeth, hair, blood-stained fragments of cloth, drips of tomb oil, and other supposedly holy items. The practice continues to this day, but Peak Relic was during the Romanesque period, which ended around 1200 CE.

As the Middle Ages gave way to the Renaissance, container jewelry was used more and more often for mundane (and hygienic) purposes. There are many examples of people keeping scented materials in little wearable containers in attempts to mask their natural smells. Known as pomanders, from the French pomme d’ambre (apple of ambergris), these perfume balls were packed with musk oil, ambergris, and other less costly plant-based fragrances. The Metropolitan Museum of Art has ten in their permanent collection, including an incense ball from 13th or 14th century Syria and a skull-shaped pomander from 17th century England. There are intricate silver many-chambered balls and basket-shaped pendants that would have once housed fragrances like neroli, civet musk, ambergris, rose oil, and myrrh, a shell-shaped gold pendant that still has “traces of a red residue” inside its chambers, and even a pomander bead that was part of a devotional necklace or rosary and contained pictures of three female saints hidden behind spring mechanisms.

In late medieval and early Renaissance England, if you didn’t want to carry around perfume, you could pack your pomander with an opium-laced mixture known as “Venice Treacle.” (Opium was believed to be effective against the plague, so its usage was medicinal as well as recreational.) If you were really ambitious, maybe you’d wear a poison ring. It would be an easy way to defeat political rivals: Pour them a goblet of wine, flick the locking mechanism, and let the poison drop from your hand into their cup. Voilà, no more pesky Venetian cardinal or aggressive Flemish countess. According to legend, multiple members of the infamous Borgia family wore poisoned rings filled with cantarella, a custom concoction made by 16th century Italian merchants from either the juices of rotting pig entrails sprinkled with arsenic or the froth that accumulates on a poisoned pig’s mouth after it dies from arsenic poisoning — fables differ in the details.

Pomanders and poison rings weren’t truly that far from reliquaries in their design or their purpose. All of these things — saints’ bones, prayer snippets, rancid pig poison, sweet-smelling whale bile — were precious and private. They all afforded the wearer some sort of protection. Protection against the plague, protection against evil, protection against embarrassment. Even pomanders were about protection; it was often believed that illness spread through bad smells. According to the miasma theory, scents were a matter of life and death. A whiff of “bad air” could fell even the halest traveler. A pomander kept your smells from invading the rest of the world, and the world’s smells from infecting you.

There are examples of container jewelry from almost every era of human history and almost every corner of the globe. Perhaps there is something primal about our desire to squirrel away objects, to keep some precious little things on our bodies at all times. Maybe we need small things to feel big. I think, sometimes, that humans are drawn to things that are oversized and things that are terrifically undersized. Like Gulliver, we want to see worlds of both giants and manikins. We like dollhouses and lockets, giant nutcrackers and too-big wineglasses. These things remind us of childhood, and of dreams, places where reality is slippery and true faith is possible.

And maybe we hoard little parts of things in order to feel whole. Maybe prayers need something physical to attach to, hope needs something tangible to ground it, and grief a placeholder for an unspeakable absence.

* * *

Trends tend to grow slowly at first, bubbling under the surface of the collective consciousness. They simmer, sometimes for a few years, sometimes for a few hundred, until some precipitating event when suddenly, the once-obscure trend is everywhere.

Queen Elizabeth I Ring, c. 1560. Found in the collection of the Chequers Estate. (Photo by Fine Art Images/Heritage Images/Getty Images)

That’s how it was with mourning jewelry. Since the 16th century, people had been commissioning jewelers to make them little mementos for their lost ones, rings and bracelets and lockets like the Chequers Ring, which has been dated to the mid-1570s and was worn by Queen Elizabeth I. The gold locket ring is in the shape of an E and adorned with white diamonds, rubies, and mother of pearl. Inside is a secret compartment with two enamel portraits believed to represent Queen Elizabeth herself and her mother, Anne Boleyn, who was executed when Elizabeth was nearly three years old. Pieces like the Chequers Ring are thematic siblings to the memento mori jewelry that was popular at the time, which often featured jeweled coffins, delicate gold skeletons, and other macabre bits of shiny symbolism. But instead of reminding the viewer that they, too, will die, mourning jewelry reminded the viewer that the wearer had experienced a loss, that they harbored great grief. Perhaps it also reminded the wearer that they had a right to their sadness. Mourning jewelry made absence visible and tangible. It made sadness present on the physical body.

Queen Victoria didn’t come up with the idea of mourning jewelry, but she did mourn more visibly and publicly than anyone else had, or could. Following the death of her husband Prince Albert in December 1861, Victoria entered a state of permanent mourning. She had the means to grieve decadently, and she did. She didn’t have just one locket for Albert, but several. She wore these charms on bracelets and brooches, and around her neck. It was her style; according to historian Claudia Acott Williams, Victoria’s first piece of sentimental jewelry was a gift from her mother and contained a lock of her deceased father’s hair, as well as several strands of her mother’s hair. During her very public courtship and wedding, “She and Albert would mark so many of those ubiquitous human moments that endeared her to the public with jewelry commissions that were widely publicized in the popular press and subsequently emulated by her subjects.” After Albert was gone, Victoria commissioned a gold memorial locket made with onyx and diamonds. Around the outside of the pendant, enamel letters spell out Die reine Seele schwingt sich auf zu Gott (“the pure soul soars up to God”). Inside, she placed a lock of Albert’s brown hair and a photograph of her deceased love. Victoria left instructions that, upon the occasion of her death, this locket be placed into Albert’s Room at Windsor Castle and left on display. It must have meant so much to her, that locket. It must have felt like a piece of her broken heart, an emotional wound made wearable and beautiful.

People of all socio-economic strata wore mourning jewelry of some kind. After all, you didn’t need to use costly gems; you could just give the deceased a post-mortem haircut and use the strands to create a bracelet or a ring. Some jewelry even featured bones in place of jewels (Victoria had a gold thistle brooch set with her daughter Vicky’s first lost milk tooth in place of the flower), though this wasn’t nearly as common as jewelry that featured woven, braided, or knotted hair. “If you’re poor, you wouldn’t have access to photography. That’s too expensive,” says Art of Mourning’s Peters. “But you could cut your hair off and pop it in a locket and give it to someone you love. That way, you can be with them always.”

Peters also notes that many jewelers trying to capitalize on the trend played a bit fast and loose with the sources for their hair weavings. Sometimes you’d go to a craftsperson and ask that a locket be made with your beloved’s hair, and you’d return home with a piece made from their hair — and then some. “A lot of the hair they used was from nunneries,” he explains. Some customers knew that the hair was being supplemented, but not everyone was aware of this practice.




Even more disturbing to Peters was the role that advertising played in the promotion of mourning goods and rituals. “Exploitation of death through grief is as certain as death itself,” writes Peters in an essay published in A Miscellany of Death & Folly. “In particular, fashion has been a focal point through which death has been exploited, due to its highly emotive nature.” Department stores stocked solely with mourning paraphernalia began to pop up. Peters makes it clear that these items weren’t necessarily all that personal. Often, each mourner who attended a funeral would be gifted a simple ring, and people tended to judge the lives of their peers by the type and quality of jewelry they left behind for grieving friends and neighbors.

The sentimental jewelry trend wasn’t confined to Europe. It was also fashionable in America to wear hair brooches, silver lockets, and other personal pieces. After the Industrial Revolution, people from most social classes could buy mass-produced lockets, which they could then fill with photographs of their beloved or bits of their hair. Many of these were made in Newark, New Jersey, the jewelry manufacturing capital of the United States. The industry got its start there in the early 1800s, and by the late 1920s, Newark was producing 90 percent of the 14-karat gold jewelry in America. Alongside the full-color images of filigree gold pendants and colorful “fruit salad” bracelets and the essays about the shifting trends in American consumerism, The Glitter & The Gold: Fashioning America’s Jewelry tells tales of abuse and exploitation. Though the journeyman jewelers were fairly well paid, conditions in factories were generally grim and child labor was commonplace. Paid far less than their male coworkers, girls were often employed to do the most precise handwork, like fashioning gold watch chains or hand-painting enamel, because of their thin and dexterous fingers. “The jewelers work, in all its branches, is particularly trying to the eyes, and it not infrequently happens that defective sight compels men to abandon the trade,” reported the chief of the state’s Bureau of Statistics of Labor and Industries around the turn of the twentieth century. Smead adds that “respiratory disorders were also common — common enough to be the leading cause of death among jewelers.”

* * *

By the time the Civil War broke out, many middle-class Americans were purchasing costume and fine jewelry that was made in Newark (though often factories would mark their goods “London” or “Paris,” since U.S.-made items wouldn’t come into vogue for another fifty years). Lockets, heart-shaped and oval, were particularly popular during this socially chaotic period, and showed up frequently in literature and art. It was common practice for soldiers and their sweethearts to exchange sentimental trinkets before the man marched off to battle. A posthumously published and mostly forgotten short story by Kate Chopin makes one such piece a central player: “The Locket” switches perspectives between a young Confederate soldier and his sweetheart. He wears a locket, given to him by his girl at home, which he refers to as his good luck charm. After the battle, the same gold necklace is plucked off a corpse and mailed to the girl, who assumes that her love was killed. At the end, he returns home to find his lover dressed all in black. Another boy had died, one who stole the locket believing that its “voodoo” would keep him alive. Our ersatz hero lives, thank the gods of love.

It’s a sentimental story about a sentimental piece of jewelry, and I can’t say I liked it much. It reminds me of a Nicholas Sparks story, or a Thomas Kinkade painting, or any other corny, sappy work of art. It drips with tears and snot. It has a hollow core: too much emotion, not enough meat. The story is set up as a tragedy, but at the last minute, Chopin pulls the rug out from under the reader and wraps them in a cozy blanket. Here, she says, here is what you wanted.

As for the boy who died? Well, we’re not supposed to think hard about him. Surely he deserved to die, for he was a thief and a coward. Like most sentimental works, it follows pat beats: a problem is set up, an exchange happens, a resolution is reached. In the end, the titular locket is revealed to have had no power — except to trick the woman into believing her love was lost, and perhaps to trick the robber into thinking he was safe on the battlefield.

That’s the dirty heart of the story. Maybe it’s not about the character’s great love, but the reader’s great fear. Fear that there is no protection from death, that there is no charm to keep away loss. Fear that unlike the boy in the story, your boy won’t come back.

Twenty-first-century mourning has gone in two very different directions. It’s either become entirely intangible or deeply physical, almost to an obsessive degree. There are online guest books to mourn the dead, ghostly Facebook pages that live on “in legacy,” and online grief support groups, or you can buy diamonds made from the hair and ashes of a dead loved one. “Cremation diamonds are forever since they are diamonds made out of human ashes,” reads the website for Lonité, a Switzerland-based company that pressurizes the carbon-rich remnants of a body in order to “grow” amber-colored jewels that start at $1,250 per quarter-carat, significantly less than most mined diamonds but slightly more than the average lab-grown diamond. Other companies will turn your ashes into glass beads or encase them in clay or metal. And while hair jewelry isn’t quite as fashionable as it once was, there are still hair artists who can weave a lock of hair into a keepsake.

It’s tempting to conclude that the ugliest part of lockets is what we put inside them — the poison, the remnants, the evidence of adultery, and the perfumed animal oils. But I think the worst part is how desperately we try to shrink down our emotions, to make them small and private and containable. Instead of sharing our fears aloud or wearing our sadness on the surface, we place it into jeweled containers, objects that latch and close and can be tucked under the shirt, inside the dress. We sublimate our emotions, turning gray flat ashes into brilliant, sparkling diamonds.

“If we can be called best at anything,” writes mortician and author Caitlin Doughty in From Here to Eternity, “it would be at keeping our grieving families separated from their dead.” She goes to a village in Indonesia, where dead bodies are paraded through the streets while mourners keen and wail and cheer; to Mexico, where mummies sit on altars waiting for families to come and give them gifts; and to Japan, where family members visit a high-tech crematorium to gather up fragments of their lost and loved with chopsticks. To Americans, she admits, these customs may seem disrespectful. But they are not. They’re ways of working through grief. Giving mourners a task grants them purpose and a sense of control. Giving mourners a public space to celebrate their dead offers much-needed moments of physical and emotional catharsis. Giving mourners access to the dead body provides a sense of closeness and closure.

American culture lacks these rituals. Instead, we have single-day funerals. We have mass-produced headstones, mass-produced urns, mass-produced lockets that allow us to minimize loss without moving through it. There is no federal law that grants paid bereavement leave, not even for the death of a spouse or a child. Your interior world may have collapsed, but you are still expected to prove your worth. Grieve, but be productive.

Peters argues that hair art isn’t morbid, but rather a healthy sign that people can “live with” grief. I’m not so sure. I tend to agree more with McLaughlin, who stresses the locked-away part of the locket. “Lately, I feel like everything is about control,” says McLaughlin. “The world is bursting into flames around us and there’s basically nothing we can do about it, so instead we cling harder to the tiny things that mean something to us.” And maybe, she adds, the act of keeping these things “close and hidden away from others heightens that feeling of safety and control.” We don’t come together and howl in grief. We don’t keen at the sky or wail around the pyre or hold our dead tightly and brush their hair.

I have a cousin who died young by suicide. He was a few years older than me, and I spent the first sixteen years of my life looking up to him. He painted his nails with sparkly blue polish and dyed his hair black. He could do an incredible Irish accent. He took drugs and defended me from the worst abuses of my older brother. He was protective of me, and I loved him for it. I have very few memories of the funeral. I was deep in a depression of my own, and hadn’t yet discovered the value of medication. Many of my memories from those years are foggy and insubstantial, clouded by grief, marijuana, and hormones. I sometimes re-read the guestbook at Legacy.com, where people write him messages. I receive email alerts when new posts are added. I am glad it exists, but it feels terribly incomplete. In grief, everything feels incomplete.

I do not have a necklace with a locket holding his dyed hair, but I do have a tiny little pill container that attaches to my key ring. In it, I have three pills. They soothe me, they calm me, they give me a sense of control. It’s with me at all times. I have often dared to imagine a world where I didn’t need them. Where I could cry in public, wail on the street, get snot and tears on my good clothes. Where I could allow emotions to be as big as they needed to be. Until then, I have my version of the poison ring, the pomander ball, the little locket, designed to protect. Designed to contain.

* * *

Katy Kelleher is a freelance writer and editor based in Maine whose work has appeared in Art New England, Boston magazine, The Paris Review, The Hairpin, Eater, Jezebel, and The New York Times Magazine. She’s also the author of the book Handcrafted Maine.

Editor: Michelle Weber
Factchecker: Matt Giles

How Four Americans Robbed the Bank of England

The Great City Forgeries: Trial Of The Accused At The Central Criminal Court. Austin Biron Bidwell; George Macdonnell; George Bidwell; Edwin Noyes; Henry Avory, Esq., Clerk Of The Court; Mr. Justice Archibald; Alderman Sir W.R. Carden. 1873 Engraving. (Photo by: Universal History Archive/Universal Images Group via Getty Images)

Paul Brown | Longreads | June 2020 | 22 minutes (5,961 words)

On April 18, 1872, Austin Bidwell walked into Green & Son tailors on London’s renowned Savile Row and ordered eight bespoke suits, two topcoats, and a luxurious dressing gown. Bidwell was 26 years old, 6ft tall, and handsomely groomed with a waxed mustache and bushy side-whiskers. If the accent didn’t give it away, his eye-catching western hat marked him out as an American — a rich American. London tradesmen called Americans with bulges of money in their pockets “Silver Kings,” and they were most welcome in upmarket establishments like Green & Son, which charged as much for the strength of their reputations as for the quality of their goods.

Read more…

In Search of Etty Hillesum

WikiCommons / Photo illustration by Katie Kosma

Elizabeth Svoboda | April 2020 | 16 minutes (4,136 words)

It’s the eve of the summer solstice, a time when evening feels like high noon and people buzz with unearned adrenaline. I’ve spent all day on the streets of Amsterdam, but I still need to make one last pilgrimage — to the home of Etty Hillesum, a Jewish diarist and radical altruist whose finest hour came as she approached her death at the hands of the Nazis.

While in Amsterdam years ago, I visited the hiding place of Etty’s young counterpart Anne Frank. Nowadays, you can’t just show up to see the Anne Frank House: You have to reserve your ticket in advance, and the lines snake around the block. Etty’s home, by contrast, is easy to miss, tucked into a row of humble red-brick flats on the first block of Gabriel Metsustraat. There are no lines, no advance reservations, and you can’t go inside, because it’s a private residence. All that distinguishes the building from its neighbors is a plaque by the front door: In this house, Etty Hillesum wrote her diary, 1941–1942.

On the second floor of Etty’s home, a generously paneled bay window opens onto the city. From this window, Etty would have had a sweeping view of the Museumplein, a rolling expanse of green that now hosts an ongoing parade of festivals and sporting events. As Etty’s world narrowed under an onslaught of Nazi decrees, she was able to drink in this view almost to the last, marred though it was by park benches on which no Jews were permitted to sit. Though most of today’s park visitors have gone home, the strains of a global summer anthem float across the open space: 

… All the bad things disappear

And you’re making me feel like maybe I am somebody…

Read more…

This Week In Books: I Bought Some Books

Soldiers read books while maintaining social distancing due to the coronavirus (COVID-19) pandemic at Foca Transport and Terminal Unit in Izmir, Turkey on April 29, 2020. (Photo by Mahmut Serdar Alakuş/Anadolu Agency via Getty Images)

Dear Reader,

My concentration is pretty much shot. So I have to confess I haven’t gotten very far into A Distant Mirror. I’ve mostly been playing Unciv on my phone and watching Devs and making curry and cleaning out the closet and periodically tweeting at A24 that I would really like to watch First Cow now and feeling slightly removed from my body. But that hasn’t stopped me from ambitiously and somewhat compulsively ordering even more plague books: The Great Mortality (about the black death) and The Great Influenza (about the 1918 flu, of course) from The Book Table in Oak Park, Illinois; Asleep (about the mysterious pandemic of “sleeping sickness” that followed on the heels of the 1918 flu) and The Ghost Map (cholera) from The Bookstore at the End of the World; Pox Americana (smallpox) and Epidemics and Society (all of them!) from Community Bookstore in Brooklyn. (I also ordered Joan of Arc In Her Own Words from Split Rock Books in Cold Spring, New York, but that’s related to an entirely different phase I’m going through.)

I’m not sure what I expect all these plague books to achieve. Will I read them all? Probably not. Will they all sit on my desk talismanically protecting me from getting sick? Of course, but that goes without saying. Will they make me feel more or less anxious? TBD, I’ll let you know.

Ordering the books was a circuitous choice for me because I’ve been having some trouble coming to grips with the fact that the American lockdown fell so short of what it should be; that we began talking about reopening before we ever, it seems to me, fully closed. All these bookstores I ordered from are places I used to work or are owned by friends of mine, and I know they’re doing their best to keep themselves and all their employees safe and paid (though The Bookstore at the End of the World is a Bookshop site begun by a group of bookstore employees who were covid-furloughed by their employers). What that means, practically, is that because none of these stores have employees on site, all of these orders were fulfilled “direct,” which, in the rarefied parlance of bookselling which I know from my years in the business, means they were shipped directly to me from one of the wholesaler’s warehouses (the bookstores get a cut of the sale, although a smaller cut than normal). The wholesaler in this case — in all cases, as far as I know, including orders placed through Bookshop — is Ingram, the behemoth book distributor rivaled in reach only by Amazon and owned by the billionaire Ingram family. Early on in the pandemic, as lockdown began rolling across the country, I thought for certain that the warehouses themselves would soon close — not just Ingram and the smaller regional wholesalers, but the publishers’ warehouses as well, not to mention the printers! I thought the whole industry would have to, at least momentarily, pause. But while many publishers have pushed back the release dates for their spring titles and laid off employees (so that’s not going well) and one major printer has closed (while another has filed for bankruptcy, so that’s not going well), the major publisher warehouses themselves, as far as I can tell, have stayed open — with social distancing measures in place, of course. 
(The situation at the Big Five publishers feels a little opaque to me, but smaller publishers/distributors such as Small Press Distribution, a longtime distributor of micro presses, have been clear about their need to raise money.) Ingram, meanwhile, has been considered essential throughout the country during the pandemic and its warehouses have remained open and shipping direct to customers (as well as, of course, to stores in states where things like curbside pickup and receiving/shipping in and out of the store are still allowed — Point Reyes Books in California made an excellent video of what that looks like).

And so, what I’m trying to get at is that in the beginning of the pandemic I thought the best way to support bookstores was to order gift cards and donate to fundraisers (special shout out to Unnameable Books in Brooklyn and The Seminary Co-op Bookstores in Chicago) or maybe order audiobooks or ebooks if that is your thing (though independent bookstores earn somewhat slim percentages of those sales, when they are able to offer them at all), convinced as I was that any sort of physical shopping would be tantamount to forcing warehouse and postal workers to endanger themselves, and that those warehouses would soon close down anyway! But I suppose that lately, despite few if any tangible signs that the spread of the virus has begun to decline in America, I let the growing narrative that “corona is nearly over now” and “the country is reopening soon” seep into my brain. And so, to be frank, I ordered some extremely nonessential stuff.

I guess I stopped expecting that the book warehouses would shut down. I stopped expecting the peak and have settled for the plateau.

But I’m sitting here staring at this copy of Epidemics and Society, which has already arrived and which I have set in a “decontamination pile” because we’re running low on disinfectants in my apartment, and I’m wondering, if I’m afraid to touch it, should I really have had someone send it? It’s a ghoulish feeling.

When the pandemic was starting, my feed was full of people tweeting about buying Nintendo Switches, so I mean, I’m aware that I’m not the only person in the world to buy something nonessential during the pandemic. I guess it’s possible I’m just being overwrought, here.

But it still seems like something is fishy about all this. I still feel like a ghoul. I feel like we have settled for a rolling epidemic until (purely theoretically!) herd immunity is reached, but we are doing it without admitting that that’s what we are doing — or acknowledging who will suffer for it (prisoners, warehouse workers, grocers, nurses!). And business owners are being forced into this mass-casualty scheme because federal and local governments refuse to provide financial relief.

So, yeah, I have no idea where I’ve landed here. Am I a ghoul for buying all these plague books? I mean, ok, yes; we all know the answer is yes.

I’m a ghoul with just enough plague books to tide me over until the second surge.

1. “The Pre-pandemic Universe Was the Fiction” by Charles Yu, The Atlantic

Sci-fi writer Charles Yu weighs in on reality. “Years ago, I started writing a short story, the premise of which was this: All the clocks in the world stop working, at once. Not time itself, just the convention of time. Life freezes in place. The protagonist, who works in a Midtown Manhattan high-rise, takes the elevator down to the lobby and walks out onto the street to find the world on pause, its social rhythms and commercial activity suspended. In the air is a growing feeling of incipient chaos. I got about midway through page 3 and stopped. I didn’t know what it meant.”

2. “What Rousseau Knew about Solitude” by Gavin McCrea, The Paris Review

Novelist Gavin McCrea writes about Rousseau’s lonely years, noting that the thinker’s Reveries of the Solitary Walker are haunted by the society they seek to avoid. “Looking at himself through the eyes of society, he is ‘a monster,’ ‘a poisoner,’ ‘an assassin,’ ‘a horror of the human race,’ ‘a laughingstock.’ He imagines passersby spitting on him. He pictures his contemporaries burying him alive. Rumors about him are, he believes, circulating in the highest echelons: ‘I heard even the King himself and the Queen were talking about it as if there was no doubt about it.’” This version of Rousseau sounds, to me, pleasantly like a morose Twitter poster. It just feels very familiar. I feel like I could scroll through Twitter right now and see some defeated soul posting that if they ever walk in public again, they will be spit on and the Queen will hear about it.

3. “Creation in Confinement: Art in the Age of Mass Incarceration” by Nicole R. Fleetwood, The New York Review of Books

An excerpt adapted from Nicole R. Fleetwood’s Marking Time: Art in the Age of Mass Incarceration, in which she surveys art created by incarcerated people or made in response to incarceration. Fleetwood describes the unique challenges of documenting prison art: “…many of the artists, whether currently or formerly incarcerated, do not have possession of their art, nor any documentation of their work, nor knowledge of how and where their art has circulated… art made in prison may be sent to relatives, traded with fellow prisoners, sold or ‘gifted’ to prison staff, donated to nonprofit organizations, and sometimes made for private clients. There are people I interviewed who described their work and practices to me but had nothing to show.”

4. “The Exclusivity Economy” by Kanishk Tharoor, The New Republic

Author Kanishk Tharoor reviews Nelson D. Schwartz’s The Velvet Rope Economy: How Inequality Became Big Business, an exploration of the byzantine hierarchies that have emerged in all manner of consumer-facing industries to separate the wealthiest customers from the chaff. “What these changes augur, in [Schwartz’s] view, is the crystallization of a caste system in the United States and the birth of a new aristocracy.”




5. “Gay Literature Is Out of the Closet. So Why Is Deception a Big Theme?” by Jake Nevins, The New Yorker

Jake Nevins surveys recent queer fiction and finds that deception is a major theme, even when it’s not explicitly the deception of the closet. “For much of the 19th and 20th centuries, from Dorian Gray to Tom Ripley, the lie of the closet was the hinge upon which queer literature would pivot, reflecting what were then the often judicial or mortal costs of being openly gay. Insincerity, ‘merely a method by which we can multiply our personalities,’ as Dorian Gray put it, was the mode of congress gay men had been taught to adopt for the sake of self-preservation…”

6. “The Surreal Stories of ‘Lake Like a Mirror’ Show How Power Distorts Reality” by YZ Chin, Electric Literature

YZ Chin interviews Chinese Malaysian author Ho Sok Fong about her short story collection Lake Like a Mirror, recently translated from the Chinese by Natascha Bruce. Ho says her stories try to reflect the way the exercise of power distorts reality. “I think a surrealist style can twist the surface of a reality that presents as neutral. Then we can see reality as a screen that has been yanked askew, and its seemingly solid surface starts to be pulled apart. Through this we realize that reality can be distorted by power. This isn’t something realism can achieve.”

7. “What if, Instead of the Internet, We Had Xenobots?” by Garth Risk Hallberg, The New York Times

In his review of the long-awaited second novel from Adam Levin (author of the widely lauded 1,000-page high school bildungsroman The Instructions), Garth Risk Hallberg writes that “Levin can make the kitchen-sink ambition of (mostly white, mostly male) midcentury postmodernism feel positively new.” His latest book, Bubblegum, is about “a novelist-cum-memoirist-cum-unemployed schlub named Belt Magnet, of the fictional Chicago suburb of Wheelatine, Ill.” who can “hear the suicidal pleas of certain inanimate objects through a telepathic ‘gate’ above his right eye” and was one of the first patients therapeutically paired with a “botimal” aka “a mass-produced… velvety soft, forearm-length, ‘…flesh-and-bone robot that thinks it’s your friend®!’”

8. “No Sleep till Auschwitz” by Jeremy M. Davies, The Baffler

New fiction from Jeremy M. Davies, author of The Knack of Doing, presents a fictionalized publishing industry that is — purely fictionally speaking, of course! — terrible. “Drucksteller saluted the long con of literature by way of the time-honored method of stealing a ream of copy paper and not flushing the toilet on his way off the estate.”

Stay well,

Dana Snitzky
Books Editor
@danasnitzky
Sign up here

Little League, Revisited

Photo courtesy of the author / Getty / Little League World Series / Photo illustration by Longreads

Adam Kuhlmann | Longreads | April 2020 | 17 minutes (4,265 words)

It’s a cold, gray morning in late December, the week that sags like bunting hung between Christmas and New Year’s. I pull my mother’s Subaru alongside a large cinder block building identified only by a street address peeling from a rust-pocked and dented steel door. I see no functional windows, just a few square cavities that have been boarded up from the inside.

My wife, Mysha, eyes the grim façade from the passenger seat. “Is it strange,” she asks, “that Chase takes lessons inside a commercial slaughterhouse?”

Chase is my nephew, an 11-year-old with the eyelashes of a Hollywood starlet and a penchant for neon athletic wear. During our annual holiday visit to my Virginia hometown, he had invited us to watch him pitch and hit baseballs for an hour, under the tutelage of a private coach.

“It gives him a leg up,” my sister had told me the previous night after Chase went to bed. Perhaps sensing my skepticism, she explained the nature of today’s competitive child-rearing: how all of a kid’s activities — from his first birthday party to his college admissions — must be coordinated and enhanced, for a fee, by biologically unrelated adults.

At 39, with no plans to father a child myself, I am free to pass judgment on all manner of parental behavior without worrying that, one day, I’ll have to admit I was wrong. So, I reminded my sister about the 1990s, when the most we’d hoped for was piano lessons. As for getting into college, I told her about the Friday night before I took the SAT. I’d stayed up late, crowding around Betsy Newman’s backyard fire pit. I’d joined a boozy, a cappella rendition of Blind Melon’s “No Rain.” My test prep had consisted of just saying no to the nozzle of a can of Cool Whip, a triumph of restraint I’d managed without a glance of adult supervision.

My sister patiently absorbed my nostalgia. Then she added: “Chase wants this too. He loves baseball.”

I couldn’t argue with Chase’s results. Last summer he’d been selected for the all-star team of his neighborhood little league. My sister sent us photos of the boys celebrating at a local Mexican restaurant. In one close-up, Chase’s arm is draped over the shoulder of a boy with the same tousled hair spilling from the same star-spangled hat. With the other hand, he is slugging a yellow concoction from a goblet the size of a table lamp.

Looking down at her phone, Mysha confirms the address, so we slip into a small parking lot in the back of the building. Though it’s no more welcoming than the front, at least we find no sign of doomed Angus cattle.

Inside, the facility’s décor hews to jock brutalism. Forty feet above us, fluorescent lights hang from metal beams, filling the cavernous room with a stadium’s ice-blue brightness. The atmosphere is warmed only by the sound of classic rock rattling from speakers bolted to the walls. Black netting curtains off a pair of batting cages, where a few stocky teens hack at soft tosses. The floor is covered in green artificial turf studded with five-gallon buckets, around which cluster litters of scuffed baseballs.

I spot my brother-in-law, Clay, seated with two other men whose buzz cuts and taut expressions would fit in on the bridge of a naval destroyer. They lean forward from metal folding chairs, studying the ritualized movements of their boys. Nearby is a makeshift pitching mound, where I spot Chase moving into his windup: a fluid and compact gathering of 100 pounds of muscle and bone. His pitch sails high, pulling out of his catcher’s crouch a college-aged man in gray sweats. His bottom lip is swollen with tobacco, and he pauses to discharge a brown stream into a soda bottle before offering my nephew a blunt appraisal: “You’re overthrowing again. What happened to your release point?”

Chase cocks his head thoughtfully. “I forgot to reach out with it.”

“Right,” the coach says, demonstrating with his own right hand before returning a dart to Chase’s glove side. “Fix it.”

In his plush suburban home, Chase is a merry prankster. When he was 4, he stood on the carpeted mezzanine, reached his hand between two wooden balusters, and dropped an untidy sock onto the face of my sister, napping on the sofa below. Here, in this Spartan box, Chase’s aim is nearly as true — but he is all business.

We slide in, and the fathers stand to make room for us in the self-consciously gallant way of Southern men. And suddenly I recognize that I am easily the smallest person in the seating area. This includes my wife, who at 6-foot-1 dwarfs me in a way that attracts stares in public.

Out of the corner of my eye, I track a wide throw that tips off Chase’s glove and bounces once on its way toward our congregated shins. I bend and manage to spear it with my right hand.

One father draws out a whistle through his teeth.

“Once a second baseman, always a second baseman,” Clay says.

I toss the ball back to Chase, who registers the deed — and our presence — with a stoic little nod.

“College ball?” asks the other father.

Before I can laugh, say “no,” and explain that this catch had been the most graceful maneuver I’d accomplished in 20 years — indeed, I’d just tweaked my back and would require, this evening, a liberal application of Tiger Balm — Clay jumps in.

“This guy played in the Little League World Series!”

I wince.

Read more…