Erin Blakemore is a Boulder, Colorado-based journalist. Her work has appeared in publications like The Washington Post, TIME, mental_floss, Popular Science and JSTOR Daily. Learn more at erinblakemore.com.
There are two events that can define a separation of generations: Where were you when Princess Diana got married? Where were you when she died?
I was a tiny toddler sitting on my young mom’s lap for the first, an awkward 17-year-old for the second. San Diego’s Starlight Musical Theatre was in the middle of a production of Singin’ in the Rain and my job was to get costumes onto cast members before they hurtled out onstage.
Somehow I learned she was dead during the performance, in the time before widespread cell phones or internet. News spread fast, through the usual backstage channels, in whispers and passed notes. The busy dressing rooms were oddly quiet. People danced off stage and started crying in the wings. Downstairs, near the costume shop, they used the pay phone to find out details from friends.
The world seemed stunned, half silent. But why? Why did we spend the next few days glued to the television and the radio? Why did we leave flowers and sing songs and feel personally affected by a woman few knew and even fewer ever understood? Who was this bashful princess, anyway? This reading list contains a few answers—but 20 years after her death, the enigmatic Diana is harder to grasp than ever.
Say goodbye to those red sidewalk boxes — and a slice of American literary greatness. Since 1955, the Village Voice has been a ubiquitous part of New York City culture. In a half century it was transformed from a counterculture rag to a longform powerhouse rooted in the character and the color of the city.
This week, the current owners of the Voice announced the end of the era: The free print edition of the paper is finished. Once available on every street corner, it will now be online only. In their write-up for The New York Times, John Leland and Sarah Maslin Nir mourn the paper’s once inescapable presence: “Without it, if you are a New Yorker of a certain age, chances are you would have never found your first apartment. Never discovered your favorite punk band, spouted your first post-Structuralist literary jargon, bought that unfortunate futon sofa, discovered Sam Shepard or charted the perfidies of New York’s elected officials.”
The Village Voice was the first paper you grabbed on the way to the subway, the last thing you grabbed at night for the long ride home. It redefined the alt-weekly and introduced readers to a new kind of journalist and critic. If the Voice was the first place you were published, then you were on the way to a brilliant career. Here are some of our favorite moments of brilliance.
When you think of zombies, it’s likely you envision something like the flesh-eating, immortal creatures created by George Romero, who defined a new genre of horror with Night of the Living Dead and Dawn of the Dead. Thanks to Romero, who died this week at the age of 77, the zombie movie has become more than a chance to feel scared. It’s also an essential lens through which we can view pop culture, politics, and society. In honor of the great director, here is some of our favorite writing about the terror of the living dead.
One of Romero’s most famous narrative coups was casting a black actor as the hero of his 1968 film, Night of the Living Dead. It was a decision that turned a run-of-the-mill horror movie into a complex commentary on the civil rights movement, and imbued other zombie films with the ability to criticize society.
The thing about good zombie fiction (and I say this as someone who enjoys an awful lot of zombie fiction) is that the zombies are never the most horrific thing. Zombies don’t typically have the capacity for complex thought — they don’t execute stratagems, play politics, torture people. All they do is feed. The true horror in any zombie story worth its salt is what other people do when faced with the zombie threat. Zombies are merely relentless; humans can be sadistic.
Cameras snap, laptops click, recorders flip on and reporters lean forward. On one side, the White House Press Secretary; on the other, the media — gladiators of free speech or mad dogs out for blood, depending on how you see them. The great American press briefing is an ecosystem with its own traditions and its own inscrutable rules, one that has survived, in one form or another, for more than a hundred years. Under the Trump administration, the White House press briefing may not survive the summer.
It’s easy to forget that the modern press briefing — in which a member of the government routinely meets with select members of the press — is a relatively new custom in the history of the presidency. It’s also easy to forget its informality has always been an illusion.
In 1989, during a performance of Hamlet at the National Theatre, Daniel Day-Lewis walked off the stage. Like Hamlet, he claimed, he’d seen his father’s ghost. He never took to the stage again. With this week’s announcement that Day-Lewis is retiring from acting, it looks like his film days are over, too. And when Daniel Day-Lewis commits to something, he really commits.
Cue the public mourning for one of our most dedicated actors, a man as famous for avoiding the cameras as he is for standing in front of them. Day-Lewis embodied Acting with a capital A, embracing all of its finicky pretense. The end of his career may also be the end of an era for the great method actor — and the brilliant, if reluctant, male movie star.
Dear old Dad. To hear retailers tell the story, he’s a transparent creature, someone who is pleased by the simple things: a shirt, a book, a steak, a new gadget. But the dads most of us grew up with — and without — are a more inscrutable lot. They’re people, after all, whose past lives, present concerns, and future legacies can vex, perplex, and frustrate their children. Can we ever really know these men? Some of the best writing about dads embraces that mystery, confronting the hard questions of what it truly means to know one’s father.
Seven thousand, three hundred days. Twenty years. Judging by the response to the release of Arundhati Roy’s long-anticipated follow-up to her first novel, 1997’s The God of Small Things, you’d think it had been two hundred. Reviews of The Ministry of Utmost Happiness are almost as ecstatic as the ones that accompanied Roy’s first book — and they almost always include a lament that it took her so damn long to produce.
The God of Small Things received a Man Booker Prize, bestseller status, and a whirlpool of accolades, but after its publication, Roy opted out of fiction altogether, pursuing a career as a political activist-cum-reporter: unearthing the stories of society’s rebels and outcasts, advocating for a non-nuclear India and the independence of Kashmir, and criticizing Prime Minister Narendra Modi.
How dare she?
That’s the underlying question in nearly every interview with Roy that’s followed. Who wouldn’t give just about anything for a fawning debut New York Times book review, a public clamoring for the next book? Doesn’t she owe her readers another glimpse into her imagination? Read more…
They came in the tens of thousands, pushing baby carriages and packing roller skates. All in all, an estimated 200,000 pedestrians crossed the Golden Gate Bridge on May 27, 1937, its first day in business. The bridge was already a San Francisco landmark—a flaming, burnt-orange beacon conceived a decade earlier by Leon Moisseiff, who had engineered the Manhattan Bridge. It was a graceful design, but suspension bridges still weren’t entirely safe—the engineer’s Tacoma Narrows Bridge would fail spectacularly only a few months after it opened in 1940.
The Golden Gate also has a dark side. To afford a view of the city, the bridge has a low barrier that is easy to scale. (In “Jumpers,” the New Yorker’s Tad Friend meditates on the bridge’s reputation for death—for the families and friends of those who succeed in their jumps, it’s an indelible monument to their loved ones’ pain.) This month, city workers will finally begin the installation of a new barrier, a grey netting that will blend into the water without obscuring the view. Officials hope it will finally reduce suicide rates on the deadly bridge.
She means well, but I dread the dental hygienist. The judgmental tone in her voice is probably just exhaustion; the only dentist I can afford to see has an office that’s in a perpetual spin of budget-seeking patients. I’m one of scores of people who’ll sit in her chair today, and whenever I leave, I hear someone standing at the dreaded reception desk trying to argue their way out of a bill in an embarrassed tone.
Sometimes I’m in that corner too, wheeling and dealing for a way to swing basic treatments with money I don’t have. To my shame, I often go months or even years between routine cleanings, opting to spend money on debt or bills or food instead. Read more…
Yes, it was only last week—nine days to be exact, but who’s counting?—that President Trump committed the historical equivalent of hurling a live grenade into a crowd when he ventured into an improvisational analysis of the Civil War during an interview on Sirius XM radio. “People don’t ask that question, but why was there the Civil War?” he said to the interviewer. “Why could that one not have been worked out?” It was a comment that poked the bee’s nest of public opinion and pushed the Civil War back into feverish public debate.
It’s been easy to dismiss President Trump’s comments as ignorant non-sequiturs or a childish attempt to divert attention from more pressing political issues. After all, there’s an entire field of inquiry devoted to asking exactly those questions about the Civil War, and scholars have devoted their lives to that question—but given Trump’s staunch anti-intellectualism, it’s not really surprising that he’s never bothered to notice. “Donald Trump has always acted in the moment, with little regard for the past…” wrote Marc Fisher in the Washington Post a day after the firing of FBI Director James Comey. But the Civil War, it seems, is an endless trauma to American democracy. As the republic reconsiders it again and again, it continues to mirror our understanding of the country we currently live in.
Perhaps, as Jon Meacham suggested in TIME after the president’s remarks, Trump was simply looking for himself in history—a plausible theory given the president’s perennially self-centered worldview. But by overlooking the war’s relevance and refusing to acknowledge slavery’s role in its birth, the president wasn’t merely sidestepping the issue; he was using tactics similar to those employed by “Lost Cause” revisionists and Confederate holdouts for generations, in which the cause of the war is questioned, reimagined, or willfully forgotten.
Our current decade marks the 150th anniversary of the war. Biographies, histories, and reconsiderations have come in measured steps and harsh reckonings—and discussions of memory, cause, conflict, reparation, and reconciliation have made it clear this war must continue to be discussed.
Conflicts rarely have only one cause, just as more than one thing can be true at a time. As Tony Horwitz wrote in The Atlantic in 2013 on the anniversary of the war’s start, slavery may not even have been central to Northerners’ experience of the Civil War. It was a kind of midwife, though, a stage on which a nation barely a century old played out its conflicts over sovereignty, autonomy, and national identity. Slavery as an institution concerned itself with just those questions. It used the bodies and labor of people stolen from their homes, excluded from equal society, and refused a personal identity.
In the summer of 2015, after Nikki Haley, then governor of South Carolina, announced the removal of the Confederate flag from the state capitol, Ta-Nehisi Coates collected the words of Confederate leaders who stated clearly that slavery was central to the identity of Southern states, which viewed it not just as an inalienable economic asset but as the very basis of white equality. The existence of slaves meant that white men could sidestep an industrialized servitude of their own; the institution’s proponents freely admitted that it upheld and enabled their quality of life.
Once slavery was abolished, the certain supremacy of Southern white men was threatened and the institutions it propped up were no longer guaranteed. The Confederate cause went from a vaunted reason to fight to a heroic struggle that had been snatched from its champions, spawning Lost Cause revisionist rhetoric that centered the white Confederate experience. And as soon as the war ended, another one began, this one concerned with textbooks, memorials, and the “official” historical narrative.
Revisionists knew what they had lost. They knew that it would do them no favors to admit they had fought and lost a war over the right to oppress others. And so they turned toward telling their own story through the lens of states’ rights, a perspective that made room for the Confederacy to reintegrate into the union and still maintain face.
Trump’s no-big-dealism is a more plausibly deniable form of that same beast. Downplaying slavery, whether in textbooks that omit it or wide-eyed comments that ignore its existence, calls 150 years of historical reckoning into question without saying a word. It invites people to start from square one — sidestepping, perhaps, the abundance of historical evidence and analysis that already exists.
If Civil War history is a graveyard, it’s one still strewn with fresh graves. It will haunt us until we face it down collectively, reconciling its truths with the world we have constructed around its gates. The president is not the first person who’d rather avert his eyes than look inside — but though Trump whistles blithely by, the cemetery doesn’t cease to exist.