
This Is How a Woman Is Erased From Her Job

Photograph by Kate Joyce

A.N. Devers | Longreads | December 2017 | 26 minutes (6,577 words)

This is a story about a woman who was erased from her job as the editor of the most famous literary magazine in America.

In 2011, the New York Times ran Julie Bosman’s energetic and gregarious profile of Lorin Stein, the latest head editor of the famous literary magazine The Paris Review — a position of which she declared, “Bacchanalian nights are practically inscribed in the job description.” The profile portrayed Stein as an intellectual bon vivant who loved parties, party-boy banter, and debating literature as if it were the most important thing in the world.

We know now that Stein, by his own admission, abused his power with women writers and staff of the Paris Review. He has resigned from the literary magazine and from his editor-at-large position at Farrar, Straus and Giroux in response to the Paris Review board’s investigation into sexual harassment allegations and his conduct. We also know, by his own admission, that he did not treat literature as the most important thing in the world.

Stein himself admitted it in a cringeworthy 2013 online feature from Refinery29 focused not only on the magazine’s debauched parties but also on the interior decor of the Paris Review’s offices and the fashion choices of the staffers, who were nearly all women. “It’s always been two things at once,” he says about the Review. “On the one hand, it’s a hyper-sophisticated, modernist, avant-garde magazine. On the other hand, it’s sort of a destination party.”

Between this and Bosman’s piece, we now know, even without the details of the accusations reported in the Times, or the far worse allegations listed in the “Shitty Media Men” list, that these were glaringly honest portrayals of Stein’s priorities at the helm of the Paris Review. Unfortunately.

Also unfortunate was the error in Bosman’s piece naming Stein as the third editor to “hold the title in the magazine’s 58-year history, and the second to follow George Plimpton, himself a legendary New York social figure.” Stein was actually the fourth. Brigid Hughes, the editor who succeeded George Plimpton, had been inexplicably left out of the profile. She was also not mentioned in the piece announcing that Stein would succeed Philip Gourevitch; although there was no factual error there, she was simply ignored.

Read more…

Living Differently: How the Feminist Utopia Is Something You Have to Be Doing Now

Cover of the program for the National American Woman Suffrage Association procession. (Getty Images)

Lynne Segal | Verso | November 2017 | 32 minutes (8,100 words)

The following is an excerpt from Radical Happiness: Moments of Collective Joy, by Lynne Segal (Verso, November 2017). This essay is recommended by Longreads contributing editor Dana Snitzky.

* * *

The utopian novel had become one of the most effective means of frightening people off it.

It is sometimes said that the twentieth century began with utopian dreaming and ended with nostalgia, as those alternative futures once envisioned seemed by then almost entirely discredited. However, it was never quite so straightforward. The challenge to envisage how to live differently, in ways that seem better than the present, never entirely disappears.

The most prominent American utopian studies scholar, Lyman Tower Sargent, notes that dystopian scenarios increasingly dominated the speculative literary form as the twentieth century progressed. In the UK, the equally eminent utopian studies scholar Ruth Levitas concurs, pointing out, for instance, that as sociology became institutionalized in the academy, it became ‘consistently hostile’ to any utopian content.

What stands out in speculative fantasies of the future arising towards the end of the twentieth century are their darkly dystopic leanings, whether in books, cinema, comics or elsewhere. The best known would include the mass surveillance depicted in the Russian author Yevgeny Zamyatin’s satirical novel We (1921).

Set in the future, it describes a scientifically managed totalitarian state, known as One State, governed by logic and reason, where people live in glass buildings, march in step, and are known by their numbers. In England, Aldous Huxley titled his dystopic science fiction Brave New World (1932); there, again, all individuality has been conditioned out in the pursuit of happiness. Bleaker still was George Orwell’s terrifyingly totalitarian 1984 (1949): ‘If you want a picture of the future,’ Orwell wrote, ‘imagine a boot stamping on a human face – forever.’

These imaginings serve primarily as warnings against possible futures, and are often read, as with Zamyatin and Orwell, as condemnations of Soviet society. The happiness expressed in Huxley’s ‘utopic’ universe depicts a deformed or sinister version of the destination where all utopias supposedly end up: totalitarian regimes in which free will is crushed. As the Marxist political scientist Bertell Ollman later noted: ‘From a means of winning people over to the ideal of socialism, the utopian novel had become one of the most effective means of frightening people off it.’

Post-1945, public intellectuals for the most part broadcast the view that democracy and utopic thinking were opposed, the latter declared both impossible and dangerous. The influential émigré and British philosopher of science Karl Popper argued in his classic essay ‘Utopia and Violence’ (1947) that while ‘Utopia’ may look desirable, all too desirable, it was in practice a ‘dangerous and pernicious’ idea, one that is ‘self-defeating’ and ‘leads to violence’. There is no way of deciding rationally between competing utopian ideals, he suggested, since we cannot (contra Marxism) scientifically predict the future, which means our statements are not open to falsification and hence fail his test for any sort of reliability.

Indeed, accusations of ‘totalitarian’ thinking were the chief weapon of the Cold War, used by Western propaganda to see off any talk of communism. In the USA it was employed to undermine any left or labour movement affiliations, as through the fear and financial ruin inflicted upon hundreds of Americans hauled before the House Un-American Activities Committee during the McCarthy era of the 1950s – over half of them Jewish Americans.

Read more…

Longreads Best of 2017: All of Our No. 1 Story Picks

All through December, we’ll be featuring Longreads’ Best of 2017. Here’s a list of every story that was chosen as No. 1 in our weekly Top 5 email.

If you like these, you can sign up to receive our weekly email every Friday. Read more…

Maybe Your House Can Be “Most Congenial”

An English Heritage plaque at Hampton Court Palace Gardens. Photo by Elliott Brown via Flickr (CC BY-ND-SA 2.0)

In an essay at White Noise, Richard Wallace considers his chances at being memorialized with one of the blue English Heritage plaques that dot historic homes in London’s (mostly well-heeled) boroughs:

I mostly think money, power and status are chimeras, eliding the serious parts of the human project… Then I periodically remember those English Heritage blue plaques that go on the walls of noteworthy dwellings, and I think: no. Fuck goodness and principle. I want to get so famous they give my house a medal.

Lack of marketable skills aside, an informal analysis of plaque recipients reveals the real predictor of plaques: class.

There’s a distinct sense that a certain type of people are predisposed to plaque-worthiness, and the reason is probably what class-progressives already know: that it’s so much easier to get recognised for your achievements if you get a good start in life. This shouldn’t diminish the accomplishment of the great; nor should it mollify less affluent mediocrities. But when we look at these plaques, we are forced to remember that English history is uniquely bound to inequality, to people ascending the apex of the world on a staircase of hunched shoulders. Repeat, repeat: David Cameron and his Bullingdon brothers, Theresa May and her fields of wheat. Blue Plaque England is not a place where we can all live. Kensington’s too small for everyone. But as unfair as it is, English Heritage plaques merely record history; nobody can argue that class division is not British. The writing is on the wall.

Read the essay

Derivative Sport: The Journalistic Legacy of David Foster Wallace

David Foster Wallace in New York City's East Village, circa 2002. (Janette Beckman/Redferns)

By Josh Roiland

Longreads | December 2017 | 32 minutes (8,200 words)

At a hip Manhattan book launch for John Jeremiah Sullivan’s 2011 essay collection Pulphead, David Rees, the event’s emcee, asked the two-time National Magazine Award winner, “So John…are you the next David Foster Wallace?” The exchange is startling for its absurdity, and Sullivan shakes his head in disbelief before finally answering, “No, that’s—I’m embarrassed by that.” But the comparison has attached itself to Sullivan and a host of other young literary journalists whom critics have noted bear resemblance to Wallace in style, subject matter, and voice.

When Leslie Jamison published The Empathy Exams, her 2014 collection of essays and journalism, a Slate review said “her writing often recalls the work of David Foster Wallace.” Similarly, when Michelle Orange’s This Is Running for Your Life appeared a year earlier, a review in the L.A. Review of Books proclaimed: “If Joan Didion and David Foster Wallace had a love child, I thought, Michelle Orange would be it.”

Wallace was, himself, a three-time finalist for the National Magazine Award, winning once, in 2001; yet he compulsively identified himself as “not a journalist” both in his interactions with sources and reflexively as a character in his own stories. Nonetheless, he casts a long shadow in the world of literary journalism—a genre of nonfiction writing that adheres to all the reportorial and truth-telling covenants of traditional journalism, while employing rhetorical and storytelling techniques more commonly associated with fiction. To give better shape to that penumbra of influence, I spoke with Sullivan, Jamison, and Orange, along with Maria Bustillos, Jeff Sharlet, Joel Lovell, and Colin Harrison about Wallace’s impact on today’s narrative nonfiction writers. They spoke about comparisons to Wallace, what they love (and hate) about his work, what it was like to edit him, their favorite stories, posthumous controversies, and his influence and legacy.

Joel Lovell worked with Wallace on only one brief essay. Despite that singular experience, Lovell’s editorial time at Harper’s and elsewhere in the 1990s and 2000s put him in a great position to witness Wallace’s rising status in the world of magazine journalism. He was unequivocal when I asked him which nonfiction writer today most reminds him of Wallace.

Joel Lovell: The clear descendant is John Jeremiah Sullivan, of course. For all sorts of reasons (the ability to move authoritatively between high and low culture and diction; the freakishly perceptive humor on the page) but mostly just because there’s no one else writing narrative nonfiction or essays right now whose brain is so flexible and powerful, and whose brainpower is so evident, sentence by sentence, in the way that Wallace’s was. No one who’s read so widely and deeply and can therefore “read” American culture (literature, television, music) so incisively. No one who can make language come alive in quite the same way. He’s an undeniable linguistic genius, like Dave, who happens to enjoy exercising that genius through magazine journalism. Read more…

When to (Not) Have Kids

An employee of Planned Parenthood holds a sign about birth control to be displayed on New York City buses, 1967. (H. William Tetlow/Fox Photos/Getty Images)

For a variety of reasons, I don’t have kids. As a woman of a certain age, I’ve been conditioned to believe I must qualify that statement by assuring you it’s not that I’m some kid hater, or that I don’t think babies are cute. They are! (Okay, I also find them to be kind of disgusting.) But among my many reasons for not procreating is that kids grow up to be people, and life for most people on this overcrowded, overheated planet is hard, and getting harder.

Even before Donald Trump took office, I had often wondered: with terrorism, war, and genocide, with climate change rendering Earth increasingly less habitable, how do people feel optimistic enough about the future to bring new people into the world? Since the presidential election, the prospects for humanity seem only more dire. I’m hardly alone in this thinking; I can’t count how many times over the past year I’ve huddled among other non-breeders, wondering along with them in hushed tones, How on earth do people still want to have kids? I was surprised, at this bleak moment in American history, that I hadn’t seen any recent writing on the topic. Was it still too taboo to discuss not making babies, from any angle? Then this past week a few pieces caught my eye.

The one that spoke most directly to my doubts about perpetuating the human race, and its suffering, was “The Case for Not Being Born,” by Joshua Rothman at The New Yorker. Rothman interviews anti-natalist philosopher David Benatar, author of 2006’s Better Never to Have Been: The Harm of Coming into Existence, and more recently, The Human Predicament: A Candid Guide to Life’s Biggest Questions. Rothman notes that Benatar makes no bones about his pessimism as it relates to humanity.

People, in short, say that life is good. Benatar believes that they are mistaken. “The quality of human life is, contrary to what many people think, actually quite appalling,” he writes, in “The Human Predicament.” He provides an escalating list of woes, designed to prove that even the lives of happy people are worse than they think. We’re almost always hungry or thirsty, he writes; when we’re not, we must go to the bathroom. We often experience “thermal discomfort”—we are too hot or too cold—or are tired and unable to nap. We suffer from itches, allergies, and colds, menstrual pains or hot flashes. Life is a procession of “frustrations and irritations”—waiting in traffic, standing in line, filling out forms. Forced to work, we often find our jobs exhausting; even “those who enjoy their work may have professional aspirations that remain unfulfilled.” Many lonely people remain single, while those who marry fight and divorce. “People want to be, look, and feel younger, and yet they age relentlessly. They have high hopes for their children and these are often thwarted when, for example, the children prove to be a disappointment in some way or other. When those close to us suffer, we suffer at the sight of it. When they die, we are bereft.”

While this isn’t how I always look at life, I believe Benatar makes some good points. (Not to mention I’ve endured three of the above-mentioned hot flashes while writing this, and one’s optimism does tend to dip in those estrogen-depleted moments.)

Rothman’s piece reminded me of an essay we published here on Longreads a couple of years ago, “The Answer Is Never,” by Sabine Heinlein. Like me, Heinlein often finds herself having to defend her choice to be childless: “One of the many differences between my husband and me is that he has never been forced to justify why he doesn’t want to have children. I, on the other hand, had to prepare my reasons from an early age.” She keeps a laundry list of reasons handy:

Over the years I tried out various, indisputable explanations: The world is bursting at the seams and there is little hope for the environment. According to the World Wildlife Fund, the Earth has lost half of its fauna in the last 40 years alone. The atmosphere is heating up due to greenhouse gases, and we are running out of resources at an alarming speed. Considering these facts, you don’t need an excuse not to have children, you need an excuse to have children! When I mention these statistics to people, they just nod. It’s as if their urge to procreate overrides their knowledge.

Is there any knowledge forbidding enough that it could potentially override such a primordial urge? In a devastating essay at New York magazine, “Every Parent Wants to Protect Their Child. I Never Got the Chance,” Jen Gann attests that there is. Gann writes about raising a son who suffers from cystic fibrosis, an incurable disease that will likely lead to his early death. The midwife practice neglected to warn her that she and her husband were carriers, and Gann writes that she would have chosen to terminate the pregnancy if they had.

The summer after Dudley was born, my sister-in-law came to visit; we were talking in the kitchen while he slept in the other room. “But,” she said, trying to figure out what it would mean to sue over a disease that can’t be prevented or fixed, “if you had known — ” I interrupted her, wanting to rush ahead but promptly bursting into tears when I said it: “There would be no Dudley.” I remember the look that crossed her face, how she nodded slowly and said, twice, “That’s a lot.”

What does it mean to fight for someone when what you’re fighting for is a missed chance at that person’s not existing?

The more I discuss the abortion I didn’t have, the easier that part gets to say aloud: I would have ended the pregnancy. I would have terminated. I would have had an abortion. That’s firmly in the past, and it is how I would have rearranged my actions, given all the information. It’s moving a piece of furniture from one place to another before anything can go wrong, the way we got rid of our wobbly side tables once Dudley learned to walk.

Finally, an essay that took me by surprise was “To Give a Name to It,” by Navneet Alang, at Hazlitt. Alang writes about a name that lingers in his mind: Tasneen, a name he had come up with for a child when he was in a relationship years ago, before the relationship ended, childlessly. It reminded me of the names I long ago came up with for children I might have had — Max and Chloe, after my paternal grandfather and maternal grandmother — during my first marriage, long before I learned I couldn’t have kids. This was actually good news, information that allowed me, finally, to feel permitted to override my conditioning and recognize my lack of desire for children, which was a tremendous relief.

Reading Alang’s essay, I realized that although I never brought those two people into the world, I had conceived of them in my mind. And somehow, in some small way, they still live there — two amorphous representatives of a thing called possibility.

A collection of baby names is like a taxonomy of hope, a kind of catechism for future lives scattered over the horizon. Yes, those lists are about the dream of a child to come, but for so many they are about repairing some wound, retrieving what has been lost to the years. All the same, there were certain conversations I could have with friends or the love of my life, and certain ones with family, and somehow they never quite met in the same way, or arrived at the same point. There is a difference between the impulse to name a child after a flapper from the Twenties, or search however futilely for some moniker that will repair historical trauma. Journeys were taken — across newly developed borders, off West in search of a better life, or to a new city for the next phase of a career — and some things have been rent that now cannot quite be stitched back together. One can only ever point one’s gaze toward the future, and project into that unfinished space a hope — that some future child will come and weave in words the thing that will, finally, suture the wound shut. One is forever left with ghosts: a yearning for a mythical wholeness that has slipped irretrievably behind the veil of history.

Yes, I know those ghosts, but not the yearning. I suppose I’m fortunate not to be bothered by either their absence in the physical realm or their vague presence somewhere deep in the recesses of my consciousness. Fortunate to no longer care what my lack of yearning might make people think of me.

New York Radical Women and the Limits of Second Wave Feminism

New York Radical Women protest the Miss America Pageant on the boardwalk at Atlantic City, 1969. (Santi Visalli Inc./Archive Photos/Getty Images)

At New York magazine, Joy Press has compiled an oral history of New York Radical Women (NYRW), a collective that existed from 1967 to 1969 and played a large role in defining second wave feminism in the United States. Its founders were generally younger and more radical than the women of the National Organization for Women (NOW), who’d come together in 1966 to address specific legislative failures in Washington, DC. NYRW focused more on elements of the culture that held women back.

The theatrics of the group’s organizing have been seared into the public’s imagination. In 1968, they protested the Miss America pageant by interrupting its telecast, crowning a live sheep on Atlantic City’s boardwalk, and throwing objects symbolizing female oppression into a “freedom trash can.” The media called them “bra-burners” for this spectacle, and though nothing caught fire that day, the myth endured.

Along with their image-making, NYRW’s intellectual work, in the form of speeches, essays, pamphlets, and books, laid the foundation for women’s studies as an academic discipline. Press explains:

Read more…

The Joys and Sorrows of Watching My Own Birth

JoKMedia / Getty

Shelby Vittek | Longreads | December 2017 | 13 minutes (3,315 words)

 

It’s a hot August night in 1991 at the Greater Baltimore Medical Center, and the delivery room is filled with bright lights. A film crew is documenting a woman giving birth. After almost 12 hours of active labor, it’s time for her to really push.

A few anxious rounds of counting to 10 and many deep breaths later, the doctor says, “Ooooh there you go, lots of hair.”

“That’s it, the baby’s coming!” the red-haired nurse says with excitement.

That’s when I enter the picture, with a head full of red hair of my own.

* * *

I know this scene well. It’s my own birth. Not many people can say they’ve watched their own delivery, but I can.

In fact, I’ve watched myself be born more times than I should probably ever admit to. I’m doing it again tonight for the ninth time this week, sitting on the floor in my studio apartment with my eyes fixated on the television. The sight of my fiery red hair making its debut will never fail to amaze me.

The video of my birth in no way resembles your typical home video. It’s more like a documentary, with my parents and family, and then finally me, as its subjects. Every single reaction of theirs is recorded in the truest manner, and edited as well as early ’90s technology could allow. That’s because it was not shot by a proud father-to-be, but by a professional film crew. I was paid $300 to be born (the check went directly into my first college fund, I’ve been told), and the footage was used to make an educational video for other expecting parents to watch during Lamaze birthing classes. Hundreds, if not thousands, of other people have watched me be born, too.

Read more…

Assertiveness Training

Alex Milan Tracy / Sipa via AP Images

Susan Sheu | Longreads | December 2017 | 23 minutes (5,862 words)

In the early 1980s, my mother took a class at the local Wisconsin university’s student psychology center called “Assertiveness Training.” She was awakening belatedly to a version of the mind-expanding youth she had missed by marrying and dropping out of college at age 20 in 1967, during the Summer of Love. The class was taught by Dr. B, who told the students to use “I” statements to ask for what they wanted in plain terms during work and family interactions. (“I am unhappy that you said that to me. I feel that I am not heard when I speak to you.”) The idea was to learn to be assertive but not aggressive, to stop being a silently suffering martyr or someone who holds in all their anger and resentment until it boils over into inappropriate and ineffective rage or self-destructive behavior. It goes without saying that the class was all women. As she immersed herself in college again, my mother began to tell me that when I grew up, I could be anything I wanted — a doctor, a lawyer, a scientist. Even though the Equal Rights Amendment had not been ratified, she wanted me to believe that my future was up to me. Perhaps that was one reason she took Assertiveness Training, to be the kind of mother who raised a daughter who wouldn’t need a class like that.

My grandmother was the model of someone who regularly displayed inappropriate anger, someone my mom was trying to avoid becoming. My grandma Violet had once been docile, and my mom believed that she made the rest of us pay for that false submissiveness for the rest of her life. The short version of my grandmother’s story is that she didn’t marry the man she was in love with because he was Catholic and she was Protestant (this was Nebraska, circa 1928); she didn’t attend college despite receiving a debate scholarship because her mother feigned illness to keep her youngest child at home; and she tried to be a good wife in a marriage with a decent, practical man with whom she was not in love. She ran my grandpa’s restaurant while he was serving in World War II, and when he returned, she no longer had any day-to-day responsibilities in the business operations.

By the time I knew her, my grandmother was smoking, alternating between Camels and Newports, drinking gin and, if she was feeling moderate, Mogen David wine (“The Jews” drank it. And Sammy Davis, Jr., “that talented Negro,” was a Jew. It had a screw top. And it was sweet.). She told off anyone who stood in her way, and for decades after her death, my mother made me pretend she was still alive, because it was the memory of my grandma’s fiery temper more than the restraining order that kept my father away. My grandma also took Valium, prescribed by the psychiatrist she began seeing shortly before her death in 1978. I was 9 when she died, but I already knew that her outspokenness and self-medication were a great source of shame for my mom and grandpa.

I’ve since come to understand that my grandma had the appropriate response to her circumstances.

Read more…

When Newspapers Cover the Private Lives of Nazis

Adolf Hitler on the patio of the Berghof wearing civilian clothes around 1936. (Imagno/Getty Images)

By now you’ve likely read Richard Fausset’s troubling New York Times profile of a “white nationalist and fascist” that tries to normalize and sympathize with its subject. You’ve also likely read the countless follow-ups damning not only Fausset’s article but also the Times’ tepid and inept response.

The profile attempted to let ordinary details speak for themselves, opening with a description of a wedding registry: “On their list was a muffin pan, a four-drawer dresser and a pineapple slicer…Weddings are hard enough to plan for when your fiancé is not an avowed white nationalist.” But these ordinary details don’t contain meaning; they merely surround it. As Josephine Livingstone of The New Republic explains,

writers who simply represent (rather than report on) extremists leave rhetorical spaces open for Nazi ideology to flood in. You cannot let a Nazi hang himself, because he is the one left holding the rhetorical rope.

Fausset’s article wasn’t the Times’ first attempt to transform racism into a personality quirk. From 1933, when Adolf Hitler was appointed chancellor of Germany, to his 1939 invasion of Poland, there was a significant movement, in the United States and worldwide, to portray Hitler as a misunderstood genius whose everyday likability could better connect him with working-class Germans and lift the country from its post-war depression.

Magazines and newspapers like the Times of London, The New York Times, The Saturday Review (“Hitler at Home”), and even the American Kennel Gazette (“Hitler Says His Dogs are Real Friends”) were more interested in Hitler’s interior design sensibility, his gustatory preferences, and his love of German Shepherds than in his politics. In 1936, Vogue toured Hitler’s chalet as part of a package showcasing the interior design of the homes of foreign rulers. (Benito Mussolini’s villa was also included.) This coverage successfully peddled themes of austerity, industriousness, and single-minded drive to a public eager to believe in Germany’s rebirth.

In her 2015 book Hitler at Home, Despina Stratigakos, a professor of architecture and history at the University at Buffalo, catalogued numerous attempts to normalize the dictator, starting with the publication of The Hitler that Nobody Knows, a 1932 photo album that doubled as a behind-the-scenes peek into Hitler’s private life. With more than a hundred photographs taken by Hitler’s personal photographer, the book — which sold 400,000-plus copies by 1942 — was meant to serve as a beacon proclaiming Hitler the leader of the new Germany. But Stratigakos stresses that the effect was more insidious.

Until the turnabout in 1932, National Socialist publicists had diverted attention away from or suppressed stories about their leader’s private life. Yet even as they continued to fight reports that could harm Hitler’s reputation, the Nazis began to construct for public consumption their own version of the private individual. The image of “Hitler as private man” would now be reconfigured from a liability into an asset…Bildung and self-improvement, together with self-discipline, a strong work ethic, and modesty, formed the core moral values of the German middle classes. The components of the “good” Hitler were thus artfully assembled with an eye to courting this constituency of voters and persuading them to abandon their allegiance to [war hero and political opponent Paul von] Hindenburg.

Even the New York Times wasn’t exempt from indulging in Hitler’s spin. Laurel Leff, a professor of journalism at Northeastern University, published Buried by the Times in 2005, examining the ways the Times either ignored or inadequately covered the Holocaust, partially due to a distaste among the editors for Zionism. In October 1935, the Times magazine included a fawning profile of Hitler as an architect, featuring his remodel of a small Bavarian cottage and its transformation into the Berghof fortress, which was shown completed on the cover of a May 1937 issue.

But perhaps the strangest Times article was “Herr Hitler at Home in the Clouds,” written by Hedwig Mauer Simpson, the wife of Stanley Simpson, a British journalist and Munich-based correspondent for the New York Times and the Times of London. (She was a frequent contributor to The Associated Press and The Daily Mail; he would be the first to report on the Dachau concentration camp, in a piece ultimately turned down by the Times of London.) A journalistic power couple in Munich, the Simpsons were among the first reporters to have early access to Hitler, and she was known for her ability to file several stories at once under intense pressure.

In the article, Simpson rehashes worn tropes about Hitler’s vegetarianism, the long walks he enjoyed with his Alsatian dogs, and his love of the German people. The tick-tock of his daily routine is described down to the minute. Breakfast is at 9 a.m., lunch is served by “white uniformed butlers,” and dinner is promptly at 8 p.m., with the ladies of the Berghof in evening dress and Hitler in English tweeds. In a rare step back from the festivities, Simpson writes that the setting contains “all the elements of exacting bureaucracy and secret-police efficiency.”

The Times article was published on August 20, 1939, 11 days before Hitler’s invasion of Poland. Simpson would take one of the last peacetime trains out of Munich to London, and it appears she gave up writing following her departure from Germany. There is nothing in the article that suggests the chancellor, who “makes no secret of being fond of chocolate,” has anything on his mind except the promise of an afternoon nap. Simpson clearly feels pampered and privileged to be in his presence. Whatever she felt on that last train out of Germany isn’t recorded here.

Longreads’ Catherine Cusick recently discussed why articles like Fausset’s and Simpson’s are dangerous: “Reporters and editors committed to covering this movement may not be able to feel their own hearts beating faster out of fear.”

Ordinary details can furnish a room, they can set a table, they can fill the time between hushed meetings of planned genocide or the quiet tapping at a computer, spreading hateful slurs to thousands of followers. If a writer can’t feel that fear, can’t show those feelings on a page, then all the reader is left with is Hitler at home.