Author Archives

Aaron Gilbreath
Aaron Gilbreath has written essays and articles for Harper's, The New York Times, Kenyon Review, The Dublin Review, Brick, Paris Review, The Threepenny Review, and Saveur. He's the author of This Is: Essays on Jazz, the personal essay Everything We Don't Know, and the forthcoming book Through the San Joaquin Valley: The Heart of California. @AaronGilbreath

Coronavirus Could End Trump’s Chance at Reelection, But Things Are Too Terrifying Right Now To Feel Hopeful

AP Photo/Ahn Young-joon

At The Atlantic, long-time Republican Peter Wehner writes what many of us hope is true: that the coronavirus crisis has exposed how incapable a leader Trump is, and that this crisis will end his presidency. Granted, a lot of us knew how incapable Trump was before the pandemic. He is a despicable, morally bereft “human being.” And even if Wehner’s prediction turns out to be correct, a national emergency is too great a cataclysm to feel worth celebrating right now. Right now, a lot of us are locked inside our homes, self-quarantining, entertaining our kids while protecting ourselves and others. An end to Trump’s presidency is an optimistic outcome; let’s just hope enough of us survive, and our economy endures, to enjoy it. Nothing seems cheery now, here indoors among the stockpiled cereal, canned beans, and coffee, during this isolated, anxious time when so many of us are wondering if our jobs will continue, reading too many coronavirus articles and tweets and updates, and wondering which of our elderly family members the virus will kill. Right now, we can definitely use something to feel hopeful about. The full pantries no longer provide much sense of relief. With the spreading pandemic creeping closer to each of us, it no longer seems like we have much time to wait for Trump’s reign to end. And yet, locked inside our homes, it also seems that time is all we have, one hour ticking slowly by after another. But back to Wehner.

Most of us know Trump’s moral and presidential failings: his lying, cheating, racism, misogyny, and unfortunate ability to get away with behavior that would have ruined other presidents, let alone small-town mayors. Wehner makes a strong case, though: how Trump ignored early warnings about COVID-19; how he circulated misinformation, blocked testing efforts, disbanded the NSC pandemic unit, and kept shaking people’s hands despite warnings; and how he is clearly incapable of comforting or protecting the public in any way. Now that we’re in crisis, Wehner believes that Trump can no longer hide his errors and presidential limitations, and that it will cost him the election. “Day after day after day he brazenly denied reality, in an effort to blunt the economic and political harm he faced,” writes Wehner. “But Trump is in the process of discovering that he can’t spin or tweet his way out of a pandemic. There is no one who can do to the coronavirus what Attorney General William Barr did to the Mueller report: lie about it and get away with it.”

The coronavirus is quite likely to be the Trump presidency’s inflection point, when everything changed, when the bluster and ignorance and shallowness of America’s 45th president became undeniable, an empirical reality, as indisputable as the laws of science or a mathematical equation.

It has taken a good deal longer than it should have, but Americans have now seen the con man behind the curtain. The president, enraged for having been unmasked, will become more desperate, more embittered, more unhinged. He knows nothing will be the same. His administration may stagger on, but it will be only a hollow shell. The Trump presidency is over.

We will see. But I am grateful for Wehner’s gift of hope. What I want more than anything is for my family, my friends, my neighbors, and people around the world, including you reading this, to survive, so that we can not only emerge from this and thrive but also, as Dan Rather recently put it, “follow a path of renewal and improvement of how we structure our society, its economy, its health, its social obligations, and its politics.”

Read the story

The Zoo That Divided a Town

AP Photo/Kevin Anderson

In April 2020, Mark and Tammy Drysdale moved to the 2,500-person town of Grand Bend, Ontario, and bought a shuttered roadside zoo. Then they started filling the property with lions, goats, lemurs, and various exotic animals. Unfortunately, their property was no longer zoned for zoos; it was now zoned residential. The outspoken owner, Mark, claimed his lions were basically domesticated cats, and local bylaws allowed domesticated animals. Certain neighbors said otherwise, and they worked to close down the zoo. For the Canadian quarterly magazine Maisonneuve, Kieran Delamont writes about the town’s struggle with the zoo, and the larger social and economic forces this resistance represents. Delamont sees the zoo as a barometer of town health, a way to measure the distance between the rich and poor, the past and the future, and the thin threads that often bind communities like Grand Bend, which has only one intersection.

None of it is enough, for Drysdale at least, so he keeps adding new animals to the mix like a roughshod Noah stocking his ark. In September, a baby zebra is born. In October, another lion cub arrives. These kinds of home-brew zoos have existed in Ontario for at least a hundred years—a network that, in the vacuum created by a lack of regulations, sprang up alongside the growing highway network. With a little ingenuity, and some cash on hand, these animals are not as hard to acquire as people imagine. Twice a year, an “Odd and Unusual” animal auction is held somewhere in southern Ontario, functioning like a trading post. Exotic cats are still a somewhat prized auction item, but someone could expect to see lynx, lemurs, llamas, reptiles, even wolves, up for sale. This exotic animal community is a tradition of rural Ontario, and Drysdale is deeply entrenched in it.

But if there’s no space for Drysdale in today’s Grand Bend, it’s at least partly because today’s Grand Bend is different from that old Grand Bend. The town has always been a place filled with lake people—its own Ontario character type, comprising enthusiastic cottagers and the more grizzly beachfront locals. Lake people earnestly own painted Adirondack chairs, insist on idiosyncratic house rules to various card games and probably have at least one nautically decorated bathroom. Every year since I was a baby, we lake people show up in Grand Bend for May Two-Four weekend and leave as late as we can on Labour Day.

It’s always been a culture of the leisured middle class, catered to by the labour of teenagers at the ice cream stand, supplied by travelling salesmen of the flea market, entertained by the hospitality of people like Drysdale who opened little roadside businesses and simply let the tourists come to them. But that kind of economic rejuvenation, it seems, may no longer be the kind people want.

Read the story

A Design Aesthetic That Lets You Succeed In a World That Doesn’t Care If You Fail

Christoph Hardt/Geisler-Fotopres/picture-alliance/dpa/AP Images

If I remember correctly, the term “fern and brass” has been used to describe the hideously bland interior design of lookalike chain restaurants from the 1980s. It didn’t matter what city you were in, be it Spokane or Topeka; they all looked the same. I’m glad I was too young to have fully experienced that, but each generation has its aesthetic and its burdens. For The Cut, Molly Fischer writes a richly detailed analysis and history of a modern design style she calls the “millennial aesthetic,” in a story that has more personality, color, and life in it than any of the interiors she describes. One photo caption in Fischer’s story names the aesthetic’s hallmarks as “Motivational ad copy, soft colors, and photogenic domesticity.” Originality isn’t the point. Sameness has certain virtues, even if they’re clichés. So where did this aesthetic come from? How does it work, and will it and its pinkness ever go away? Our era has eaten the pink pill that sells us design as palliative medicine, something that, as Fischer puts it, “functions like a CBD seltzer.” Advertising, aspirational lifestyle branding, and consumption are all intertwined in this aesthetic.

If you simultaneously can’t afford any frills and can’t afford any failure, you end up with millennial design: crowd-pleasing, risk-averse, calling just enough attention to itself to make it clear that you tried. For a cohort reared to achieve and then released into an economy where achievement held no guarantees, the millennial aesthetic provides something that looks a little like bourgeois stability, at least. This is a style that makes basic success cheap and easy; it requires little in the way of special access, skills, or goods. It is style that can be borrowed, inhabited temporarily or virtually. At the very least, you can stay a few hours in a photogenic co-working venue. At the very least, Squarespace gives you the tools you need to build your own presentable online home.

Fischer explains the central role that the color pink plays, and its surprising popularity.

No account of the millennial aesthetic could fail to address pink: For the better part of a decade, millennial pink bedeviled anyone a color could bedevil. When Facebook rolled out a corporate rebrand last fall, the lead image in the press release showed the new logo — breezily spaced sans serif — in a muted shade somewhere on the ham-to-salmon spectrum. Samuel keeps wondering when people will get sick of the color, but they don’t; almost every client asks for pink. She thinks this is because it’s soothing. They want houses that remind them of vacations, suggest Mediterranean idylls.

“It kind of feels like a binky,” Deborah Needleman, the former editor of T, WSJ., and Domino, says of millennial interiors. (Boob-print pillows and bath mats are perhaps the most literal expression of a general tendency toward the comforts of babyhood.) Needleman sees not a trip to Greece but something more like childproofing. “It’s like it has no edge or sense of humor or sense of mystery,” she says. “There’s no weirdness. There’s nothing that clashes. It is very controlled.”

It’s amusing to note that the PRADA ad in this article’s sidebar turns the brand name into an acronym for “Play Responsibly And Dress Authentically” — one of the motivational ad slogans that Fischer mentions. The ad’s background is 80% pink, and its design elements lack the authenticity its motto tries to sell.

Read the story

Some Inland California History Begins with an Orange

AP Photo/Damian Dovarganes

For Riverside native and author Susan Straight, citrus and camaraderie were once the ties that bound people in the part of southern California called the Inland Empire. This area includes the many cities east of greater Los Angeles and west of Palm Springs, in Riverside and San Bernardino counties. New arrivals used to plant lemons, tangerines, and oranges in their yards, as well as figs, persimmons, avocados, and loquats, and they shared their bounty with friends and neighbors. For California’s public broadcasting service KCET, Straight writes an evocative essay that mixes regional history with personal history and celebrates the way these imported fruits have shaped the social fabric and local economy. She has an 80-year-old apricot tree growing on her property. Even though this arid region isn’t known for its timber, Straight calls its planted gardens “non-native woods” and sees them as paradise, because they provided many people with what was truly a piece of the good life. “The groves are nearly gone now,” Straight writes, “housing tracts named for what they’ve erased.” But locals don’t give up these traditions.

Eliza Tibbets started the first two seedling navel orange trees. A statue of her was recently unveiled in downtown Riverside, and it seems a fitting time to remind ourselves of the woman who transformed California’s landscape, not just with daring but with generosity. (I still drive past the Parent Navel Orange Trees, at the corner of Arlington and Magnolia Avenues, every week.)

She was married three times, an abolitionist (her third husband, Mr. Tibbets, campaigned as a “Radical Republican” who tried integration in Virginia), a suffragist who tried to vote in 1871, a spiritualist who led séances in Riverside when she got here. But in 1873, she sent to Washington’s new Bureau of Agriculture for the first two seedling trees of a new variety of seedless oranges from Bahia, Brazil, and planted them in her yard in Riverside. She kept them alive with dishwater, shared the fruit and more cuttings, and changed the economy and the very look of Southern California. (Neither she, born in Cincinnati, or the seedlings, were natives.)

By 1886, entire towns like Rialto, Bloomington, Corona and Redlands were laid out around groves of Washington navel orange trees. Packing houses for Sunkist Growers and other cooperatives were built, the Santa Fe Railroad took boxcars full of fruit all over the nation, and oranges were shipped around the world. By 1895, Riverside had the highest per capita income in America, thanks to the citrus industry.

The faces of Southern California changed with citrus, too.  Chinese laborers, Italians and Mexicans and Japanese and African-American southerners, Dust Bowl refugees from Oklahoma and Texas and Colorado — all picked and packed and trucked oranges.  I grew up with their kids.

Read the story

The People We Love to Hate on Social Media

Marshall Ritzel via AP

If you’ve ever kept certain people visible in your social media feeds just because you loathed or envied them, or because you couldn’t tell the difference between envy and irritation, then Emily Flake’s New Yorker post is for you. In it, the talented cartoonist examines her unflattering insistence on following a certain artsy, nouveau-hippie family on Instagram that causes her constant side-eye. Flake is hilarious, and she’s as insightful in her drawings as she is in her writing. “There are so many ways to be a creep these days,” she says. “One of the easier ways is to follow people on social media toward whom you have feelings that are other than warm.” As she examines her pettiness, you might see yourself, as I have, in this snapshot of our cultural moment. But her attraction to this family is about a lot more than simple envy.

My contemplation of the life of this rustically hip family takes on the “Is it this or is it that?” quality of those trick drawings: Is this an old woman in a babushka or a young one in a hat? Are the choices the hip family makes arrogant or inspiring? Stupid or brave? Maybe they’re both, in the way that my drawing is both, simultaneously. My side-eye at their neo-pioneer lifestyle is accompanied by a thrum of envy for the freedom of their life (Who works? Is there a trust fund at play here, or are they just that good at living off the land?) and a desperate, shame-filled recognition of the disparity between their towering competence and my obvious lack thereof. Who would you want to link up with in the coming apocalypse? The hot, fit, loving family who knows how to build a house by hand, or the tubby middle-aged broad who can’t even drive stick? Exactly. My ability to provide wry commentary about my own cervix is an asset useful only in a pre-collapsed society.

Read the story

Why Do Seventh-Day Adventists Live Longer Than Most Americans?

Britta Pedersen/picture-alliance/dpa/AP Images

I was reheating some leftover cottage cheese loaf the other morning, savoring the phrase “cottage cheese loaf” as I anticipated its delicious, savory crunch, when I wondered if anyone had written a love letter to this or other classic Seventh-Day Adventist dishes.

My wife made this loaf. She grew up Seventh-Day Adventist and introduced me to what I call #LoafLife. Although her parents left the denomination by the time she was 14, much of its community-mindedness stayed with them, along with its food. A healthy diet and exercise are central Adventist tenets, because the group believes in a relationship between physical and spiritual health. This often means vegetarianism. My wife didn’t eat meat regularly until high school, and even after that, she’s always eaten it conservatively. The family’s love of vegetables and salads remains strong. They still make the veggies piled on chips called Adventist haystacks. They still make oatmeal-walnut patties. The cottage cheese loaf is a simple mixture of chopped onions, walnuts, parsley, salt, pepper, butter, and cottage cheese bound together with eggs and Wheaties for a nice wholesome texture.

To learn more about the ideas that produced so many wonderful meals for me, a non-practicing Jew, I did some sleuthing and found a few illuminating articles about the Seventh-Day Adventist diet. Howard Markel wrote a good short Smithsonian article entitled “The Secret Ingredient in Kellogg’s Corn Flakes Is Seventh-Day Adventism.” But my favorite is journalist Emily Esfahani Smith’s 2013 Atlantic piece “The Lovely Hill: Where People Live Longer and Happier.”

Smith focuses on Loma Linda, California, which has one of America’s largest Seventh-Day Adventist communities and, not surprisingly, is known for the health and longevity of its residents. For the Biblical origins of the sect’s dietary practices, Smith quotes Pastor Randy Roberts of Loma Linda University: “In Corinthians, Paul speaking of the human body says specifically, ‘you are the temple of the Holy spirit.’ Therefore, he says, whatever you do in your body, you do it to the honor, the glory and the praise of God.”

Interestingly, the diet closely resembles the Mediterranean diet. Smith includes some incredible findings about the benefits of eating nuts, avoiding fast food, and the role meat plays in health:

Adventist men who do not eat meat outlive American men by seven years. Adventist women who do not eat meat outlive American women by five years. Many Adventists do not eat meat, but even those that do outlive their peers thanks to the amount of vegetables, fruits, and other healthy foods they eat. Meat-eating Adventist men live 7.3 years longer while the women live 4.4 years longer than other Californians.

But the correlation between diet and health goes beyond the body, also impacting depression and a nurturing sense of positive well-being:

Ford and her team at Loma Linda University examined the eating patterns of over 9,000 healthy Seventh-Day Adventists in North America over a four-year period. How often did they eat fast food? Did they eat meat? What kinds of dairy products were they consuming? What about nuts? Desserts? Fish? They then examined their self-reported feelings of positive and negative emotions—how often did they feel inspired? Excited? Enthusiastic? Upset? Scared? Distressed?

The researchers found that those who eat like Greeks feel more inspired, alert, excited, active, inspired, determined, attentive, proud, and enthusiastic than those who consume a more typically American diet consisting of highly processed foods, soda, and sweets like cookies and doughnuts. People who eat foods associated with a Mediterranean diet also experienced less negative emotions like being afraid, nervous, upset, irritable, scared, hostile, and distressed. The more people ate those foods that are more typically American — specifically, red meat, sweets, and fast food — the less of these positive emotions they felt.

Smith describes a Loma Linda centenarian named Marge Jetton whose gusto is impossible not to envy, even if you’d rather not share her diet or schedule.

At 100 years old, Jetton, a former nurse, would wake up at 4.30 am each morning. After getting dressed and reading from the Bible, she would work out. When she completed her mile-long walk and 6-8 miles on the stationary bike, she had oatmeal for breakfast. For lunch, she would mix up some raw vegetables and fruit. Occasionally, she would splurge on a treat like waffles made from soy and garbanzo beans. That wasn’t all. The centenarian volunteered regularly, barreled around town in her Cadillac Seville, and pumped iron. She also tended to a garden that grew tomatoes, corn, and hydrangeas.

I’ve always known my wife would outlive me, and not just because I’m older and exercise less (meaning, almost never) but because vegetarian dishes are her comfort foods. Old habits are hard to break: In my family, comfort food is Oklahoma country food like biscuits and gravy, cream pie, and the Sonoran-style Mexican food we grew up on in southern Arizona. For my wife, comfort food is cottage cheese loaf, haystacks, and oatmeal-walnut patties. Although I’ve eaten pretty healthily since college, my time eating her family’s Adventist holdovers has only made me see how much room my lifestyle has for improvement. This particular morning, the loaf and the Atlantic article made me realize that, in midlife, I need to catch up with my wife’s enviable standards of self-care. I’ve been slacking during the last decade.

I was a vegetarian for three years in college, and a vegan for one, so my palate is primed for the Adventist nutty-loafy-patty menu. I shopped on Craigslist for a used stationary bike, I researched machines to make homemade soy milk, and I made a pact to eat less meat and way more tofu. She was like: Duh, I already do.

I always loved the loaf for its flavor, but now it’s a gateway to healthier habits that would likely please Seventh-Day Adventist co-founder Ellen G. White. And when my wife asks, “Want to make cottage cheese loaf this week?” I always say “Hell, yes.” No religious reference intended — I’m just a cursing heathen who wants to live a long life.

Read the story

I Have No Idea What You Corporate People Are Talking About

Ted S. Warren / AP Photo

Corporate lingo is all about obfuscation, group-think, and unnecessary work, not clarity. For New York Magazine, Molly Young examines corporate jargon like “futureproofing” and “level-setting” to try to understand where it came from, why corporate employees opt in (ha) to this group linguistic delusion, and what such gibberish does and does not do for people. Take, for example, the term “parallel-path,” which simply means to do two things at once. Office workers already did multiple things at once constantly. Why did anyone need a confusing term for something that was already clear? “It was,” Young says, “in its fakery and puffery and lack of a reason to exist, the perfect corporate neologism.” Young calls all such lingo “garbage language,” a term borrowed from Anna Wiener, author of the new tech-life memoir Uncanny Valley. “The meaningful threat of garbage language,” Young writes, “is that it confirms delusion as an asset in the workplace.”

Another thing this language has in common with garbage is that we can’t stop generating it. Garbage language isn’t unique to start-ups; it’s endemic to business itself, and the form it takes tends to reflect the operating economic metaphors of its day. A 1911 book by Frederick Winslow Taylor called The Principles of Scientific Management borrows its language from manufacturing; men, like machines, are useful for their output and productive capacity. The conglomeration of companies in the 1950s and ’60s required organizations to address alienated employees who felt faceless amid a sea of identical gray-suited toilers, and managers were encouraged to create a climate conducive to human growth and to focus on the self-actualization needs of their employees. In the 1980s, garbage language smelled strongly of Wall Street: leverage, stakeholder, value-add. The rise of big tech brought us computing and gaming metaphors: bandwidth, hack, the concept of double-clicking on something, the concept of talking off-line, the concept of leveling up.

One of the most influential business books of the 1990s was Clayton Christensen’s The Innovator’s Dilemma. Christensen is responsible for the popularity of the word disruptive. (The term has since been diluted and tortured, but his initial definition was narrow: Disruption happens when a small company, such as a start-up, targets a limited segment of an incumbent’s audience and then uses that foothold to attract a bigger segment, by which point it’s too late for the incumbent to catch up.) The metaphors in that book had a militaristic strain: Firms won or lost battles. Business units were killed. A disk drive was revolutionary. The market was a radar screen. The missilelike attack of the desktop computer wounded minicomputer-makers. Over the next decade and a half, the language fully migrated from combative to New Agey: “I am now a true believer in bringing our whole selves to work,” wrote Sheryl Sandberg in Lean In, urging readers to seek their truth and find personal fulfillment. In Delivering Happiness, Zappos CEO Tony Hsieh described making conscious choices and evolving organically. In The Lean Startup, Eric Ries pitched his method as a movement to unlock a vast storehouse of human potential. You can always track the assimilation of garbage language by its shedding of scare quotes; in 1911, “initiative” and “incentive” were still cloaked in speculative punctuation.

Read the story

Novelist Charles Portis Was a True Original

True Grit, poster, John Wayne, Kim Darby, Glen Campbell, 1969. (Photo by LMPC via Getty Images)

For many people, Charles Portis will forever be remembered as the author of the 1968 book that became the 1969 film adaptation with John Wayne as Rooster Cogburn and then the Coen Brothers’ 2010 version. True Grit is a masterpiece. I mean that. It’s a perfect book. I feel the same about his first novel, Norwood, a hilarious, weird road-trip story, and his third novel, The Dog of the South, is almost as good. I rarely say anything is perfect, but Portis’s first two novels strike me as completely satisfying, self-contained worlds, beyond improvement. I also rarely reread books, but each time I’ve reread these two, their facets only sparkle more brightly and reveal greater finesse. Portis published just five novels in his lifetime, though “only five” hardly seems the right way to put it. His legacy lies not in his total output but in his pages: novels dense with wit, a distinctive voice, and a warped comic vision of the world, their plots driven by bumbling protagonists on long journeys that reward readers with constant laughs and endless surprises.

Portis died on February 17, 2020, at age 86. For The New Yorker, writer Wells Tower examines the author’s literary achievements, paints a brief portrait of a person who revealed little about himself, and celebrates a writer he believes was not just a comic but a philosopher. Every Portis fan has favorite passages, but part of his legacy is a tone that Tower calls “a shrug of quiet amusement.” His privacy also shaped his legacy: Portis avoided publicity, dodged interviewers, and kept to himself. Tower writes:

It’s hard to know whether Portis’s work ushered much comfort into his own life. My sense is that he was lonely. I imagine he had a fair bit in common with Jimmy Burns, described in “Gringos” as a “hard worker,” “solitary as a snake,” and, yes, “punctual.” Portis never married and had no children. He never published another novel after “Gringos,” from 1991. The closest he gets to self-portraiture comes in his short memoir “Combinations of Jacksons,” the essay published in The Atlantic. Toward the essay’s close, the author spots an “apparition” of his future self in the form of a geezer idling his station wagon alongside Portis at a traffic light in Little Rock. He wore “the gloat of a miser,” Portis writes. “Stiff gray hairs straggled out of the little relief hole at the back of his cap. . . . While not an ornament of our race, neither was he, I thought, the most depraved member of the gang.”

In his vision of himself at the wheel of the phantom station wagon, Portis goes on to write what feel like fitting instructions for how we ought to cope with this great and overlooked writer’s exit from the scene: “I could see myself all too clearly in that old butterscotch Pontiac, roaring flat out across the Mexican desert and laying down a streamer of smoke like a crop duster, with a goatherd to note my passing and (I flatter myself) to watch me until I was utterly gone, over a distant hill, and only then would he turn again with his stick to the straying flock. So be it.”

After reading Norwood, I fell in love with his narrative voice and wanted to know more about the person who created it. Information was scant.

Portis started his writing life as a journalist, eventually working beside future novelist Tom Wolfe. By the time Portis published Norwood in 1966, he’d left the newsroom for what turned out to be forever. True Grit’s 1969 screen adaptation won John Wayne the only Oscar of his career and generated so much money – $14.25 million at the box office – that Portis could lead a simple, quiet life in Little Rock, Arkansas, writing and frequenting local watering holes, where he was just another regular who smoked cigarettes and wet the four corners of his napkins so they didn’t stick to the bottom of his beer glass and make him look like an idiot. That’s the kind of detail Portis would have included in his books had he not been living it.

His love of beer joints made him sound accessible, so I tried to contact him back in April 2010.

Before Portis’s nonfiction miscellany Escape Velocity was published, I dug up every piece of his short nonfiction and fiction that I could in old issues of magazines like The Atlantic and Oxford American. They provided a biography, but they also generated more questions. I started piecing it all together in an essay about him and his work, where I tried to understand how his masterpieces existed in a biographical information vacuum, generating questions and speculation, what I called “a string of maybes.” His was just such a striking career turn: a lowly journalist sells his first novel to Hollywood and makes huge money, then takes more and more years to write each subsequent novel, before quitting publishing altogether. Whatever his feelings about this transition from journalism to fiction, he seemed to have shared none of them with his fellow reporters. As Tom Wolfe says in The New Journalism, “One day [Portis] suddenly quit as London correspondent for the Herald Tribune. That was generally regarded as a very choice job in the newspaper business. Portis quit cold one day, just like that, without a warning.” And, after writing his first two novels, Portis “actually went on to live out the fantasy,” Wolfe says. “Portis did it in a way that was so much like the way it happens in the dream, it was unbelievable. …He sold both books to the movies…He made a fortune…A fishing shack! In Arkansas! It was too goddamned perfect to be true, and yet there it was. Which is to say that the old dream, The Novel, has never died.”

Knowing Portis refused most interviews, I decided to increase my chances of a response by asking the most pressing question I had: why, after six years as a reporter, did he decide to try writing novels for a living? I was curious about what factors went into his decision to write fiction, what his hopes were, his career concerns or frustrations with reporting, and what effect, if any, that era of literary publishing (at the dawn of the “new journalism”) had on his thinking. The most detailed treatment of the subject appeared in a rare Q&A Portis gave to the University of Arkansas in 2001. In it, he makes his decision seem simple: “As I say, the Tribune people had always treated me very well, but I wanted to try my hand at fiction, so I gave notice and went home.” He just decided to try his hand and went? Just like that? No way, I thought, rereading that; nothing is that simple.

Three months later, the literary agency kindly sent me Portis’s response to my question. It read: “I simply wanted to try my hand at fiction, and if it hadn’t worked out I would have gone back to journalism.”

I laughed out loud reading that: “try my hand at fiction.” He’d used nearly the exact same phrase in that 2001 interview. It was the phrase I was trying to get away from by emailing him. Oh well. Like everything he wrote, even his one-line email amused me. His mystery remained intact.

Read the story

When It’s Time to Tell

Roman Krompolc (CTK via AP Images)

For The Rumpus, writer and book publisher Laura Stanfill movingly details the traumatic story of domestic abuse and male violence that she kept to herself for 25 years. A week-long writing retreat helped her to see that silence no longer protected her; talking openly did. At the retreat, she was surrounded by story, so she decided it was time, as she put it, to “add my voice to the chorus of women who have said, I survived.” The trouble started in college, when her boyfriend bought a pistol and quickly revealed who he really was.

Instead of confiding in a friend or alerting the college administration that my boyfriend shot at me, I moved into his dorm room. He wanted me where he could keep an eye on me. It seemed safest to follow his commands. To say, If that’s what you want. I covered the lodged bullet with putty, filling in the hole in the wood, wanting to hide the evidence. I stepped over that mismatched blotch on the way to and from class.

Through the rest of my college years, I hid from friends and classmates, inventing excuses when they extended invitations. I spent hours on the bathroom floor, sick to my stomach at the thought of appearing in public with my boyfriend. He might yell or humiliate or hurt me. He often became more dangerous if a stranger interfered. I wrote false, cheer-filled notes to old friends on other campuses. I lost weight, became bones and savage bile. The doctors couldn’t figure it out; I didn’t give them any context. My parents worried about me and paid the medical bills and never suspected the cause. Or, if they did, they didn’t confront me about him.

When I turned twenty-one, my boyfriend proposed at a fancy restaurant and called me stupid for not noticing the gold ring shining at the bottom of the champagne glass. He had gone to all this effort.

Read the story

Black America Unwittingly Provided the Soundtrack to Its Own Displacement

Smith Collection/Gado/Sipa USA (Sipa via AP Images)

While working from a coffee shop here in Portland, Oregon, I spotted a college student wearing a Nas t-shirt and reading Hanif Abdurraqib’s book Go Ahead In the Rain, about A Tribe Called Quest. (You can read an excerpt here.) Portland has been called the whitest city in America, and there was one person of color in this coffee shop. Shop staff frequently play R&B, Fugees, and beat tapes here, which keeps me coming back, but any longtime Portlander is aware of the way Black art frequently decorates our city’s white spaces, especially in neighborhoods where gentrification has ousted longtime Black residents. By chance, I was reading Tre Johnson‘s piece in Slate, “Heard but Not Seen.” Its subhead is “Black music in white spaces.” While visiting New Orleans, Johnson was disturbed by how the music that captures the Black American experience now plays in the kinds of white restaurants, coffee shops, and spaces where people of color are few, and where it embodies displacement.

A white friend said that Black culture is American culture, and that the two are, as a result, linked. True. And yet that’s what makes it all the more painful to find myself in mostly white spaces with their Black soundtracks, doing something intimate like eating with a friend, doing something public like shopping or working out—always in a place that’s using that music not only to create a vibe but a communal experience for their customers. The music’s been recycled for consumption, with little care for the context of this consumption. Embracing Black music is not the same as embracing Black people, after all, no matter how often our music is created with a specific gaze toward our experience. How many times, while our music plays, have one of us been dismissed, followed, or harassed in these spaces? What was playing when those two brothers were being kicked out of a Philadelphia Starbucks? On the loudspeakers and PA systems in stadiums, as hip-hop music blasts to keep the crowd hyped, and celebrate big plays, Black men and women tie on aprons and stand behind concession stands, walk the rows and aisles, sweep the floors—even as a nation denounces players’ rights to kneel in protest. It’s as if the music gets to stand in for us. Increasingly we’re in the background as our music is pushed to the fore.

My nana, Alice, and her best friend Ms. Sarah were two Black women among many who worked the assembly line at a General Motors factory back in Trenton, New Jersey. They wore their bodies down making cars that it would take them years to afford themselves, and I imagine them singing Tina, Aretha, the Supremes to get through hourslong shifts. How those anthems of Black homes, Black marriage, Black communities, Black love, Black sex, Black strength fell in lockstep with their lives! Now that’s all been replaced; the factories and homes and communities have gone away, often literally replaced by boutiques and upscale restaurants and Flywheels. Yet the music remains. As Tina, Aretha, and the Supremes have been replaced by Rihanna, Cardi, and Beyonce, so have the bodies. I once spent a summer as a high schooler working alongside Nana; now I’m an adult in the city, a Black man pedaling in the dark, alone with these rows of white bodies and Lizzo’s joyous, lonely voice.

Read the story