
True Roots

Daniel Berehulak/Getty Images

Ronnie Citron-Fink | True Roots | Island Press | June 2019 | 34 minutes (5,655 words)

 

How’d you do it? Are you doing that on purpose? Are you okay? Ever since I stopped coloring my silver hair, I’ve gotten a lot of questions. One of the most common during my hair transition was Why are you letting it go gray? While my roots didn’t ask permission before they stopped growing in dark brown, it was a complex mix of fear and determination that rearranged my beauty priorities. The question of why — why, after twenty-five years of using chemical dyes, I gave them up — is something I’ve thought about a lot.

My world began to shift four years ago. I was sitting in a meeting about toxics reform in Washington, DC, when an environmental scientist began to describe the buildup of chemicals in our bodies. As she rattled off a list of ingredients in personal care products — toluene, benzophenone, stearates, triclosan — my scalp started to tingle. “We’re just beginning to understand how these chemicals compromise long-term health,” she concluded.


High Expectations: LSD, T.C. Boyle’s Women, and Me

Illustration by Homestead

Christine Ro | Longreads | May 2019 | 16 minutes (4,208 words)

I’m sweaty, exhausted, and red-faced when I finally emerge from my final acid trip. My apartment is a mess of objects my friends and I have tried feeling, smelling, or otherwise experiencing: loose dry pasta, drinks of every kind, hairbrushes, blankets. My voice is hoarse from talking or shouting all night. I’ve had more emotional cycles in the past 12 hours than in the last several months combined.

What made me want to drop acid wasn’t a friend or a festival, but a book. Specifically, T.C. Boyle’s new novel Outside Looking In. The book has its problems, but one thing it gets right is the intensely social experience of LSD. Even taken alone, even as a tool for introspective reflection, it rejigs attitudes towards other people. This can be a gift, or it can be a weapon. And as a woman, I’m especially aware of the potential for the latter.

A Woman’s Work: The Inside Story

All artwork by Carolita Johnson.

Carolita Johnson | Longreads | April 2019 | 23 minutes (5,178 words)

The subject of my pre-doctoral studies was medieval nuns and their relationship to their menstrual cycles. Long story short: my theory was that this relationship was determined by the very real divide between early Christians who favored the Old Testament and those who favored the New Testament on the inherent “sinfulness,” or absence thereof, of the human body. The traditional, Old Testament attitude that menstruation made women “unclean” somehow prevailed. Fancy that. Call me crazy, but I had to believe that the way the Church, the Patriarchy, and all of society saw women’s bodily functions had an effect on women’s relationships with their bodies.

Stories of menstruating women ruining mirrors they looked into, or causing soufflés to fall, causing farm animals to miscarry, mayonnaise to “not take,” etc., and menstrual blood used as an ingredient in cures for leprosy or magic potions, were common. But even if they were all but forgotten by modern times, they merge easily into my being taught, in the 1980s, to call my period “The Curse.”

I’d noticed, in many hagiographies, that one of the first signs a woman might be a saint, besides experiencing ecstatic “visions,” was that she’d barely, if at all, need to eat or drink anymore, and her various bodily secretions would cease. I wondered if nuns might be using herbs, self-starvation, and/or physical exertion to put an end to their secretions, amongst which, their periods.

Compare this to how, in modern times, many women, including myself, would use The Pill without the classic 7-day pause in dosage to skip an inconveniently timed period. This pause was designed to give women on the Pill a “period” that was more symbolic than functional, almost more of a superstition, and totally unnecessary, medically speaking. Recent years have even seen the introduction of contraceptive pills actually designed to limit a woman to 0-4 periods a year — hormonally inducing amenorrhea, or absence of menstruation. There are times when women want to avoid having their periods, for example, during vacations, sports events (with the notable exception of Kiran Gandhi), honeymoons; in other words, times when we want to be at our best and free of physical impairments or, let’s be frank: free from the anxiety of being discovered menstruating. Some of us opt to be free from that anxiety year-round now. I think medieval nuns would have loved to have that option.


When Did Pop Culture Become Homework?

Kevin Winter / Getty, Collage by Homestead

Soraya Roberts | Longreads | April 2019 | 6 minutes (1,674 words)

I didn’t do my homework last weekend. Here was the assignment: Beyoncé’s Homecoming — a concert movie with a live album tie-in — the biggest thing in culture that week, which I knew I was supposed to watch, not just as a critic, but as a human being. But I didn’t. Just like I didn’t watch the premiere of Game of Thrones the week before, or immediately listen to Lizzo’s Cuz I Love You. Instead, I watched something I wanted to: RuPaul’s Drag Race. What worse place is there to hide from the demands of pop culture than a show about drag queens, a set of performance artists whose vocabulary is almost entirely populated by celebrity references? In the third episode of the latest season, Vietnamese contestant Plastique Tiara is dragged for her uneven performance in a skit about Mariah Carey, and her response shocks the judges. “I only found out about pop culture about, like, three years ago,” she says. To a comically sober audience, she then drops the biggest bomb of all: “I found out about Beyoncé legit four years ago.” I think Michelle Visage’s jaw might still be on the floor.

“This is where you all could have worked together as a group to educate each other,” RuPaul explains. It is the perfect framing of popular culture right now — as a rolling curriculum for the general populace, one that determines whether you make the grade as an informed citizen or not. It is reminiscent of an actual educational philosophy from the 1930s, essentialism, which was later adopted by E.D. Hirsch, the man who coined the term “cultural literacy” and defined it as “the network of information that all competent readers possess.” Essentialist education emphasizes standardized common knowledge for the entire population, which privileges the larger culture over individual creativity. Essentialist pop culture does the same thing, flattening our imaginations until we are all tied together by little more than the same vocabulary.

* * *

The year 1987 was when Aretha Franklin became the first woman inducted into the Rock and Roll Hall of Fame, the Simpson family arrived on television (via The Tracey Ullman Show), and Mega Man was released on Nintendo. It was also the year Hirsch published Cultural Literacy: What Every American Needs to Know. None of those three pieces of history were in it (though People published a list for the pop-culturally literate in response). At the back of Hirsch’s book, hundreds of words and quotes delineated the things Americans need to know — “Mary Had a Little Lamb (text),” for instance — which would be expanded 15 years later into a sort of CliffsNotes version of an encyclopedia for literacy signaling. “Only by piling up specific, communally shared information can children learn to participate in complex cooperative activities with other members of their community,” Hirsch wrote. He believed that allowing kids to bathe in their “ephemeral” and “confined” knowledge about The Simpsons, for instance, would result in some sort of modern Tower of Babel situation in which no one could talk to anyone about anything (other than, I guess, Krusty the Clown). This is where Hirsch becomes a bit of a cultural fascist. “Although nationalism may be regrettable in some of its worldwide political effects, a mastery of national culture is essential to mastery of the standard language in every modern nation,” he explained, later adding, “Although everyone is literate in some local, regional, or ethnic culture, the connection between mainstream culture and the national written language justifies calling mainstream culture the basic culture of the nation.”

Because I am not very well-read, the first thing I thought of when I found Hirsch’s book was that scene in Peter Weir’s 1989 coming-of-age drama Dead Poets Society. You know the one I mean, where the prep school teacher played by Robin Williams instructs his class to tear the entire introduction to Understanding Poetry (by the fictional author J. Evans Pritchard) out of their textbooks. “Excrement,” he calls it. “We’re not laying pipe, we’re talking about poetry.” As an alternative, he expects this class of teenagers to think for themselves. “Medicine, law, business, engineering, these are all noble pursuits, and necessary to sustain life,” he tells them. “But poetry, beauty, romance, love, these are what we stay alive for.” Neither Pritchard nor Hirsch appears to have subscribed to this sort of sentiment. And their approach to high culture has of late seeped into low culture. What was once a privileging of certain aspects of high taste has expanded into a privileging of certain “low” taste. Pop culture, traditionally maligned, now overcompensates, essentializing certain pieces of popular art as additional indicators of the new cultural literacy.

I’m not saying there are a bunch of professors at lecterns telling us to watch Game of Thrones, but there are a bunch of networks and streaming services that are doing that, and viewers and critics following suit, constantly telling us what we “have to” watch or “must” listen to or “should” read. Some people who are more optimistic than I am have framed this prescriptive approach as a last-ditch effort to preserve shared cultural experiences. “Divided by class, politics and identity, we can at least come together to watch Game of Thrones — which averaged 32.8 million legal viewers in season seven,” wrote Judy Berman in Time. “If fantasy buffs, academics, TV critics, proponents of Strong Female Characters, the Gay of Thrones crew, Black Twitter, Barack Obama, J. Lo, Tom Brady and Beyoncé are all losing their minds over the same thing at the same time, the demise of that collective obsession is worth lamenting — or so the argument goes.” That may sound a little extreme, but then presidential hopeful Elizabeth Warren blogs about Game of Thrones and you wonder.

Essentializing any form of art limits it, setting parameters on not only what we are supposed to receive, but how. As Wesley Morris wrote of our increasingly moralistic approach to culture, this “robs us of what is messy and tense and chaotic and extrajudicial about art.” Now, instead of approaching everything with a sense of curiosity, we approach with a set of guidelines. It’s like when you walk around a gallery with one of those audio tours held up to your ear, which is supposed to make you appreciate the art more fully, but instead tends to supplant any sort of discovery with one-size-fits-all analysis. With pop culture, the goal isn’t even that lofty. You get a bunch of white guys on Reddit dismantling the structure of a Star Wars trailer, for instance, reducing the conversation around it to mere mechanics. Or you get an exhaustive number of takes on Arya Stark’s alpha female sex scene in Game of Thrones. The latter, one of the most prestige-branded shows in recent memory, often occupies more web space than its storytelling deserves, precisely because that is what it’s designed to do. As Berman wrote, “Game of Thrones has flourished largely because it was set up to flourish — because the people who bankroll prestige television decided before the first season even went into production that this story of battles, bastards and butts was worth an episodic budget three times as large as that of the typical cable series.” In this way, HBO — and the critics and viewers who stan HBO — have turned this show into one of the essentials even if it’s not often clear why.

Creating art to dominate this discursive landscape turns that art into a chore — in other words, cultural homework. This is where people start saying things like, “Do I HAVE to watch Captain Marvel?” and “feeling a lot of pressure to read sally rooney!” and “do i have to listen to the yeehaw album?” This kind of coercion has been known to cause an extreme side effect — reactance, a psychological phenomenon in which a person who feels their freedom being constricted adopts a combative stance, turning a piece of art we might otherwise be neutral about into an object of derision. The Guardian’s Oliver Burkeman called it “cultural cantankerousness” and used another psychological concept, optimal distinctiveness theory, to further explain it. That term describes how people try to balance feeling included and feeling distinct within a social group. Burkeman, however, framed his reactance as a form of self-protective FOMO avoidance. “My irritation at the plaudits heaped on any given book, film or play is a way of reasserting control,” he wrote. “Instead of worrying about whether I should be reading Ferrante, I’m defiantly resolving that I won’t.” (This was written in 2016; if it were written now, I’m sure he would’ve used Rooney.)

***

Shortly after Beyoncé dropped Homecoming, her previous album, Lemonade, became available on streaming services. That one I have heard — a year after it came out. I didn’t write about it. I barely talked about it. No one wants to read why Beyoncé doesn’t mean much to me when there are a number of better critics who are writing about what she does mean to them and so many others (the same way there are smart, interested parties analyzing Lizzo and Game of Thrones and Avengers: Endgame and Rooney). I am not telling those people not to watch or listen to or read or find meaning there; I understand people have different tastes, that certain things are popular because they speak to us in a way other things haven’t. At the same time, I expect not to be told what to watch or listen to or read, because from what I see and hear around me, from what I read and who I talk to, I can define for myself what I need. After Lemonade came out, in a post titled “Actually,” Gawker’s Rich Juzwiak wrote, “It’s easier to explicate what something means than to illustrate what it does. If you want to know what it does, watch it or listen to it. It’s at your fingertips. … Right is right and wrong is wrong, but art at its purest defies those binaries.” In the same way, there is no art you have to experience, just as there is no art you have to not experience. There is only art — increasingly ubiquitous — and there is only you, and what happens between both of you is not for me to assign.

* * *

Soraya Roberts is a culture columnist at Longreads.

 

Just a Spoonful of Siouxsie

Illustration by Mark Wang

Alison Fields | Longreads | April 2019 | 14 minutes (3,609 words)

She showed up on an overcast Friday afternoon in January. She barreled into the driveway in an old mustard-gold Buick with a black vinyl top, its back dash decorated with plastic bats, novelty skulls, and dried flowers. She was wrapped in black sweaters, black tights, black boots. She wore clunky bracelets, loads of them on the outside of her sleeves. Her hair was long and henna red. She carried an Army surplus satchel pinned with old rhinestone brooches and Cure buttons. She was 19 years old. When I opened the front door and she smiled at me, I thought she was the most perfect person I’d ever seen.

“I’m Gwen,” she said. “I’m here to interview for the nanny job.”

That’s when I noticed the nose ring and I blubbered something incoherent, then apologized because I was both overwhelmed and mortified that someone this cool was going to come into my stupid house.


The Man Who’s Going to Save Your Neighborhood Grocery Store

Illustration by Vinnie Neuberg

Joe Fassler | The Counter & Longreads | April 2019 | 33 minutes (8,802 words)

This story is published in partnership with The Counter, with reporting supported by the 11th Hour Food and Farming Fellowship at the University of California, Berkeley.  


In 2014, Rich Niemann, president and CEO of the Midwestern grocery company Niemann Foods, made the most important phone call of his career. He dialed the Los Angeles office of Shook Kelley, an architectural design firm, and admitted he saw no future in the traditional grocery business. He was ready to put aside a century of family knowledge, throw away all his assumptions, completely rethink his brand and strategy — whatever it would take to carry Niemann Foods deep into the 21st century.

“I need a last great hope strategy,” he told Kevin Kelley, the firm’s cofounder and principal. “I need a white knight.”

Part square-jawed cattle rancher, part folksy CEO, Niemann is the last person you’d expect to ask for a fresh start. He’s spent his whole life in the business, transforming the grocery chain his grandfather founded in 1917 into a regional powerhouse with more than 100 supermarkets and convenience stores across four states. In 2014, he was elected chair of the National Grocers Association. It’s probably fair to say no one alive knows how to run a grocery store better than Rich Niemann. Yet Niemann was no longer sure the future had a place for stores like his.

He was right to be worried. The traditional American supermarket is dying. It’s not just Amazon’s purchase of Whole Foods, an acquisition that trade publication Supermarket News says marked “a new era” for the grocery business — or the fact that Amazon hopes to launch a second new grocery chain in 2019, according to a recent report from The Wall Street Journal, with a potential plan to scale quickly by buying up floundering supermarkets. Even in plush times, grocery is a classic “red ocean” industry, highly undifferentiated and intensely competitive. (The term summons the image of a sea stained with the gore of countless skirmishes.) Now, the industry’s stodgy old playbook — “buy one, get one” sales, coupons in the weekly circular — is hurtling toward obsolescence. And with new ways to sell food ascendant, legacy grocers like Rich Niemann are failing to bring back the customers they once took for granted. You no longer need grocery stores to buy groceries.

Niemann hired Kelley in the context of this imminent doom. The assignment: to conceive, design, and build the grocery store of the future. Niemann was ready to entertain any idea and invest heavily. And for Kelley, a man who’s worked for decades honing his vision for what the grocery store should do and be, it was the opportunity of a lifetime — carte blanche to build the working model he’s long envisioned, one he believes can save the neighborhood supermarket from obscurity.

Kevin Kelley, illustration by Vinnie Neuberg

Rich Niemann, illustration by Vinnie Neuberg

The store that resulted, Harvest Market, opened in 2016. It’s south of downtown Champaign, Illinois, out by the car dealerships and strip malls; 58,000 square feet of floor space mostly housed inside a huge, high-ceilinged glass barn. Its bulk calls to mind both the arch of a hayloft and the heavenward jut of a church. But you could also say it’s shaped like an ark, because it’s meant to survive an apocalypse.

Harvest Market is the anti-Amazon. It’s designed to excel at what e-commerce can’t do: convene people over the mouth-watering appeal of prize ingredients and freshly prepared food. The proportion of groceries sold online is expected to swell over the next five or six years, but Harvest is a bet that behavioral psychology, spatial design, and narrative panache can get people excited about supermarkets again. Kelley isn’t asking grocers to be more like Jeff Bezos or Sam Walton. He’s not asking them to be ruthless, race-to-the-bottom merchants. In fact, he thinks that grocery stores can be something far greater than we ever imagined — a place where farmers and their urban customers can meet, a crucial link between the city and the country.

But first, if they’re going to survive, Kelley says, grocers need to start thinking like Alfred Hitchcock.

* * *

Kevin Kelley is an athletic-looking man in his mid-50s, with a piercing hazel gaze that radiates thoughtful intensity. In the morning, he often bikes two miles to Shook Kelley’s office in Hollywood — a rehabbed former film production studio on an unremarkable stretch of Melrose Avenue, nestled between Bogie’s Liquors and a driving school. Four nights a week, he visits a boxing gym to practice Muay Thai, a form of martial arts sometimes called “the art of eight limbs” for the way it combines fist, elbow, knee, and shin attacks. “Martial arts,” Kelley tells me, “are a framework for handling the unexpected.” That’s not so different from his main mission in life: He helps grocery stores develop frameworks for the unexpected, too.

You’ve never heard of him, but then it’s his job to be invisible. Kelley calls himself a supermarket ghostwriter: His contributions are felt more than seen, and the brands that hire him get all the credit. Countless Americans have interacted with his work in intimate ways, but will never know his name. Such is the thankless lot of the supermarket architect.

A film buff equally fascinated by advertising and the psychology of religion, Kelley has radical theories about how grocery stores should be built, theories that involve terms like “emotional opportunity,” “brain activity,” “climax,” and “mise-en-scène.” But before he can talk to grocers about those concepts, he has to convince them of something far more elemental: that their businesses face near-certain annihilation and must change fundamentally to avoid going extinct.

“It is the most daunting feeling when you go to a grocery store chain, and you meet with these starched-white-shirt executives,” Kelley tells me. “When we get a new job, we sit around this table — we do it twenty, thirty times a year. Old men, generally. Don’t love food, progressive food. Just love their old food like Archie Bunkers, essentially. You meet these people and then you tour their stores. Then I’ve got to go convince Archie Bunker that there’s something called emotions, that there are these ideas about branding and feeling. It is a crazy assignment. I can’t get them to forget that they’re no longer in a situation where they’ve got plenty of customers. That it’s do-or-die time now.”

Forget branding. Forget sales. Kelley’s main challenge is redirecting the attention of older male executives, scared of the future and yet stuck in their ways, to the things that really matter.

“I make my living convincing male skeptics of the power of emotions,” he says.

Human beings, it turns out, aren’t very good at avoiding large-scale disaster. As you read this, the climate is changing, thanks to the destructively planet-altering activities of our species. The past four years have been the hottest on record. If the trend continues — and virtually all experts agree it will — we’re likely to experience mass disruptions on a scale never before seen in human history. Drought will be epidemic. The ocean will acidify. Islands will be swallowed by the sea. People could be displaced by the millions, creating a new generation of climate refugees. And all because we didn’t move quickly enough when we still had time.

You know this already. But I bet you’re not doing much about it — not enough, at least, to help avert catastrophe. I’ll bet your approach looks a lot like mine: worry too much, accomplish too little. The sheer size of the problem is paralyzing. Vast, systemic challenges tend to short-circuit our primate brains. So we go on, as the grim future bears down.

Grocers, in their own workaday way, fall prey to the same inertia. They got used to an environment of relative stability. They don’t know how to prepare for an uncertain future. And they can’t force themselves to behave as if the good times are really going to go away — even if, deep down, they know it’s true.


In the 1980s, you could still visit almost any community in the U.S. and find a thriving supermarket. Typically, it would be a dynasty family grocery store, one that had been in business for a few generations. Larger markets usually had two or three players, small chains that sorted themselves out along socioeconomic lines: fancy, middlebrow, thrifty. Competition was slack and demand — this is the beautiful thing about selling food — never waned. For decades, times were good in the grocery business. Roads and schools were named after local supermarket moguls, who often chaired their local chambers of commerce. “When you have that much demand, and not much competition, nothing gets tested. Kind of like a country with a military that really doesn’t know whether their bullets work,” Kelley says. “They’d never really been in a dogfight.”

It’s hard to believe now, but there was not a single Walmart on the West Coast until 1990. That decade saw the birth of the “hypermarket” and the beginning of the end for traditional grocery stores — Walmarts, Costcos, and Kmarts became the first aggressive competition supermarkets ever really faced, luring customers in with the promise of one-stop shopping on everything from Discmen to watermelon.

The other bright red flag: Americans started cooking at home less and eating out more. In 2010, Americans dined out more than in for the first time on record, the culmination of a slow shift away from home cooking that had been going on since at least the 1960s. That trend is likely to continue. According to a 2017 report from the USDA’s Economic Research Service, millennials shop at food stores less than any other age group, spend less time preparing food, and are more likely to eat carry-out, delivery, or fast food even when they do eat at home. But even within the shrinking market for groceries, competition has stiffened. Retailers not known for selling food increasingly specialize in it, a phenomenon called “channel blurring”; today, pharmacies like CVS sell pantry staples and packaged foods, while 99-cent stores like Dollar General are a primary source of groceries for a growing number of Americans. Then there’s e-commerce. Though only about 3 percent of groceries are currently bought online, that figure could rocket to 20 percent by 2025. From subscription meal-kit services like Blue Apron to online markets like FreshDirect and Amazon Fresh, shopping for food has become an increasingly digital endeavor — one that sidesteps traditional grocery stores entirely.

A cursory glance might suggest grocery stores are in no immediate danger. According to the data analytics company Inmar, traditional supermarkets still have a 44.6 percent market share among brick-and-mortar food retailers. And though a spate of bankruptcies has recently hit the news, there are actually more grocery stores today than there were in 2005. Compared to many industries — internet service, for example — the grocery industry is still a diverse, highly varied ecosystem. Forty-three percent of grocery companies have fewer than four stores, according to a recent USDA report. These independent stores sold 11 percent of the nation’s groceries in 2015, a larger collective market share than successful chains like Albertson’s (4.5 percent), Publix (2.25 percent), and Whole Foods (1.2 percent).

But looking at this snapshot without context is misleading — a little like saying that the earth can’t be warming because it’s snowing outside. Not long ago, grocery stores sold the vast majority of the food that was prepared and eaten at home — about 90 percent in 1988, according to Inmar. Today, their market share has fallen by more than half, even as groceries represent a diminished proportion of overall food sold. Their slice of the pie is steadily shrinking, as is the pie itself.

By 2025, the thinking goes, most Americans will rarely enter a grocery store. That’s according to a report called “Surviving the Brave New World of Food Retailing,” published by the Coca-Cola Retailing Research Council — a think tank sponsored by the soft drink giant to help retailers prepare for major changes. The report describes a retail marketplace in the throes of massive change, where supermarkets as we know them are functionally obsolete. Disposables and nonperishables, from paper towels to laundry detergent and peanut butter, will replenish themselves automatically, thanks to smart-home sensors that reorder when supplies are low. Online recipes from publishers like Epicurious will sync directly to digital shopping carts operated by e-retailers like Amazon. Impulse buys and last-minute errands will be fulfilled via Instacart and whisked over in self-driving Ubers. In other words, food — for the most part — will be controlled by a small handful of powerful tech companies.

The Coca-Cola report, written in consultation with a handful of influential grocery executives, including Rich Niemann, acknowledges that the challenges are dire. To remain relevant, it concludes, supermarkets will need to become more like tech platforms: develop a “robust set of e-commerce capabilities,” take “a mobile-first approach,” and leverage “enhanced digital assets.” They’ll need infrastructure for “click and collect” purchasing, allowing customers to order online and pick up in a jiffy. They’ll want to establish a social media presence, as well as a “chatbot strategy.” In short, they’ll need to become Amazon, and they’ll need to do it all while competing with Walmart — and its e-commerce platform, Jet.com — on convenience and price.

That’s why Amazon’s acquisition of Whole Foods Market was terrifying to so many grocers, sending the stocks of national chains like Kroger tumbling: It represents a future they can’t really compete in. Since August 2017, Amazon has masterfully integrated e-commerce and physical shopping, creating a muscular hybrid that represents an existential threat to traditional grocery stores. The acquisition was partially a real estate play: Whole Foods stores with Prime lockers now act as a convenient pickup depot for Amazon goods. But Amazon’s also doing its best to make it too expensive and inconvenient for its Prime members, who pay $129 a year for free two-day shipping and a host of other perks, to shop anywhere else. Prime members receive additional 10 percent discounts on select goods at Whole Foods, and Amazon is rolling out home grocery delivery in select areas. With the Whole Foods acquisition, then, Amazon cornered two markets: the thrift-driven world of e-commerce and the pleasure-seeking universe of high-end grocery. Order dish soap and paper towels in bulk on Amazon, and pick them up at Whole Foods with your grass-fed steak.

Traditional grocers are now expected to offer the same combination of convenience, flexibility, selection, and value. They’re understandably terrified by this scenario, which would require fundamental, complex, and very expensive changes. And Kelley is terrified of it, too, though for a different reason: He simply thinks it won’t work. In his view, supermarkets will never beat Walmart and Amazon at what they do best. If they try to succeed by that strategy alone, they’ll fail. That prospect keeps Kelley up at night because it could mean a highly consolidated marketplace overseen by just a handful of players, one in stark contrast to the regional, highly varied food retail landscape America enjoyed throughout the 20th century.

“I’m afraid of what could happen if Walmart and Amazon and Lidl are running our food system, the players trying to get everything down to the lowest price possible,” he tells me. “What gives me hope is the upstarts who will do the opposite. Who aren’t going to sell convenience or efficiency, but fidelity.”

The approach Kelley’s suggesting still means completely overhauling everything, with no guarantee of success. It’s a strategy that’s decidedly low-tech, though it’s no less radical. It’s more about people than new platforms. It means making grocery shopping more like going to the movies.

* * *

Nobody grows up daydreaming about designing grocery stores, including Kelley. As a student at the University of North Carolina at Charlotte, he was just like every other architect-in-training: He wanted to be a figure like Frank Gehry, building celebrated skyscrapers and cultural centers. But he came to feel dissatisfied with the culture of his profession. In his view, architects coldly fixate on the aesthetics of buildings and aren’t concerned enough with the people inside.

“Architecture worships objects, and Capital-A architects are object makers,” Kelley tells me. “They aren’t trying to fix social issues. People and their experience and their perceptions and behaviors don’t matter to them. They don’t even really want people in their photographs—or if they have to, they’ll blur them out.” What interested Kelley most was how people would use his buildings, not how the structures would fit into the skyline. He wanted to shape spaces in ways that could actually affect our emotions and personalities, bringing out the better angels of our nature. To his surprise, no one had really quantified a set of rules for how environment could influence behavior. Wasn’t it strange that advertising agencies spent so much time thinking about the links between storytelling, emotions, and decision-making — while commercial spaces, the places where we actually go to buy, often had no design principle beyond brute utility?

“My ultimate goal was to create a truly multidisciplinary firm that was comprised of designers, social scientists and marketing types,” he says. “It was so unorthodox and so bizarrely new in terms of approach that everyone thought I was crazy.”

In 1992, when he was 28, Kelley cofounded Shook Kelley with the Charlotte, North Carolina–based architect and urban planner Terry Shook. Their idea was to offer a suite of services that bridged social science, branding, and design, a new field they called “perception management.” They were convinced space could be used to manage emotion, just the way cinema leads us through a guided sequence of feelings, and wanted to turn that abstract idea into actionable principles. While Shook focused on bigger, community-oriented spaces like downtown centers and malls, Kelley focused on the smaller, everyday commercial spaces overlooked by fancy architecture firms: dry cleaners, convenience stores, eateries, bars. One avant-garde restaurant Kelley designed in Charlotte, called Props, was an homage to the sitcom craze of the 1990s. It was built to look like a series of living rooms, based on the apartment scenes in shows like Seinfeld and Friends and featured couches and easy chairs instead of dining tables to encourage guests to mingle during dinner.

The shift to grocery stores didn’t happen until a few years later, almost by accident. In the mid-’90s, Americans still spent about 55 percent of their food dollars on meals eaten at home — but that share was declining quickly enough to concern top corporate brass at Harris Teeter, a Charlotte, North Carolina–based grocery chain with stores throughout the Southeastern United States. (Today, Harris Teeter is owned by Kroger, the country’s second-largest seller of groceries behind Walmart.) Harris Teeter execs reached out to Shook Kelley. “We hear you’re good with design, and you’re good with food,” Kelley remembers Harris Teeter reps saying. “Maybe you could help us.”

At first, it was Terry Shook’s account. He rebuilt each section of the store into a distinct “scene” that reinforced the themes and aesthetics of the type of food it sold. The deli counter became a mocked-up urban delicatessen, complete with awning and neon sign. The produce section resembled a roadside farmstand. The dairy cases were corrugated steel silos, emblazoned with the logo of a local milk supplier. And he introduced full-service cafés, a novelty for grocery stores at the time, with chrome siding like a vintage diner. It was pioneering work, winning that year’s Outstanding Achievement Award from the International Interior Design Association — according to Kelley, it was the first time the prestigious award had ever been given to a grocery store.

Shook backed off of grocery stores after launching the new Harris Teeter, but the experience sparked Kelley’s lifelong fascination with grocery stores, which he realized were ideal proving grounds for his ideas about design and behavior. Supermarkets contain thousands of products, and consumers make dozens of decisions inside them — decisions about health, safety, family, and tradition that get to the core of who they are. He largely took over the Harris Teeter account and redesigned nearly 100 of the chain’s stores, work that would go on to influence the way the industry saw itself and ultimately change the way stores are built and navigated.

Since then, Kelley has worked to show grocery stores that they don’t have to worship at the altar of supply-side economics. He urges grocers to appeal instead to our humanity. Kelley asks them to think more imaginatively about their stores, using physical space to evoke nostalgia, delight our senses, and appeal to the parts of us motivated by something bigger and more generous than plain old thrift. Shopping, for him, is all about navigating our personal hopes and fears, and grocery stores will only succeed when they play to those emotions.

When it works, the results are dramatic. Between 2003 and 2007, Whole Foods hired Shook Kelley for brand strategy and store design, working with the firm throughout a crucial period of the chain’s development. The fear was that as Whole Foods grew, its image would become too diffuse, harder to differentiate from other health food stores; at the same time, the company wanted to attract more mainstream shoppers. Kelley’s team was tasked with finding new ways to telegraph the brand’s singular value. Their solution was a hierarchical system of signage that would streamline the store’s crowded field of competing health and wellness claims.

Kelley’s view is that most grocery stores are “addicted” to signage, cramming their spaces with so many pricing details, promotions, navigational signs, ads, and brand assets that it “functionally shuts down [the customer’s] ability to digest the information in front of them.”

Kelley’s team stipulated that Whole Foods could have only seven layers of information, ranging from evocative signage 60 feet away to descriptive displays six feet from customers to promotional info just six inches from their hands. Everything else was “noise,” jettisoned from the stores entirely. If you’ve ever shopped at Whole Foods, you probably recognize the way the store’s particular brand of feel-good, hippie sanctimony seems to permeate your consciousness at every turn. Kelley helped invent that. The system he created for pilot stores in Princeton, New Jersey, and Louisville, Kentucky, was scaled throughout the chain and is still in use today, he says. (Whole Foods did not respond to requests for comment for this story.)

With a carefully delineated set of core values guiding its purchasing and brand, Whole Foods was ripe for the kind of visual overhaul Kelley specializes in. But most regional grocery chains have a different set of problems: They don’t really have values to telegraph in the first place. Shook Kelley’s approach is about getting buttoned-down grocers to reflect on their beliefs, tapping into deeper, more primal reasons for wanting to sell food.

* * *

Today, Kelley and his team have developed a playbook for clients, a finely tuned process to get shoppers to think in terms that go beyond bargain-hunting. It embraces what he calls “the theater of retail” and draws inspiration from an unlikely place: the emotionally laden visual language of cinema. His goal is to convince grocers to stop thinking like Willy Loman — like depressed, dejected salesmen forever peddling broken-down goods, fixated on the past and losing touch with the present. In order to survive, Kelley says, grocers can’t be satisfied with providing a place to complete a chore. They’ll need to direct an experience.




Today’s successful retail brands establish what Kelley calls a “brand realm,” or what screenwriters would call a story’s “setting.” We don’t usually think consciously about them, but realms subtly shape our attitude toward shopping the same way the foggy, noirishly lit streets in a Batman movie tell us something about Gotham City. Cracker Barrel is set in a nostalgic rural house. Urban Outfitters is set on a graffitied urban street. Tommy Bahama takes place on a resort island. It’s a well-known industry secret that Costco stores are hugely expensive to construct — they’re designed to resemble fantasy versions of real-life warehouses, and the appearance of thrift doesn’t come cheap. Some realms are even more specific and fanciful: Anthropologie is an enchanted attic, complete with enticing cupboards and drawers. Trader Joe’s is a crew of carefree, hippie traders shipping bulk goods across the sea. A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.

Kelley takes this a few steps further. The Shook Kelley team, which includes a cultural anthropologist with a Ph.D., begins by conducting interviews with executives, staff, and locals, looking for the storytelling hooks they call “emotional opportunities.” These can stem from core brand values, but they often revolve around the most intense, place-specific feelings locals have about food. Then Kelley finds ways to place those emotional opportunities inside a larger realm with an overarching narrative, helping retailers tell those stories — not with shelves of product, but through a series of affecting “scenes.”

In Alberta, Canada, Shook Kelley redesigned a small, regional grocery chain now called Freson Bros. Fresh Market. In interviews, the team discovered that meat-smoking is a beloved pastime there, so Shook Kelley built huge, in-store smokers at each new location — a scene called “Banj’s Smokehouse” — that crank out pound after pound of the province’s signature beef, as well as elk, deer, and other kinds of meat (customers can even BYO meat to be smoked in-house). Kelley also designed stylized root cellars in each produce section, a cooler, darker corner of each store that nods to the technique Albertans use to keep vegetables fresh. These elements aren’t just novel ways to taste, touch, and buy. They reference cultural set points, triggering memories and personal associations. Kelley uses these open, aisle-less spaces, which he calls “perceptual rooms,” to draw customers through an implied sequence of actions, tempting them towards a specific purchase.

Something magical happens when you engage customers this way. Behavior changes in visible, quantifiable ways. People move differently. They browse differently. And they buy differently. Rather than progressing in a linear fashion, the way a harried customer might shoot down an aisle — Kelley hates aisles, which he says encourage rushed, menial shopping — customers zig-zag, meander, revisit. These behaviors are a sign a customer is “experimenting,” engaging with curiosity and pleasure rather than just trying to complete a task. “If I was doing a case study presentation to you, I would show you exact conditions where we don’t change the product, the price, the service. We just change the environment and we’ll change the behavior,” Kelley tells me. “That always shocks retailers. They’re like ‘Holy cow.’ They don’t realize how much environment really affects behavior.”


In the mid-2000s, Nabisco approached Kelley’s firm, complaining that sales were down 16 percent in the cookie-and-cracker aisle. In response, Shook Kelley designed “Mom’s Kitchen,” which was piloted at Buehler’s, a 15-store chain in northern Ohio. Kelley took Nabisco’s products out of the center aisles entirely and installed them in a self-contained zone: a perceptual room built out to look like a nostalgic vision of suburban childhood, all wooden countertops, tile, and hanging copper pans. Shelves of Nabisco products from Ritz Crackers to Oreos lined the walls. Miniature packs of Animal Crackers waited in a large bowl; drawers opened to reveal boxes of Saltines. The finishing touch had nothing to do with Nabisco and everything to do with childhood associations: Kelley had the retailers install fridge cases filled with milk, backlit and glowing. Who wants to eat Oreos without a refreshing glass of milk to wash them down?

The store operators weren’t sold. They found it confusing and inconvenient to stock milk in two places at once. But from a sales perspective, the experiment was a smash. Sales of Nabisco products increased by as much as 32 percent, and the entire cookie-and-cracker segment experienced a halo effect, seeing double-digit jumps. Then, the unthinkable: The stores started selling out of milk. They simply couldn’t keep it on the shelves.

You’d think that the grocery stores would be thrilled, that it would have them scrambling to knock over their aisles of goods, building suites of perceptual rooms. Instead, they retreated. Nabisco’s parent company at the time, Kraft, was excited by the results and kicked the idea over to a higher-up corporate division, where it stalled. And Buehler’s, for its part, never did anything to capitalize on its success. When Nabisco took the “Mom’s Kitchen” displays down, Kelley says, the stores didn’t replace them.

Mom’s Kitchen, fully stocked. (Photo by Tim Buchman)

“We were always asking a different question: What is the problem you’re trying to solve through food?” Kelley says. “It’s not just a refueling exercise — instead, what is the social, emotional issue that food is solving for us? We started trying to work that into grocery. But we probably did it a little too early, because they weren’t afraid enough.”

Since then, Kelley has continued to make his case, with mixed success, to largely unreceptive audiences of male executives. He tells them that when customers experiment — when the process of sampling, engaging, interacting, and evaluating an array of options becomes a source of pleasure — they tend to take more time shopping. And that the more time customers spend in-store, the more they buy. In the industry, this all-important metric is called “dwell time.” Most retail experts agree that increasing dwell without increasing frustration (say, with long checkout times) will be key to the survival of brick-and-mortar retail. Estimates vary on how much dwell time increases sales; according to Davinder Jheeta, creative brand director of the British supermarket Simply Fresh, customers spent 1.3 percent more for every 1 percent increase in dwell time in 2015.

Another way to increase dwell time? Offer prepared foods. Delis, cafes, and in-store restaurants increase dwell time and facilitate pleasure while operating with much higher profit margins and recapturing some of the dining-out dollar that grocers are now losing. “I tell my clients, ‘In five years, you’re going to be in the restaurant business,’” Kelley says, “‘or you’re going to be out of business.’”

Kelley’s job, then, is to use design in ways that get customers to linger, touch, taste, scrutinize, explore. The stakes are high, but the ambitions are startlingly low. Kelley often asks clients what he calls a provocative question: Rather than trying to bring in new customers, would it solve their problems if 20 percent of customers increased their basket size by just two dollars? The answer, he says, is typically an enthusiastic yes.

Just two more dollars per trip for every fifth customer — that’s what victory looks like. And failure? That looks like a food marketplace dominated by Walmart and Amazon, a world where the neighborhood supermarket is a thing of the past.

* * *

When Shook Kelley started working on Niemann’s account, things began the way they always did: looking for emotional opportunities. But the team was stumped. Niemann’s stores were clean and expertly run. There was nothing wrong with them. Niemann’s problem was that he had no obvious problem. There was no there there.

Many of the regionals Kelley works with have no obvious emotional hook; all they know is that they’ve sold groceries for a long time and would like to keep on selling them. When he asks clients what they believe in, they show him grainy black-and-white photos of the stores their parents and grandparents ran, but they can articulate little beyond the universal goal of self-perpetuation. So part of Shook Kelley’s specialty is locating the distinguishing spark in brands that do nothing especially well, which isn’t always easy. At Buehler’s Fresh Foods, the chain where “Mom’s Kitchen” was piloted, the store’s Shook Kelley–supplied emotional theme is “Harnessing the Power of Nice.”

Still, Niemann Foods was an especially challenging case. “We were like, ‘Is there any core asset here?’” Kelley told me. “And we were like, ‘No. You really don’t have anything.’”

What Kelley noticed most was how depressed Niemann seemed, how gloomy about the fate of grocery stores in general. Nothing excited him — with one exception. Niemann runs a cattle ranch, a family operation in northeast Missouri. “Whenever he talked about cattle and feed and antibiotics and meat qualities, his physical body would change. We’re like, ‘My god. This guy loves ranching.’ He only had three hundred cattle or something, but he had a thousand pounds of interest in it.”

Niemann’s farm now has about 600 cattle, though it’s still more hobby farm than full-time gig — but it ended up being a revelation. During an early phase of the process, someone brought up “So God Made a Farmer” — a speech radio host Paul Harvey gave at the 1978 Future Farmers of America Convention that had been used in an ad for Ram trucks in the previous year’s Super Bowl. It’s a short poem that imagines the eighth day of the biblical creation, where God looks down from paradise and realizes his new world needs a caretaker. What kind of credentials is God looking for? Someone “willing to get up before dawn, milk cows, work all day in the fields, milk cows again, eat supper and then go to town and stay past midnight at a meeting of the school board.” God needs “somebody willing to sit up all night with a newborn colt. And watch it die. Then dry his eyes and say, ‘Maybe next year.’” God needs “somebody strong enough to clear trees and heave bales, yet gentle enough to yean lambs and wean pigs and tend the pink-combed pullets, who will stop his mower for an hour to splint the broken leg of a meadowlark.” In other words, God needs a farmer.

Part denim psalm, part Whitmanesque catalogue, it’s a quintessential piece of Americana — hokey and humbling like a Norman Rockwell painting, and a bit behind the times (of course, the archetypal farmer is male). And when Kelley’s team played the crackling audio over the speakers in a conference room in Quincy, Illinois, something completely unexpected happened. Something that convinced Kelley that his client’s stores had an emotional core after all, one strong enough to provide the thematic backbone for a new approach to the grocery store.

Rich Niemann, the jaded supermarket elder statesman, broke down and wept.

* * *

I have never been a fan of shopping. Spending money stresses me out. I worry too much to enjoy it. So I wanted to see if a Kelley store could really be what he said it was, a meaningful experience, or if it would just feel fake and hokey. You know, like the movies. When I asked if there was one store I could visit to see his full design principles in action, he told me to go to Harvest, the store he calls the most interesting in America.

Champaign is two hours south of O’Hare by car. Crossing Illinois’s vast landscape of unrelenting farmland, you appreciate the sheer scale of the state, how far its lower half is from Chicago. Champaign is a college town, which comes with the usual trappings — progressive politics, cafes and bars, young people lugging backpacks with their earbuds in — but you forget that fast outside the city limits. In 2016, some townships in Champaign County voted for Donald Trump over Hillary Clinton by 50 points.

I was greeted in the parking lot by Gerry Kettler, Niemann Foods’ director of consumer affairs. Vintage John Deere tractors formed a caravan outside the store. The shopping cart vestibules were adorned with images of huge combines roving across fields of commodity crops. Outside the wide-mouthed entryway, local produce waited in picket-fence crates — in-season tomatoes from Johnstonville, sweet onions from Warrensburg.

And then we stepped inside.

Everywhere, sunlight poured in through the tall glass facade, illuminating a sequence of discrete, airy, and largely aisle-less zones. Kettler bounded around the store, pointing out displays with surprised joy on his face, as if he couldn’t believe his luck. The flowers by the door come from local growers like Delight Flower Farm and Illinois Willows. “Can’t keep this shit in stock,” he said. He made me hold an enormous jackfruit to admire its heft. The produce was beautiful, with more local options than I’d ever seen in a grocery store. The Warrensburg sweet corn was eye-poppingly cheap: two bucks a dozen. There were purple broccolini and clamshells filled with squash blossoms, a delicacy so temperamental that it’s rarely sold outside of farmers’ markets. Early on, staff had to explain to some teenage cashiers what squash blossoms were; the cashiers had never seen them before.

I started to sense the “realm” Harvest inhabits: a distinctly red-state brand of America, local food for fans of faith and the free market. It’s hunting gear. It’s Chevys. It’s people for whom commercial-scale pig barns bring back memories of home. Everywhere, Shook Kelley signage — a hierarchy of cues like what Kelley dreamed up for Whole Foods — drives the message home. A large, evocative sign on the far wall reads Pure Farm Flavor, buttressed by the silhouettes of livestock, so large it almost feels subliminal. Folksy slogans hang on the walls, sayings like FULL OF THE MILK OF HUMAN KINDNESS and THE CREAM ALWAYS RISES TO THE TOP.

Then there are the informational placards that point out suppliers and methods.

There are at least a half dozen varieties of small-batch honey; you can find pastured eggs for $3.69. The liquor section includes local selections, like whiskey distilled in DeKalb, and a display of cutting boards made from local wood by Niemann Foods’ HR manager. “Turns out we had some talent in our backyard,” Kettler said. Niemann’s willingness to look right under his nose, sidestepping middleman distributors to offer reasonably priced local goods, is a hallmark of Harvest Market.

That shortened chain of custody is only possible because of Niemann and the lifetime of supply-side know-how he brings to the table. But finding ways to offer better, more affordable food has been a long-term goal of Kelley’s — one that strained his relationship with Whole Foods CEO John Mackey. As obsessed as Kelley is with appearances, he insists to me that his work must be grounded in something “real”: that grocery stores only succeed when they genuinely try to make the world a better place through food. In his view, Whole Foods wasn’t doing enough to address its notoriously high prices — opening itself up to be undercut by cheaper competition, and missing an ethical opportunity to make better food available to more people.

“When,” Kelley remembers asking, “did you start to mistake opulence for success?”

In Kelley’s telling, demand slackened so much during the Great Recession that it nearly led to Whole Foods’ downfall, a financial setback the company never fully recovered from — and, one could argue, one that ultimately led to its acquisition. Harvest Market, for its part, has none of Whole Foods’ clean-label sanctimony. It takes an “all-of-the-above” approach: There’s local produce, but there are also Oreos and Doritos and Coca-Cola; at Thanksgiving, you can buy a pastured turkey from Triple S Farms or a 20-pound Butterball. But its strong emphasis on making local food more accessible and affordable makes it an interesting counterpart to Kelley’s former client.

The most Willy Wonka–esque touch is the hulking piece of dairy processing equipment in a glass room by the cheese case. It’s a commercial-scale butter churner — the first one ever, Kettler told me, to grace the inside of a grocery store.

“So this was a Shook Kelley idea,” he said. “We said yes, without knowing how much it would cost. And the costs just kept accelerating. But we’re thrilled. People love it.”

Harvest Market isn’t just a grocery store — it’s also a federally inspected dairy plant. The store buys sweet cream from a local dairy, which it churns into house-made butter, available for purchase by the brick and used throughout Harvest’s bakery and restaurant. The butter sells out as fast as they can make it. Unlike the grocers who objected to “Mom’s Kitchen,” the staff don’t seem to mind.

As I walked through the store, I couldn’t help wondering how impressed I really was. I found Harvest to be a beautiful example of a grocery store, no doubt, and a very unusual one. What was it that made me want to encounter something more outrageous, more radical, more theatrical and bizarre? I wanted animatronic puppets. I wanted fog machines.

I should have known better — Kelley had warned me that you can’t take the theater of retail too far without breaking the dream. He’d told me that he admires stores where “you’re just not even aware of the wonder of the scene, you’re just totally engrossed in it” — stores a universe away from the overwrought, hokey feel of Disneyland. But I had Amazon’s new stores in the back of my mind as a counterpoint, with all their cashierless bells and whistles, their click-and-collect pickup, the chance to test-drive Alexa and play a song or switch on a fan. I guess, deep down, I was wondering if something this subtle really could work.

“Here, this is Rich Niemann,” Kettler said, and I found myself face-to-face with Niemann himself. We shook hands and he asked if I’d ever been to Illinois before. Many times, I told him. My wife is from Chicago, so we’ve visited the city often.

He grinned at me.

“That’s not Illinois,” he said.

We walked to Harvest’s restaurant, a 40-person seating area plus an adjacent bar with a row of stools, which offers standards like burgers, salads, and flatbreads. There’s an additional 80-person seating area on the second-floor mezzanine, a simulated living room complete with couches and board games. Beyond that, they pointed out the brand-new wine bar — open, like the rest of the space, until midnight. There’s a cooking classroom by the corporate offices; through the window, I saw a classroom full of children doing something to vegetables. Adult cooking classes run two or three nights every week, plus special events for schools and other groups.

For a summer weekday at noon in a grocery store, I was amazed by how many people were eating and working on laptops. One guy had his machine hooked up to a full-sized monitor he’d lugged up the stairs — he’d made a customized wooden piece that hooks into Harvest’s wrought-iron support beams to create a platform for his plus-size screen. He comes every day, like it’s his office. He’s a dwell-time dream.

We sat down, and Kettler insisted I eat the corn first, slathered with the house-made butter, while it was still hot. He reminded me that it was grown by the Maddoxes, a family in Warrensburg, about 50 miles west of Champaign.

The corn was good, but I wanted to ask Niemann if the grocery industry was really in such bad shape, and he told me it was. I assumed he’d want to talk about Amazon and its acquisition of Whole Foods and the way e-commerce has changed the game. He acknowledged all that, but to my surprise he said the biggest factor is something else entirely: a massive shift happening in the world of consumer packaged goods, or CPGs.

For years, grocery stores never had to advertise, because the largest companies in the world — Procter & Gamble, Coca-Cola, Nestlé — did their advertising for them, just the way Nabisco helped finance “Mom’s Kitchen” to benefit the stores. People came to supermarkets to buy the foods they saw on TV. But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.

When their sales flag, grocery sales do too — and the once-bulletproof alliance between food brands and supermarkets is splitting. Over the past two years, the Grocery Manufacturers Association, an influential trade group representing the biggest food companies in the world, has been losing members. It began with the Campbell Soup Company. Dean Foods, Mars, Tyson Foods, Unilever, the Hershey Company, the Kraft Heinz Company, and others followed. That profound betrayal was a rude awakening: CPG companies don’t need grocery stores. They have Amazon. They can sell directly through their websites. They can launch their own pop-ups.

It was only then that I realized how dire the predicament of grocery stores really is, and why Niemann was so frustrated when he first called Kevin Kelley. It’s one thing when you can’t sell as cheaply and conveniently as your competitors. It’s another thing entirely when no one wants what you’re selling.

Harvest doesn’t feel obviously futuristic in the way an Amazon store might. If I went there as a regular shopper and not as a journalist sniffing around for a story, I’m sure I’d find it to be a lovely and transporting way to buy food. But what’s going on behind the scenes is, frankly, unheard of.

Grocery stores have two ironclad rules. First, grocers set the prices, and farmers do what they can within those mandates. And second, everyone works with distributors who oversee the aggregation and transport of all goods. Harvest has traditional relationships with companies like Coca-Cola, but it breaks both rules with local farmers and foodmakers. Suppliers — from the locally milled wheat to the local produce to the Kilgus Farms sweet cream that goes into the churner — truck their products right to the back. By avoiding middlemen and their surcharges, Harvest is able to pay suppliers more and still charge customers less. You can still find $4.29 pints of Halo Top ice cream in the freezer, but the produce section features stunning bargains. When the Maddox family pulls up with its latest shipment of corn, people sometimes start buying it off the back of the truck in the parking lot. At the same time, suppliers get to set their own prices: Niemann’s suppliers tell him what they need to charge, and Niemann adds a standard margin and lets customers decide if they’re willing to pay. That’s a massive change, and it’s virtually unheard of in supermarkets.

If there’s a reason Harvest matters, it’s only partly because of the aesthetics. It’s mainly because the model of what a grocery store is has been tossed out and rebuilt. And why not? The world as Rich Niemann knows it is ending.

* * *

In 2017, just months after Harvest Market’s opening, Niemann won the Thomas K. Zaucha Entrepreneurial Excellence Award — the National Grocers Association’s top honor, given for “persistence, vision, and creative entrepreneurship.” That spring, Harvest was spotlighted in a “Store of the Month” cover feature in the influential trade magazine Progressive Grocer. Characteristically, the contributions of Kelley and his firm were not mentioned in the piece.

Niemann tells me his company is currently planning to open a second Harvest Market in Springfield, Illinois, about 90 minutes west of Champaign, in 2020. Without sharing specifics about profitability or sales numbers, he says the store has been everything he’d hoped it would be on the metrics that matter most — year-over-year sales growth and customer engagement. His only complaint about the store has to do with parking. For years, Niemann has relied on the same golden ratio to determine the size of the parking lot his stores need — a certain number of spots for every thousand dollars of expected sales. Harvest’s lot uses the same logic, and it’s nowhere near enough space.

“In any grocery store, the customer’s first objective is pantry fill — to take care of my needs as best I can on my budget,” Niemann says. “But we created a different atmosphere. These customers want to talk. They want to know. They want to experience. They want to taste. They’re there because it’s an adventure.”

They stay so much longer than expected that the parking lot sometimes struggles to fit all their cars at once. Unlike the Amazon stores that may soon be cropping up in a neighborhood near you — reportedly, the company is considering plans to open 3,000 of them by 2021 — it’s not about getting in and out quickly without interacting with another human being. At Harvest, you stay awhile. And that’s the point.

But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.

So far, Harvest’s success hasn’t made it any easier for Kelley, who still struggles to persuade clients to make fundamental changes. They’re still as scared as they’ve always been, clinging to the same old ideas. He tells them that, above all else, they need to develop a food philosophy — a reason why they do this in the first place, something that goes beyond mere nostalgia or the need to make money. They need to build something that means something, a store people return to not just to complete a task but because it somehow sustains them. For some, that’s too tall an order. “They go, ‘I’m not going to do that.’ I’m like, ‘Then what are you going to do?’ And they literally tell me: ‘I’m going to retire.’” It’s easier to cash out. Pass the buck, and consign the fate of the world to younger people with bolder dreams.

Does it even matter? The world existed before supermarkets, and it won’t end if they vanish. And in the ongoing story of American food, the 20th-century grocery store is no great hero. A&P — the once titanic chain, now itself defunct — was a great mechanizer, undercutting the countless smaller, local businesses that used to populate the landscape. More generally, the supermarket made it easier for Americans to distance ourselves from what we eat, shrouding food production behind a veil and letting us convince ourselves that price and convenience matter above all else. We let ourselves be satisfied with the appearance of abundance — even if great stacks of unblemished fruit contribute to waste and spoilage, even if the array of brightly colored packages is owned by the same handful of multinational corporations.

But whatever springs up to replace grocery stores will have consequences, too, and the truth is that brick-and-mortar is not going away any time soon — far from it. Instead, the most powerful retailers in the world have realized that physical spaces have advantages they want to capitalize on. It’s not just that stores in residential neighborhoods work well as distribution depots, ones that help facilitate the home delivery of packages. And it’s not just that we can’t always be home to pick up the shipments we ordered when they arrive, so stores remain useful. The world’s biggest brands are now beginning to realize what Kelley has long argued: Physical stores are a way to capture attention, to subject customers to an experience, to influence the way they feel and think. What could be more useful? And what are Amazon’s proposed cashierless stores, but an illustration of Kelley’s argument? They take a brand thesis, a set of core values — that shopping should be quick and easy and highly mechanized — and seduce us with it, letting us feel the sweep and power of that vision as we pass with our goods through the doors without paying, flushed with the thrill a thief feels.

This is where new troubles start. Only a few companies in the world will be able to compete at Amazon’s scale — the scale where building 3,000 futuristic convenience stores in three years may be a realistic proposition. Unlike in the golden age of grocery, when different family-owned chains catered to different demographics, we’ll have only a handful of players. We’ll have companies that own the whole value chain, low to high. Amazon owns the e-commerce site where you can find almost anything in the world for the cheapest price. And for when you want to feel the heft of an heirloom tomato in your hand or sample some manchego before buying, there is Whole Foods. Online retail for thrift, in-person shopping for pleasure. Except one massive company now owns them both.

If this new landscape comes to dominate, we may find there are things we miss about the past. For all its problems, the grocery industry is at least decentralized, owned by no one dominant company and carved up into more players than you could ever count. It’s run by people who often live alongside the communities they serve and share their concerns. We might miss that competition, that community. They are small. They are nimble. They are independently, sometimes even cooperatively, owned. They employ people. And if they are scrappy, and ingenious, and willing to change, there’s no telling what they might do. It is not impossible that they could use their assets — financial resources, industry connections, prime real estate — to find new ways to supply what we all want most: to be happier, to be healthier, to feel more connected. To be better people. To do the right thing.

I want to believe that, anyway. That stores — at least in theory — could be about something bigger, and better than mere commerce. The way Harvest seems to want to be, with some success. But I wonder if that’s just a fantasy, too: the dream that we can buy and sell our way to a better world, that it will take no more than that.

Which one is right?

I guess it depends on how you feel about the movies.

Maybe a film is just a diversion, a way to feel briefly better about our lives, the limitations and disappointments that define us, the things we cannot change. Most of us leave the theater, after all, and just go on being ourselves.

Still, maybe something else is possible. Maybe it’s in the moment when the music swells, and our hearts beat faster, and we feel overcome by the beauty of an image — in the instant when we feel newly brave and noble, ready to be different, braver versions of ourselves — that we are who we really are.

* * *

Joe Fassler, The Counter’s deputy editor, has covered the intersection of food, policy, technology, and culture for the magazine since 2015. His food reporting has twice been a finalist for the James Beard Foundation Award in Journalism. He’s also editor of Light the Dark: Writers on Creativity, Inspiration, and the Creative Process (Penguin, 2017), a book based on “By Heart,” his ongoing series of literary conversations for The Atlantic.

Editor: Michelle Weber
Fact checker: Matt Giles
Copy editor: Jacob Z. Gross

The American Worth Ethic

Getty / Photo Illustration by Longreads

Bryce Covert | Longreads | April 2019 | 13 minutes (3,374 words)

“The American work ethic, the motivation that drives Americans to work longer hours each week and more weeks each year than any of our economic peers, is a long-standing contributor to America’s success.” Thus reads the first sentence of a massive report the Trump administration released in July 2018. Americans’ drive to work ever harder, longer, and faster is at the heart of the American Dream: the idea, which has become more mythology than reality in a country with yawning income inequality and stagnating upward economic mobility, that if an American works hard enough she can attain her every desire. And we really try: We put in between 30 and 90 minutes more each day than the typical European. We work 400 hours more annually than the high-output Germans and clock more office time than even the work-obsessed Japanese.

The story of individual hard work is embedded into the very founding of our country, from the supposedly self-made, entrepreneurial Founding Fathers to the pioneers who plotted the United States’ western expansion; little do we acknowledge that the riches of this country were built on the backs of African slaves, many owned by the Founding Fathers themselves, whose descendants live under oppressive policies that continue to leave them with lower incomes and overall wealth and in greater poverty. We — the “we” who write the history books — would rather tell ourselves that the people who shaped our country did it through their own hard work and not by standing on the shoulders, or stepping on the necks, of others. It’s an easier story to live with. It’s one where the people with power and money have it because they deserve it, not because they took it, and where we each have an equal shot at doing the same.

Because for all our national pride in our puritanical work ethic, the ethic doesn’t apply evenly. At the highest income levels, wealthy Americans are making money passively, through investments and inheritances, and doing little of what most would consider “work.” Basic subsistence may soon be predicated on whether and how much a poor person works, while the rich count on tax credits and carve-outs designed to protect stockpiles of wealth created by money begetting itself. It’s the poor who are expected to work the hardest to prove that they are worthy of Americanness, or a helping hand, or humanity. At the same time, we idolize and imitate the rich. If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.

* * *

Trump has a long history of antipathy to the poor, a word which he uses as a synonym for “welfare,” which he understands only as a pejorative. When he and his father were sued by the Department of Justice in 1973 for discriminating against black tenants in their real estate business, he shot back that he was being forced to rent to “welfare recipients.” Nearly 40 years later, he called President Obama “our Welfare & Food Stamp President,” saying he “doesn’t believe in work.” He wrote in his 2011 book Time To Get Tough, “There’s nothing ‘compassionate’ about allowing welfare dependency to be passed from generation to generation.”

Perhaps. But Trump certainly knows about relying on things passed from generation to generation. His self-styled origin story is that he got his start with a “small” $1 million loan from his real estate tycoon father, Fred C. Trump, which he used to grow his own empire. “I built what I built myself,” he has claimed. “I did it by working long hours, and working hard and working smart.”

It’s an interesting interpretation of “myself”: A New York Times investigation in October reported that, instead, Trump has received at least $413 million from his father’s businesses over the course of his life. “By age 3, Mr. Trump was earning $200,000 a year in today’s dollars from his father’s empire. He was a millionaire by age 8. By the time he was 17, his father had given him part ownership of a 52-unit apartment building,” reporters David Barstow, Susanne Craig, and Russ Buettner wrote. “Soon after Mr. Trump graduated from college, he was receiving the equivalent of $1 million a year from his father. The money increased with the years, to more than $5 million annually in his 40s and 50s.” The Times found 295 different streams of revenue Fred created to enrich his son — loans that weren’t repaid, three trust funds, shares in partnerships, lump-sum gifts — much of it further inflated by reducing how much went to the government. Donald and his siblings helped their parents dodge taxes with sham corporations, improper deductions, and undervalued assets, helping evade levies on gifts and inheritances.


Even the money that was made squarely owed a debt to the government. Fred Trump nimbly rode the rising wave of federal spending on housing that began with the New Deal and continued with the G.I. Bill. “Fred Trump would become a millionaire many times over by making himself one of the nation’s largest recipients of cheap government-backed building loans,” the Times reported. Donald carried on this tradition of milking government subsidies to accumulate fortunes. He obtained at least $885 million in perfectly legal grants, subsidies, and tax breaks from New York to build his real estate business.

Someone could have taken this largesse and worked hard to grow it into something more, but Donald Trump was not that someone. Much of his fortune comes not from the down and dirty work of running businesses, but from slapping his name on everything from golf courses to steaks. Many of these deals entail merely licensing his name while a developer actually runs things. And as president, he still doesn’t seem inclined to clock much time doing actual work.

That hasn’t stopped him from putting work at the center of his administration’s poverty-related policies. In its lengthy tome, the White House Council of Economic Advisers argued for adding work requirements to a new universe of public benefits. These requirements, which until the Trump administration existed only for direct cash assistance and food stamps, require a recipient not just to put in a certain number of hours at a job or some other qualifying activity, but to amass paperwork to prove those hours each month. The CEA report is focused, supposedly, on “the importance and dignity of work.” But the benefits of engaging in labor are deemed important only for a particular population: “welfare recipients who society expects to work.” Over and over, it takes for granted that our country only expects the poorest to work in order to prove themselves worthy of government funds, specifically targeting those who get food stamps to feed their families, housing assistance to keep roofs over their heads, and Medicaid to stay healthy.

* * *

The report doesn’t just represent an ethos in the administration; it was also a justification for concrete actions it had already taken and more it would soon roll out. Last April, Trump signed an executive order that ordered federal agencies to review public assistance programs in order to see if they could impose work requirements unilaterally to “ensure that they are consistent with principles that are central to the American spirit — work, free enterprise, and safeguarding human and economic resources,” as the document states, while also “reserving public assistance programs for those who are truly in need.”

The administration has also pushed forward on its own. In 2017, it announced that states could apply for waivers that would allow them to implement work requirements in Medicaid for the first time, and so far more than a dozen states have taken it up on the offer, with Arkansas’s rule in effect since June 2018. (It has now been halted by a federal judge.) In that state, Medicaid recipients had to spend 80 hours a month at work, school, or volunteering, and report those activities to the government in order to keep getting health insurance. And in April 2018, Housing and Urban Development Secretary Ben Carson unveiled a proposal to let housing authorities implement work requirements for public housing residents and rental assistance recipients. Trump pushed Congress to include more stringent work requirements in the food stamp program as it debated the most recent farm bill, arguing it would “get America back to work.” When that effort failed, the Agriculture Department turned around and proposed a rule to impose the requirements by itself.

These aren’t fiscal necessities — they’re crackdowns on the poor, justified by the idea that they should prove themselves worthy of the benefits that help them survive, that are not just cruel but out of step with real life. Most people who turn to public programs already work, and those who don’t often have good reason. More than 60 percent of people on Medicaid are working. They remain on Medicaid because their pay isn’t enough to keep them out of poverty, and many of the low-wage jobs they work don’t offer health insurance they can afford. Of those not working, most either have a physical impairment or conflicting responsibilities like school or caregiving.

Enrollment in food stamps tells the same story. Among the “work-capable” adults on food stamps, about two thirds work at some point during the year, while 84 percent live in a household where someone works. But low-wage work is often chaotic and unpredictable. Recipients are more likely to turn to food stamps during a spell of unemployment or too few hours, then stop when they resume steadier employment. Many of those who are supposedly capable of work but don’t have a job have a health barrier or live with someone who has one; they’re in school, they’re caring for family, or they just can’t find work in their community.

Work requirements, then, fail to account for the reality of poor people’s lives. It’s not that there’s a widespread lack of work ethic among people who earn the least, but that there’s a lack of steady pay and consistent opportunities that allow someone to sustain herself and her family without assistance. We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.

If this country were so concerned about helping people who might face barriers to working get jobs, we might not be the second-lowest among OECD member countries by percentage of GDP spent on labor-market programs like job-search assistance or retraining. The poor in particular face barriers like a lack of affordable childcare and reliable transportation, and could use education or training to reach for better-paid, more meaningful work. But we do little to extend these supports. Instead, we chastise them for not pulling on their frayed bootstraps hard enough.

We also seem content with the notion that a person who doesn’t work — either out of inability or refusal — doesn’t deserve the building blocks of staying alive. The programs Trump is targeting, after all, are about basic needs: housing to stay safe from the elements, food to keep from going hungry, healthcare to receive treatment and avoid dying of neglect. Even if it were true that there was a horde of poor people refusing to work, do we want to condemn them to starvation and likely death? In one of the world’s richest countries, do we really balk at spending money on keeping our people — even lazy ones — alive?


Plenty of other countries don’t do so. Single mothers experience higher rates of destitution than coupled parents or people without children all over the world. But the higher poverty rate in the U.S. as compared to other developed countries isn’t because we have more single mothers; instead, it’s because we do so little to help them. Compare us to Denmark, which gives parents unconditional cash benefits for each of their children regardless of whether or how much they work, on top of generously subsidizing childcare, offering universal health coverage, and guaranteeing paid leave. It’s no coincidence that Denmark also has a lower poverty rate, both generally and for single mothers specifically. A recent examination of poverty across countries found that children are at higher risk in the U.S. because we have a sparse social safety net that’s so closely tied to demanding that people work. It makes us an international outlier, the world’s miser that only opens a clenched fist to the poor if they’re willing to demonstrate their worthiness first.

Here, too, America’s history of slavery and ongoing racism rears its head. According to a trio of renowned economists, we don’t have a European-style social safety net because “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” White people turn against funding public benefit programs when they feel their racial status threatened, particularly benefits they (falsely) believe mainly accrue to black people. The black poor are seen as the most undeserving of help and most in need of proving their worthiness to get it. States with larger percentages of black residents, for example, focus less on TANF’s goal of providing cash to the needy and have stingier benefits with higher hurdles to enrollment.

* * *

The CEA’s report on work requirements claimed that being an adult who doesn’t work is particularly prevalent among “those living in low-income households.” But that’s debatable. The more income someone has, the less likely he is to be getting it from wages. In 2012, those earning less than $25,000 a year made nearly three quarters of that money from a job. Those making more than $10 million, on the other hand, made about half of their money from capital gains — in other words, returns on investments. The bottom half of the country has, on average, just $826 in income from capital investments each; the average for those in the top 1 percent is more than $16 million.

The richest are the least likely to have their money come from hard labor — yet there’s no moral panic over whether they’re coddled or lacking in self-reliance. Instead, government benefits help the rich protect and grow idle wealth. Capital gains and dividends are taxed at a lower rate than regular salaried income. Inheritances were taxed at an average rate of 4 percent in 2009, compared to the average rate of 18 percent for money earned by working and saving. When investments are bequeathed, the recipient owes no taxes on any asset appreciation.


In fact, government tax benefits that increase people’s take-home money at the expense of what the government collects for its own coffers overwhelmingly benefit the rich over the poor (or even the middle class). More than 60 percent of the roughly $900 billion in annual tax expenditures goes to the richest 20 percent of American families. That figure dwarfs what the government expends on many public benefit programs. The government spends more than three times as much on tax subsidies for homeowners, mostly captured by the well-to-do, as it does on rental assistance for the poor. The three benefit programs the Trump administration is concerned with — Medicaid, food stamps, and housing assistance — come to about $705 billion in combined spending.

While the administration has been concerned with what it can do to compel the poor to work, it’s handed out more largesse to the idle rich. Its signature tax-cut package, the Tax Cuts and Jobs Act, offered an extra cut for so-called “pass-through” businesses, like law or real estate firms. But the fine print included a wrinkle: If someone is considered actively involved in his pass-through business, only 30 percent of his earnings could qualify for the new discount. If someone is passively involved, however — a shareholder who doesn’t do much of the day-to-day work of the company — then he gets 100 percent of the new benefit.

Then there’s the law’s significant lowering of the estate tax. The tax is levied on only the biggest, most valuable inheritances passed down from wealthy parent to newly wealthy child. Before the Republicans’ tax bill, only the richest 0.2 percent of estates had to pay the tax when fortunes changed hands. Now it’s just the richest 0.1 percent, or a mere 1,800 very wealthy families worth more than $22 million. The rest get to pass money to their heirs tax-free. Those who do pay it will be paying less when tax time comes due — $4.4 million less, to be exact.

Despite the Republican rhetoric that lowering the estate tax is about saving family farms, it’s really about allowing an aristocracy to calcify — one in which rich parents ensure their children are rich before they lift a single finger in work. As those heirs receive their fortunes, they also receive the blessing that comes with riches: the halo of success and, therefore, deservedness without having to work to prove it. Yet there’s evidence that increasing taxes on inheritances has the potentially salutary effect of getting heirs to work more. The more their inheritances are taxed, the more they end up paying in labor taxes — evidence that they’re working harder for their livings, not just coasting on generational wealth. Perhaps our tax code could encourage rich heirs to experience the dignity of work.

* * *

Trump’s CEA report is accurate about at least one thing: Our country has a history of only offering public benefits to the poor either deemed worthy through their work or exempt through old age or disability. An outlier was the Aid to Families with Dependent Children program, which became Temporary Assistance for Needy Families after Bill Clinton signed welfare reform into law in the ’90s. But the 1996 transformation of the program took what was a promise of cash for poor mothers and changed it into an obstacle course of proving a mother’s worth before she can get anywhere close to a check. It paved the way for the current administration’s obsession with work requirements.

Largesse for the rich, on the other hand, has rarely included such tests. No one has been made to pee in a cup for tax breaks on their mortgages, which cost as much as the food stamp program but overwhelmingly benefit families that earn more than $100,000. No one has had to prove a certain number of work hours to get a lower tax rate on investment income or an inheritance. They get that discount on their money without having to do any work at all.

We haven’t always been so extreme in our dichotomous treatment of the rich and poor; throughout the 1940s, ’50s, and ’60s, we coupled high marginal taxes on the wealthy with a minimum wage that ensured that people who put in full-time work could rise out of poverty. The estate tax has been as high as 77 percent. As Dutch historian Rutger Bregman recently told an audience of the ultrawealthy at Davos, we’re living proof that high taxes can spread shared prosperity. “The United States, that’s where it has actually worked, in the 1950s, during Republican President Eisenhower,” he pointed out. “This is not rocket science.” It was during the same era that we also created significant anti-poverty programs such as Social Security, Medicare, and Medicaid. In fact, this country pioneered the idea of progressive taxation and has always had some form of tax on inheritance to avoid creating an aristocracy. But we’ve papered over that history as tax rates have cratered and poverty has climbed.

Instead, as Reaganomics and neoliberal ideas took hold of our politics, we turned back to the Horatio Alger myth that success is attained on an individual basis by hard work alone, and that riches are the proof of a dogged drive. Lower tax rates naturally follow under the theory that the rich should keep more of their deserved bounty. And if you’re poor, coming to the government seeking a helping hand up, you failed.

The country is due for a reckoning with our obsession with work. There are certainly financial and emotional benefits that come from having a job. But why are we concerned only with whether the poor reap those benefits? Is working ourselves to the bone the best signifier of our worth — and are there basic elements of life that we should guarantee regardless of work? Such a reckoning doesn’t mean dropping all emphasis on work ethic. But it does require a deeper examination of who we expect to work — and why.

* * *

Bryce Covert is an independent journalist writing about the economy and a contributing op-ed writer at The New York Times.

Editor: Michelle Weber
Fact checker: Ethan Chiel
Copy editor: Jacob Z. Gross   

Other Rachel Lyons

Getty / Photo illustration by Katie Kosma

Rachel Lyon | Longreads | April 2019 | 23 minutes (5,849 words)

 

I signed up for Gmail in 2005, a month after graduating college and outgrowing my .edu address. Technically the service was still in beta testing. It was early enough that I could claim my entire name, beginning to end, no numbers or crazy characters. The simplicity of my “OG handle” speaks to its vintage. I have to admit I’m rather proud of it. It also means I get a lot of correspondence not actually meant for me. Since I joined Gmail, it has grown to more than 1.5 billion active users: 20 percent of the world’s population. Since I joined Gmail, the world’s population itself has increased by more than a billion! There are only so many words in the English language. There are only so many variations. Social media handles are stolen and sold like uranium on the black market. IP addresses are finite.

I am included on the timesheet of a Melbourne store, Boost Juice — scheduled to work the closing shift on March 24 — and on the agenda for the 64th annual general meeting of the Citizens Advice Bureau in a small town outside of London. World Vision UK writes to thank me for my “donation of 10” (ten what, I don’t know). Kid to Kid Utah thanks me, too, for a donation of $9.32 worth of used children’s items. I am notified that my job application to teach at a primary school in Leeds, UK, has been received. The school is rated 2.6 out of 5. One review reads: “Want your child to be bullied then send them there.”

One November I receive a note from Matt, who thinks he knows me from East High. “You Freshman Scum! A belated happy birthday this week. Hope all is going well.” (My birthday is in April, and no one would have called me “scum” when I was a high school freshman. I would have blushed. I might have cried.) December, I get a photo from Zoe — subject line: “SNOW,” body copy: “Happy Winter!” — of a courtyard, stone walls, and iron grate, blanketed in white. Adam sends me a photo, accompanied by no text at all, of three men in a lush, walled garden, one holding a Smart Water, the second holding a Starbucks cup, the third showing off three tickets to a Colts game. An American flag is stuck in a flowerpot.

Sophie writes to say how proud she is of my daughter, who “was such a sweet leader in the classroom today.” Marci tells me she signed up her son Cameron for the Abundant Life Garden Project, an after-school program at St. Philips Episcopal Church in Durham, NC, and she thinks my son Jack would have “a fabulous time” there, too. An automated message arrives from a public school in Cherryvale, KS, notifying me that my son Gary is failing English 11. His grade is 39%. What can you do with a kid like Gary? His future is looking bleak. I write to the school to let them know that the email address they’ve got on file for his mother, a different Rachel Lyon, is actually mine. They apologize and I don’t hear from them again — until the following year, when Marla writes to say she’s collecting pictures for a senior slideshow on graduation night, and will need photos of Gary no later than April 19. So Gary’s graduating after all! I’m glad he turned himself around.

One reason for all this misdirected correspondence is that there are at least a few hundred people around the world who share my name. According to the dizzying website howmanyofme.com, there are 186 Rachel Lyons, Rachael Lyons, Rachel Lyonses, and Rachael Lyonses in the United States. The consonant-rich website uknames.gbgplc.com puts the number at 45 people in the UK, including spelling variations. (Canada — not known for its big egos, really — doesn’t seem to have an equivalent site; a search for an equivalent Australian site yielded suggestions for the following “related searches”: how many Daniels are in the world? how many people are named Mitchell? how many people in the world are named Humphrey? Apparently Daniels, Mitchells, and Humphreys are peculiarly given to egosurfing.) We Rachel Lyons are a not insignificant population.

Another reason I get so very much email, I suspect, is that when people are prompted to enter their email addresses to get something they want — free samples; access to 30 days of unlimited whatever — but don’t want to get all the spam that comes with doing so, they enter something else. What’s an easier address to think up than one’s-own-name@gmail? Given the number of digital receipts I get for things I didn’t buy, I know many Rachel Lyons have put my address down to misdirect their spam. If you’re a Rachel Lyon and you’re reading this, please know: I am here, I am real, I am receiving your correspondence, and I don’t want your spam any more than you do.

I do, however, very much enjoy the non-spam correspondence. An email is a glimpse into another life, a fragment of a story. Maybe I love getting other people’s mail because I am a fiction writer. Maybe I’m a fiction writer because I love getting other people’s mail. Chicken or egg, I do not know. All I know is it gives me a little rush. I read my misdirected correspondence carefully. I read it nosily. I read it with a little voyeuristic thrill and odd, surprising pangs of envy. Rationally I know that to share a name with someone is a simple, random thing. Irrationally I can’t help but feel connected to the other Rachel Lyons of the world.

Read more…

The Unreliable Reader

Aditya Chinchure / Unsplash, Photo illustration by Katie Kosma

Wei Tchou | Longreads | April 2019 | 11 minutes (2,983 words)

“I write this while experiencing a strain of psychosis known as Cotard’s delusion, in which the patient believes that they are dead,” the novelist Esmé Weijun Wang writes at the beginning of “Perdition Days,” an essay from her new book, The Collected Schizophrenias. (Read an excerpt on Longreads.) “What the writer’s confused state means is not beside the point, because it is the point,” she continues. “I am in here, somewhere: cogito ergo sum.” The passage moves swiftly, from first-person agency (“I write”) to distanced third person (“the patient,” “the writer”) to the famous Descartes assertion, in Latin, “I think, therefore I am.” For a reader, it’s astonishing and a little unnerving to consider the immediacy of the prose, your intimacy with a speaker searching to find the correct vantage from which to narrate the strangely drawn, difficult-to-map districts of her mind.

That same authorial compulsion to navigate and survey pervades the book, which is notable for its subject matter alone: a first-person investigation of “the schizophrenias,” as Wang describes the four overlapping classifications of the mental disorder listed by the Diagnostic and Statistical Manual of Mental Disorders, 5th Edition, often shortened to DSM-5. (Wang was diagnosed with schizoaffective disorder, bipolar type, in 2013.) Wang approaches the work of writing about her mental illness as if she were reporting from a foreign place, returning to it diligently, pursuing dark corners as if to case the joint. She publishes email correspondences between herself and her physician, written in a period of psychosis. She considers her desire for motherhood through the lens of her time as a counselor at Camp Wish, a bipolar youth camp. She recalls scenes from her three involuntary hospitalizations, describing the trauma of those stays, as well as the slippery interviews on which those hospitalizations were based. Read more…

When Zora and Langston Took a Road Trip

Library of Congress / Corbis Historical / Getty, Michael Ochs Archives / Getty

Yuval Taylor | An excerpt from Zora and Langston: A Story of Friendship and Betrayal | W. W. Norton & Company | March 2019 | 30 minutes (8,692 words)

 

Ornate and imposing, the century-old Gulf, Mobile and Ohio Passenger Terminal in downtown Mobile, Alabama, resembles a cross between a Venetian palace and a Spanish mission. Here, on St. Joseph Street, on July 23, 1927, one of the more fortuitous meetings in American literary history occurred, a chance incident that would seal the friendship of two of its most influential writers. “No sooner had I got off the train” from New Orleans, Langston wrote in The Big Sea, “than I ran into Zora Neale Hurston, walking intently down the main street. I didn’t know she was in the South [actually, he did, having received a letter from her in March, but he had no idea she was in Alabama], and she didn’t know I was either, so we were very glad to see each other.”

Zora was in town to interview Cudjo Lewis, purportedly the only person still living who had been born in Africa and enslaved in the United States. She then planned to drive back to New York, doing folklore research along the way. In late 1926, Franz Boas had recommended her to Carter Woodson, whose Association for the Study of Negro Life and History, together with Elsie Clews Parsons of the American Folklore Society, had decided to bankroll her to the tune of $1,400. With these funds, Zora had been gathering folklore in Florida all spring and summer. As the first Southern black to do this, her project was, even at this early stage, clearly of immense importance. It had, however, been frustrating. “I knew where the material was, all right,” she would later write. “But I went about asking, in carefully accented Barnardese, ‘Pardon me, but do you know any folk-tales or folk-songs?’ The men and women who had whole treasuries of material just seeping through their pores, looked at me and shook their heads. No, they had never heard of anything like that around there. Maybe it was over in the next county. Why didn’t I try over there?”

Langston, meanwhile, had been touring the South for months, penniless as usual, making some public appearances and doing his own research. He read his poems at commencement for Nashville’s Fisk University in June; he visited refugees from the Mississippi flood in Baton Rouge; he strolled the streets alone in New Orleans, ducking into voodoo shops; he took a United Fruit boat to Havana and back; and his next stop was to be the Tuskegee Institute in Alabama. It was his very first visit to the South.

When Zora invited him to join her expedition in her little old Nash coupe, nicknamed “Sassy Susie,” Langston happily accepted. (The car looked a lot like a Model T Ford, and could only seat two.) Langston adored the company of entertainers, and Zora was as entertaining as they came. Langston did not know how to drive, but Zora loved driving and didn’t mind a whit. They decided to make a real trip of it, “stopping on the way to pick up folk-songs, conjur [sic], and big old lies,” as Langston wrote. “Blind guitar players, conjur men, and former slaves were her quarry, small town jooks and plantation churches, her haunts. I knew it would be fun traveling with her. It was.” Read more…