How Many Bones Would You Break to Get Laid?

Longreads Pick

‘“Incels” are going under the knife to reshape their faces, and their dating prospects.’ What they’re discovering after the swelling goes down is that the work they need to get done is on the inside: no plastic surgery can fix a poor self-image or a skewed worldview that insists life’s problems and roadblocks will magically evaporate with a surgically enhanced jawline.

Source: The Cut
Published: May 28, 2019
Length: 26 minutes (6,644 words)

Total Depravity: The Origins of the Drug Epidemic in Appalachia Laid Bare

Getty / Black Inc. Books

Richard Cooke | Excerpt from Tired of Winning: A Chronicle of American Decline | Black Inc. Books | May 2019 | 21 minutes (5,527 words)

They shall take up serpents; and if they drink any deadly thing it shall not hurt them; they shall lay hands on the sick, and they shall recover.

Mark 16:18

One night John Stephen Toler dreamed that the Lord had placed him high on a cliff, overlooking a forest-filled valley. He had this vision while living in Man, West Virginia, where some of the townsfolk thought he was a hell-bound abomination; he countered that God works in different ways. The mountains were where he sought sanctuary, so he felt no fear; but as he watched, all the trees he could see were consumed by wildfire. It was incredible, he said, to see ‘how quick it was devoured’, and the meaning of the parable was clear. The forest was Man and the fire was drugs, and when the drugs came to Man, that was exactly how it happened – it was devoured ‘so fast, that you didn’t even see it coming’, he said. We were in Huntington, West Virginia, and by now John Stephen Toler was in recovery.

Becoming Family

Illustration by Tom Peake

Jennifer Berney | Longreads | May 2019 | 16 minutes (4,486 words)

“He’s really cute,” my partner Kellie whispered to me, moments after our first son arrived. He had a head of black hair and a pug nose. His eyes were alarmingly bright. Kellie rested one hand on the top of his head as he lay across my chest. “So cute,” she said.

Her declaration meant something to me. Because the baby wasn’t of her body, because he was of my egg and my womb and a donor’s sperm, I’d been haunted by the worry that she’d struggle to claim him as hers — that he’d seem to her like a foreign entity, like someone else’s newborn, red-faced and squirming.

Hours later, in the middle of the night, a nurse came into our room, tapped Kellie on the shoulder, and asked her to bring our newborn to the lab for a routine test. Kellie cradled the baby as the nurse poked his heel with a needle and squeezed drops of his blood onto a test card. Our baby, who was still nameless, wailed and shook. In that moment, she tells me, she was overwhelmed by biology, by the physical need to protect a tiny life.

* * *

In the fourth century BCE, Aristotle proposed a theory of reproduction that would persist for thousands of years. It’s a theory that, while scientifically inaccurate, still informs our cultural thinking about parenthood.

According to Aristotle, the man, via intercourse, planted his seed in the woman’s womb. The woman’s menstrual blood nourished that seed and allowed it to grow. She provided the habitat, he supplied the content. The resulting child was the product of the father, nourished by the mother.

When it came to parenthood, the woman’s essential role was to nurture what the man had planted within her. To father was simply to provide the material — a momentary job. Fathering was ejaculating. But mothering was nurturing. This job was ongoing, never-ending. Her care began at the moment of conception and continued into adulthood and beyond.

* * *

When Kellie and I came home from the hospital with our newborn, our house felt strangely quiet and bare. In the days preceding delivery, Kellie had cleaned and organized as a way of getting ready for the baby, and our house was now unusually tidy. We sat on the couch with our sleeping baby and admired him. We smoothed his hair so that it crested at the center of his forehead, Napoleon style, then we smoothed it to the side. We said his name — West — over and over, trying to teach ourselves the word for this new being. Every so often he twitched. I had the sense that our world was about to transform, that the quiet of the first newborn days was temporary.

In the days that followed, I roamed the house in mismatched pajamas and snacked on casseroles that friends had brought over. I nursed the baby and rocked the baby and watched the baby while he slept. Meanwhile, Kellie, wearing her daily uniform of work pants and a worn-out T-shirt, built walls around our back porch to create a mudroom for our house. In the months leading up to our baby’s birth, we’d agreed that our dogs would need such a room, a place set away from a baby who would one day be crawling and drooling and grabbing, and so we called Jesse — a carpenter acquaintance whom we had once, long ago, asked to be our donor, and who had considered it for two months before turning us down. He wasn’t game to donate sperm, but he was game to bang out some walls. All day, I heard Kellie and Jesse’s hammering and muffled conversation.

In this way we entered parenthood. I was the full-time nurser and the guardian of sleep; Kellie was the builder, the house-maintainer. At night, the baby slept between us.

* * *

The idea that paternity is primarily a genetic contribution, that a father’s role is simply to provide the seed, is a very stubborn one. An absent father is still considered a father. When we use father as a verb, we usually mean the physical act of conception, while to mother more often describes the act of tending to. When a father takes on some of the active parenting, when he drives the kids to school or makes them breakfast, we often refer to these acts as “helping,” as if he were doing tasks assigned to someone else. “He’s a good father,” I’ve heard people say, bemoaning a wife’s lack of gratitude. “He helps.”

“Who’s the dad?” is a question friends of friends ask at parties when they learn that my children have two mothers. It’s a question that distant relatives ask, eager for the inside scoop.

The idea that my son doesn’t have a dad, that it is indeed possible to not have a father, is a hard thing for people to wrap their minds around. They may understand the process of donor insemination, but still, they think, because conception requires sperm, every child must have a father. Even for me, it creates a kind of cognitive dissonance. When I say that my child has no father, I feel like I’m not telling the whole truth.

“Why doesn’t West have a father?” a wide-eyed boy asked me one day as he sat at a classroom table with West and three other first graders. I was helping them make illustrated pages, and somehow the topic of our family had come up. West looked at me anxiously.

“He has two moms,” I told the boy.

“But why?” he insisted.

Of the kids I knew in West’s class, one was being raised by grandparents and several more had stepparents or were being raised by a single mom. But I could see that our situation was the most confounding.

“That’s just the way our family works,” I said before rattling the crayon box and offering it around the table. The curious boy did not look satisfied, and West remained steady and silent.

* * *

* Some names have been changed to protect the privacy of individuals.

By the time our donor, Daniel*, met our baby, he and his wife Rebecca had a baby of their own and had resettled on the other side of the state. We met them at a pizza place on a weekday afternoon. It was spring in the Pacific Northwest and the sun glared on fresh puddles. They had come to town to visit family and see longtime friends who wanted to meet their new child. At the time, our relationships with one another were still undefined, and we counted more as friends than family.

I remember that meeting in fragments, like bits of color held up to the light: Trays of half-eaten pizza. Plastic cups filled with ice water. Rebecca holding her newborn, Wren, against her, a burp cloth draped over her shoulder. Wren’s bare baby feet and the creases in his chubby ankles. My own baby, old enough to crane his head, looking around with wide eyes and a two-tooth smile. All of us in constant motion — standing to rock the baby, sitting to feed the baby, slipping into the bathroom to change the baby’s wet diaper. We passed our babies from one parent to the other, then across the table. We lifted the babies, assessing their heft, then tried to meet their eyes so that we could bombard them with smiles.

I remember it this way: We were neither distant nor close, neither awkward nor easy. We’d all been remade by parenthood, and it was like we were meeting for the first time.

I had wondered before our meeting if West, at 6 months old, would connect to Daniel especially, if there really was some magic carried in their shared DNA, if our son would recognize him, cling to him, fall asleep against his chest. But he didn’t. West greeted Daniel with joyful curiosity, the same way he greeted any stranger, and then returned to my arms to nurse.

* * *

Several months later, Kellie and I drove six hours across the state — baby tucked in his infant car seat in the back — to meet Daniel and Rebecca again, in their new home.

The fog of new parenthood had lifted, and this time, the ease between us was instant. Rebecca and I each claimed a spot at her kitchen table, sat with coffee, and watched as our children chewed on toys and pulled themselves across the wood floors. Conversation between us was continuous. We found a rhythm of interrupting one thought with another, then picking up where we left off, all the while tending to our babies as needed — rising to lift and nurse them, to change a diaper on the floor, to pull a board book from a mouth. Time with Rebecca was a respite from the solitude and repetition of early motherhood, a dose of medicine I needed.

Kellie and Daniel found their places just as easily. They spent their time rewiring Daniel’s carpentry studio, or salvaging beams from a nearby teardown, or driving to the forest to cut up fallen trees for firewood. Each of them, I imagine, had experienced their own kind of solitude as they watched their partners devote themselves fully to another human, and they both, I imagine, felt relief in working side by side.

We became parallel, symbiotic. Two families on either side of the Cascade Mountains. Sometimes they traveled to us; other times we traveled to them. Our boys knew and remembered each other. They splashed each other in a steel trough in Daniel and Rebecca’s backyard, climbed trees that had grown sideways over the shore of Puget Sound, built forts together out of cardboard in our kitchen.

The beauty of our new extended family had little to do with anything we had asked for or planned. Two years earlier, a friend had suggested that Kellie and I ask Daniel to consider being our donor. We had met him only a handful of times, but we knew that we liked him. He was strong but soft-spoken, handsome but unassuming. We were nervous to ask him. We’d explored the prospect with several men already — with Jesse the carpenter, with a coworker, with other peripheral friends — but two ghosted, one said no, and another seemed to think that the resulting child would be his own. Daniel turned out to be different. When he and Rebecca showed up at our house to discuss the possibility, it seemed he was already clear. “What kind of involvement would you want?” he asked us. We had agreed only to stay in some kind of touch over the years, to not become strangers to one another.

And yet we wound up with something I’d never had and never would’ve thought to plan for. I grew up with cousins, but none my age. They were five years older, or 12 years older, or three years younger, or 20 years younger. They were also scattered far and wide across the country. My brother was seven years younger than me, and my half-siblings were so much older that they were almost like aunts and an uncle. So I found something deeply healing in having an extended family that was at once chosen, but also truly family, tied by blood.

Or was it even blood that tied us? In theory, we wanted to know Daniel forever because questions might arise about the DNA he’d shared with us. We might someday need to ask him about some rare disease or mental illness, to probe beyond the brief set of questions we’d asked over dinner that first night we talked. And then there was the way we’d been trained to see blood as a legitimizing factor, trained to understand that blood equals family. Like many queer families, Kellie and I, while challenging this notion, unconsciously embraced it. Daniel was blood-tied to our children and therefore he was kin.

But, even more than blood, it was fate that tied us. It was like that film cliché where one stranger saves another’s life and they are therefore bound to each other forever. Rebecca and Daniel had agreed to help us build a family, and their choice had a moral weight. Gratitude would forever bind me to them. The love that I felt for West contained a love for them. I couldn’t imagine it any other way.

So it made sense to me when, four years after we’d first shared a meal and talked about becoming family, three years after our sons were born, Rebecca called us to ask if we’d considered having another baby. We had.

“Do you guys want to get pregnant again?” Rebecca asked me that day on the phone. “Because, you know, we are.”

We went to visit them two weeks later and stayed in a motel two miles away. On our first morning, Kellie woke up before me and left in search of coffee. She came back with two paper cups filled with coffee, and also a small mason jar that held a quarter inch of semen. Later she showed me the text that Rebecca had sent: “Good morning! Donation is ready. Cum on over.”

Rebecca delivered a second son, Ryan, in November. I delivered a second son, Cedar, in January.

* * *

I am a gestational and biological parent. Kellie is an adoptive parent. We come to our roles differently.

That I gestated and breastfed my sons carries immediate, clear meaning for me. When they were babies, my smell, my voice, my touch meant sustenance. Kellie held them and bathed them and changed them, but she did not offer milk. In the middle of the night, it was my body they reached for. My role as gestational parent had immediate consequence: for the first three years, my children’s need for me was more urgent, more connected to their survival.

The other difference, the difference of biology, is far less clear. What does it mean to my family that Kellie shares no DNA with our children? Does it mean next to nothing? Or does it mean more than I want to admit?

In the 10 years that Kellie and I have raised children together, I’ve avoided asking her how she feels about being the adoptive parent. I’ve avoided it because I was afraid — afraid that she would confide that our children never fully felt like her own. I’ve been worried she might say that they felt more like small people she lived with and cared about, but that if our own relationship ended she wouldn’t know exactly where they fit.

In my own community of lesbians, there’s a legacy of loosely defined second parents. I know a number of women who conceived in the ’80s (back when artificial insemination was just beginning to be available to lesbians) and planned to be single parents. But then, during pregnancy or early in the child’s life, a partner entered the picture, stayed for a year or two, then left. The partner had no legal claim to the child, but in many cases continued to parent from a distance. I’ve spoken to some of their children — grown now — who have trouble defining their role with a single term. “She’s certainly my other parent,” one of them told me over beers, then went on to explain that the word Mom doesn’t feel right when her gestational mom “did every load of laundry, packed every lunch, and cooked every meal.”

“We had no blueprint,” she told me. “She was kind of like a weekend dad.”

Though Kellie is much more than a weekend dad, I’ve long worried about the ways in which her role as other-mother remains ambiguous and undefined.

“I feel like they’re mine” is the first thing Kellie told me when I finally summoned the nerve to ask her. But sometimes she worries that if I died, the world would not recognize her as a parent, and that our own kids might reject her. She feels secure in her own attachment, but the role the world assigns her is a tenuous one.

In her book Recreating Motherhood, Barbara Katz Rothman writes that the value our society places on genetic relationships is inherently patriarchal, tied to our initial false belief — based in Aristotle’s “flowerpot theory” — that men were the sole genetic contributors. Because the child was of the man, he belonged to the man. Once we recognized that mothers contribute half of the genetic material, we began to see mother and father as having equal claim to their child. Rothman asserts that this is still an inherently patriarchal position, one in which blood ties indicate a kind of ownership, and one in which the work of nurturance is not accounted for.

In our own contemporary culture, we may sometimes act as though we value nurture over nature. These days I see the truism “love is love” everywhere I turn — on signs, in social media, spoken aloud by celebrities and friends. The statement suggests that love alone is the element that legitimizes a couple or a family. Still, we track our ancestry and meet new genetic relatives — strangers whom we’ve been told are family — through services like 23andMe, and we marvel at the overlapping traits and mannerisms of close relatives raised apart from one another.

We’ve learned to be careful, when speaking of adoptees, to use terms like “birth mother” instead of “real mother,” acknowledging that genes and gestation are not the only things that make a parent real. And yet, when someone does say “real mother,” we know exactly what they mean.

“Kellie’s not your real mom,” a neighborhood kid once told Cedar, who stood there agape because he had not yet thought to wonder too hard about his origins. At the time, he already understood that his family was different. When other people asked about his father, he had learned to explain, “I have two moms.” But as far as I could tell, this was the first moment someone had invited him to wonder about the actual legitimacy of his family — its realness.

* * *

Rebecca and I are tied by blood tangentially, but not directly. Our children are blood-related. She and I are not. Still, she feels more like family than many of my actual blood relations. Rebecca’s sister and nieces feel like family too, though they are not tied to my family by heredity. We live in the same community, so when Rebecca and Daniel come to town we have large family get-togethers: picnics at parks and birthday celebrations at restaurants. Sometimes Rebecca’s mom joins us too. When we meet she always hugs me and says my first name sweetly. She knows what ties us, and so she feels tied to me too.

Meanwhile, Daniel’s family of origin is a mystery to me, for reasons of geographical distance and family culture. I see pictures of his relatives on Facebook and have to remind myself that his kin are also my children’s blood kin. My children’s faces may grow to bear resemblance to the faces I see in these photos: the long jawline, the aquiline nose. Or, pieces of these relatives’ histories may give clues to my own children’s futures — special talents and obsessions, illnesses and struggles. Even when I remind myself of this, it feels distant, hard to reach.

Kin: Your mother who birthed and nursed you, your father who bore witness to your childhood. Your grandmother who let you sleep beside her in the bed when you came to visit. Your aunt who drove you to her home for long weekends, where you lay alongside her golden retriever and looked at the forest through her windows.

Kin: The grandfather you never met who was a ne’er-do-well, whose legacy is a stack of letters and a rainbow painted on a barn. The uncle who joked around with you in childhood, but became distant as you got older. Your second cousin who discovered you online and now sends you a Christmas card every year.

Kin: Your brother who you speak to only a few times a year, but who you carry in your heart. Your aunt by marriage (then lost through divorce) who delighted you with her easy brand of sarcasm.

Kin: The cousins you’ve only met once or twice in a lifetime. When you see photos of them, some of them look like people you might easily know. Others look like strangers, like someone you might pass in a grocery store and immediately forget.

* * *

Kellie told me once that she hesitates when telling our kids about her family’s history. It’s not quite clear to her: Is her history their history, or is it something else? Long before she spoke this aloud to me, the same question hung in my mind. Does her history matter to our kids because it’s their mother’s history, or because it is, somehow, their own?

When I look at my own ancestral family photos, I seek clues to who I am, traces of a self that predate me. Are these connections real, I wonder, or are they lore? Why does ancestral connection hold a sense of magic? Why do I look so hard to find my reflection in blood kin, as if seeing myself in my ancestors will somehow legitimize me?

And yet it turns out that some of my ancestors are not related to me genetically any more than Kellie is genetically related to our sons. Over the course of generations, our genetic ties to individual ancestors dissolve. Geneticist Graham Coop writes that if you trace your genetic heritage, after seven generations “many of your ancestors make no major genetic contribution to you.” In other words, your cells carry no trace of their DNA. They are no longer your genetic relatives, and yet they are still, of course, your ancestors. “Genetics is not genealogy,” he writes.

What if, more than heredity, families are really a collection of stories, some of them spoken, some of them withheld? Kellie’s ancestors were pioneers. My boys spent the first years of their lives in a house that her grandfather and great-grandfather built together. Kellie spends most of her free time splitting wood, building fences and sheds, capturing bee swarms. Cedar can now spot a swarm from a great distance. West is learning to measure wood and use a chop saw. They may one day raise their own families on the same land they grew up on. They may add new walls, new buildings, new fixtures. They do not require Kellie’s genes to carry on her legacy.

* * *

Four years after West was born, he asked me where he came from. It was a bright summer day and his brother — a baby then — was on a walk with Kellie, strapped against her chest. We were staying at a ranch in Colorado and the land was expansive: trails that went over bare hills and into forests, rocks and brush under wide blue sky. That afternoon West and I were inside our dark cabin, with light streaming through the windows and making patches on the floor.

I asked if he wanted to know who his donor was. “Do you want to guess?” I asked him. I was curious to see if he already had a sense.

“JoAnn?” he said, referring to a close family friend.

“The person who helped us is a man,” I said.

“Oh right,” he said. He thought and guessed some more, until I finally told him.

“It’s Daniel,” I said. “Wren’s dad.”

I watched him closely to see how he’d respond, but I detected neither joy nor surprise nor disappointment.

“Did Daniel help make Cedar too?”

“Yes,” I said.

He smiled. It didn’t surprise me that this was the thing that mattered to him — that he and his brother had the same origin story, that he wasn’t alone in the world.

* * *

We tend to understand our DNA as a simple blueprint for who we are and what we might become. We see experience as the tool that can push a person toward or away from their full potential, yet we see the potential itself as innate and fixed.

But in truth DNA and experience interact with each other. The field of epigenetics tells us that genes are turned on and off by experience, that the food we consume, the air we breathe, and how we are nurtured help determine which genes are expressed and which ones are repressed. The expression of our DNA isn’t static. For instance, drinking green tea may help regulate the genes that suppress tumors. A sudden loss may trigger depression. And the amount of nurturing and physical contact a child receives in the early years may help determine whether or not he’ll suffer from anxiety as an adult. Researchers are currently investigating to what degree trauma in one person’s experience can cause a change in gene expression that is transmitted from one generation to the next. Experience might become a legacy carried in blood.

Frances Champagne, a psychologist and genetic researcher, writes that “tactile interaction,” physical contact between parent and child, “is so important for the developing brain.” Her research shows that “the quality of the early-life environment can change the activity of genes.”

When Kellie held our newborn sons against her chest, when she bounced them and rocked them until they slept, she was not simply soothing them in the moment. She was helping program their DNA, contributing to their genetic legacy. Parents, through the way they nurture, contribute to the child’s nature. There is no clear line between the two.

* * *

In her memoir on adoption, All You Can Ever Know, Nicole Chung discusses the concept of family lineage and writes that she has been “grafted” onto her adoptive parents’ family tree. The graft strikes me as an apt metaphor. The scion is not of the receiving tree, and yet it is nourished and sustained by the tree. In the process of grafting, the tree is changed. The scion is changed. Through a process called vascular connection, they become one body.

The rootstock does not automatically reject the scion. The human body does not automatically reject an embryo conceived with a donor egg and sperm. A baby is comforted by warm skin, a smell, a heartbeat. A body loves a body. The baby may care that the source is familiar, but not that the DNA matches his own.

When Kellie’s mother visits with us, she often compares our boys to other members of her family. “It’s funny how Cedar’s blonde just like Noah, and wild like him too,” she’ll say, or, “West’s eyes are that same shade of hazel your grandpa’s were.”

I used to think she was forgetting that our children are donor conceived, or maybe just being silly. Now I realize it’s the opposite. Kellie’s mother doesn’t forget. She knows. She’s claiming them: tying her family’s present, past, and future, like stringing lights around the branches of her family tree, affirming that we belong to one another.

Jennifer Berney’s essays have appeared in the New York Times, The Offing, Tin House, and Brevity. She is currently working on a memoir that examines the patriarchal roots of the fertility industry, and the ways that queer families have both engaged with and avoided it.

* * *

Editor: Cheri Lucas Rowlands
Copy editor: Jacob Gross

The Fraught Culture of Online Mourning

Illustration by Homestead

Rachel Vorona Cote | Longreads | May 21, 2019 | 15 minutes (3,975 words)


My mother died shortly after 4 a.m. in the pitch black of a November morning. By roughly 8:30 a.m. that day, the 29th, I had alerted my Twitter and Instagram followers, as well as my Facebook friends. I copied and pasted a few lines across the three platforms, words hastily cobbled together in something akin to a fugue state, accompanied by stray photos of my mother that I had saved on my phone — I had posted about her frequently as her condition worsened, particularly after she arrived at that grim point at which death became imminent death.

Reimagining Harper Lee’s Lost True Crime Novel: An Interview with Casey Cep

Ben Martin / HarperCollins

Adam Morgan  | Longreads | May 2019 | 14 minutes (3,793 words)


Four years ago, when the news broke that a second Harper Lee novel had been discovered more than fifty years after To Kill a Mockingbird, the literary world was shocked. Some readers were thrilled by the prospect of returning to the world of Scout, Atticus Finch, and Boo Radley. Others were concerned the 88-year-old Lee might have been pressured to publish an unfinished draft. But Casey Cep, an investigative reporter for the New Yorker and the New York Times, drove down to Alabama to get to the bottom of it. And what she found wasn’t a publishing conspiracy but another lost book, one Lee had worked on for more than a decade and never finished.

The book was called The Reverend. It would have been a true-crime novel like In Cold Blood (a book Lee helped Truman Capote research, write, and edit, despite his failure to give her any credit). The Reverend would have told the story of Willie Maxwell, a black preacher who murdered five members of his own family in the 1970s in order to collect life insurance money. It would have touched on voodoo, racial politics in post-industrial Alabama, and a courtroom set piece that rivaled To Kill a Mockingbird for drama. But Harper Lee never finished writing The Reverend, and now, thanks to Casey Cep, we know why.

Cep’s debut, Furious Hours: Murder, Fraud, and the Last Trial of Harper Lee, is fascinating, addictive, and unbearably suspenseful. Cep actually tells three concentric stories: the crimes of Willie Maxwell, the trials of his lawyer Tom Radney, and Harper Lee’s failed attempt to write about them. When I called Cep from “a Southern phone number” on an unseasonably hot spring afternoon, she initially thought I was one of her sources calling with “some bombshell thing they want to show me, far too late to help with the book.”


I’m Not Queer to Make Friends

Illustration by Eric Chow

Logan Scherer | Longreads | May 2019 | 12 minutes (3,274 words)

On a Sunday morning at a Chicago bowling alley, I schmoozed five strangers almost as desperate to manipulate people on TV as I was. Then I eviscerated them. After years of being too embarrassed to try out for Big Brother, I’d finally brought myself to attend an open-call audition. I was determined to play the social strategy game I’d followed religiously since 2005.

“I’ve only seen a few episodes here and there,” I said to the tall, gorgeous man and two normcore women standing next to me in line. “I saw an ad for this a few days ago and randomly decided to come. I have no idea what they’re going to ask us to do.”

I didn’t want them to know I had an encyclopedic knowledge of Big Brother and had done extensive research into how to manage reality TV casting call dynamics as an introvert, and that I’d been practicing this for six years. I wanted to seem harmless, to make them feel comfortable to tell me things about themselves.

Technology Is as Biased as Its Makers

"Patty Ramge appears dejected as she looks at her Ford Pinto." Bettmann / Getty

Lizzie O’Shea | an excerpt adapted from Future Histories: What Ada Lovelace, Tom Paine, and the Paris Commune Teach Us about Digital Technology | Verso | May 2019 | 30 minutes (8,211 words)

In the late spring of 1972, Lily Gray was driving her new Ford Pinto on a freeway in Los Angeles, and her thirteen-year-old neighbor, Richard Grimshaw, was in the passenger seat. The car stalled and was struck from behind at around 30 mph. The Pinto burst into flames, killing Gray and seriously injuring Grimshaw. He suffered permanent and disfiguring burns to his face and body, lost several fingers and required multiple surgeries.

Six years later, in Indiana, three teenaged girls died in a Ford Pinto that had been rammed from behind by a van. The body of the car reportedly collapsed “like an accordion,” trapping them inside. The fuel tank ruptured and ignited into a fireball.

Both incidents were the subject of legal proceedings, which now bookend the history of one of the greatest scandals in American consumer history. The claim, made in these cases and most famously in an exposé in Mother Jones by Mark Dowie in 1977, was that Ford had shown a callous disregard for the lives of its customers. The weakness in the design of the Pinto — which made it susceptible to fuel leaks and hence fires — was known to the company. So too were the potential solutions to the problem. These included a number of possible design alterations, one of which was the insertion of a plastic buffer between the bumper and the fuel tank that would have cost around a dollar. For a variety of reasons, related to costs and the absence of rigorous safety regulations, Ford mass-produced the Pinto without the buffer.

Most galling, Dowie documented through internal memos how at one point the company prepared a cost-benefit analysis of the design process. Burn injuries and burn deaths were assigned a price ($67,000 and $200,000 respectively), and these prices were measured against the costs of implementing various options that could have improved the safety of the Pinto. It turned out to be a monumental miscalculation, but, that aside, the morality of this approach was what captured the public’s attention. “Ford knows the Pinto is a firetrap,” Dowie wrote, “yet it has paid out millions to settle damage suits out of court, and it is prepared to spend millions more lobbying against safety standards.” Read more…

Critics: Endgame

Illustration by Homestead

Soraya Roberts | Longreads | May 2019 | 9 minutes (2,309 words)

It’s a strange feeling being a cultural critic at this point in history. It’s like standing on the deck of the Titanic, feeling it sink into the sea, hearing the orchestra play as they go down — then reviewing the show. Yes, it feels that stupid. And useless. And beside the point. But what if, I don’t know, embedded in that review, is a dissection of class hierarchy, of the fact that the players are playing because what else are you supposed to do when you come from the bottom deck? And what if the people left behind with them are galvanized by this knowledge? And what if, I don’t know, one of them does something about it, like stowing away their kids on a rich person’s boat? And what if someone is saved who might otherwise not have been? If art can save your soul, can’t writing about it do something similar?

The climate report, that metaphorical iceberg, hit in October. You know, the one that said we will all be royally screwed by 2040 unless we reduce carbon emissions to nothing. And then came news story after news story, like a stream of crime scene photos — submerged villages, starving animals, bleached reefs — again and again, wave after wave. It all coalesced into the moment David Attenborough — the man famous for narrating documentaries on the wonders of nature — started narrating the earth’s destruction. I heard about that scene in Our Planet, the one where the walruses start falling off the cliffs because there is no ice left to support them, and I couldn’t bring myself to watch it. Just like I couldn’t bring myself to read about the whales failing to reproduce and the millions of people being displaced. As a human being I didn’t know what to do, and as a cultural critic I was just as lost. So when Columbia Journalism Review and The Nation launched “Covering Climate Change: A New Playbook for a 1.5-Degree World,” along with a piece on how to get newsrooms to prioritize the environment, I got excited. Here is the answer, I thought. Finally.

But there was no answer for critics. I had to come up with one myself.

* * *

Four years ago, William S. Smith, soon to be the editor of Art in America, attended the Minneapolis-based conference “Superscript: Arts Journalism and Criticism in a Digital Age” and noticed the same strange feeling I mentioned. “The rousing moments when it appeared that artists could be tasked with emergency management and that critics could take on vested interests were, however, offset by a weird — and I would say mistaken — indulgence of powerlessness,” he wrote, recalling one speaker describing “criticism as the ‘appendix’ of the art world; it could easily be removed without damaging the overall system.” According to CJR, arts criticism has been expiring at a faster rate than newspapers themselves (is that even possible?). And when your job is devalued so steadily by the industry, it’s hard not to internalize. In these precarious circumstances, exercising any power, let alone taking it on, starts to feel Herculean.

Last week’s bloody battle — not that one — was only the latest reminder of critics’ growing insignificance. In response to several celebrities questioning their profession, beleaguered critics who might have proven they still matter by addressing larger, more urgent issues, instead made their critics’ point by making it all about themselves. First there was Saturday Night Live writer Michael Che denigrating Uproxx writer Steven Hyden on Instagram for critiquing Che’s Weekend Update partner Colin Jost. Then there was Lizzo tweeting that music reviewers should be “unemployed” after a mixed Pitchfork review. And finally, Ariana Grande calling out “all them blogs” after an E! host criticized Justin Bieber’s performance during her show. Various wounded critics responded in kind, complaining that people with so much more clout were using it to devalue them even more than they already have been. “It’s doubtful, for instance, that Lizzo or Grande would have received such blowback if they hadn’t invoked the specter of joblessness in a rapidly deteriorating industry,” wrote Alison Herman at The Ringer, adding, “They’re channeling a deeply troubling trend in how the public exaggerates media members’ power, just as that power — such as it is — has never been less secure.” 

That was the refrain of the weeklong collective wound-lick: “We’re just doing our jobs.” But it all came to a head when Olivia Munn attacked Go Fug Yourself, the fashion criti-comic blog she misconstrued as objectifying snark. “Red carpet fashion is a big business and an art form like any other, and as such there is room to critique it,” site owners Heather Cocks and Jessica Morgan responded, while a number of other critics seized the moment to redefine their own jobs, invoking the anti-media stance of the current administration to convey the gravity of misinterpreting their real function, which they idealized beyond reproach. At Vanity Fair, chief critic Richard Lawson wrote of his ilk offering “a vital counterbalance in whatever kind of cultural discourse we’re still able to have.” The Ringer’s Herman added that criticism includes “advocacy and the provision of context in addition to straightforward pans,” while Caroline Framke at Variety simply said, “Real critics want to move a conversation forward.” Wow, it almost makes you want to be one.

I understand the impulse to lean into idolatry in order to underscore the importance of criticism. Though it dates back as far as art itself, the modern conception of the critic finds its roots in 18th-century Europe, in underground socially aware critiques of newly arrived public art. U.K. artist James Bridle summed up this modern approach at “Superscript,” when he argued that the job of art is “to disrupt and complicate” society, adding, “I don’t see how criticism can function without making the same level of demands and responding to the same challenges as art itself — in a form of solidarity, but also for its own survival.” Despite this unifying objective, it’s important to be honest about what in actual practice passes for criticism these days (and not only in light of the time wasted by critics defending themselves). A lot of it — a lot — kowtows to fandom. And not just within individual reviews, but in terms of what is covered; “criticism” has largely become a publicity-fueled shill of the most high-profile popular culture. The positivity is so pervasive that the odd evisceration of a Bret Easton Ellis novel, for instance, becomes cause for communal rejoicing. An element of much of this polarized approach is an auteur-style analysis that treats each subject like a hermetically sealed objet d’art that has little interaction with the world.

The rare disruption these days tends to come from — you guessed it — writers of color, from K. Austin Collins turning a Green Book review into a meditation on the erasure of black history to Doreen St. Felix’s deconstruction of a National Geographic cover story into the erasure of a black future. This is criticism which does not just wrestle with the work, but also wrestles with the work within the world, parsing the way it reflects, feeds, fights — or none of the above — the various intersections of our circumstances. “For bold and original reviews that strove to put stage dramas within a real-world cultural context, particularly the shifting landscape of gender, sexuality and race,” the Pulitzer committee announced in awarding New Yorker theatre critic Hilton Als in 2017. A year later the prize for feature writing went to Rachel Kaadzi Ghansah, the one freelancer among the nominated staffers, for a GQ feature on Dylann Roof. Profiling everyone from Dave Chappelle to Missy Elliott, Ghansah situates popular culture within the present, the past, the personal, the political — everywhere, really. And this is what the best cultural criticism does. It takes the art and everything around it, and it reckons with all of that together.

But the discourse around art has not often included climate change, barring work which specifically addresses it. Following recent movements that have awoken the general populace to various systemic inequities, we have been slowly shifting toward an awareness of how those inequities inform contemporary popular culture. This has manifested in criticism with varying levels of success, from clunky references to Trump to more considered analyses of how historic disparity is reflected in the stories that are currently told. And while there has been an expansion in representation in the arts as a result, the underlying reality of these systemic shifts is that they don’t fundamentally affect the bottom line of those in power. There is a social acceptability to these adaptations, one which does not ask the 1 Percent to confront its very existence, ending up subsumed under it instead. A more threatening prospect would be reconsidering climate change, which would also involve reconsidering the economy — and the people who benefit from it the most.  

We are increasingly viewing extreme wealth not as success but as inequity — Disney’s billion-dollar opening weekend with Avengers: Endgame was undercut not only by critics who questioned lauding a company that is cannibalizing the entertainment industry, but by Bernie Sanders: “What would be truly heroic is if Disney used its profits from Avengers to pay all of its workers a middle class wage, instead of paying its CEO Bob Iger $65.6 million — over 1,400 times as much as the average worker at Disney makes.” More pertinent, however, is how environmentally sustainable these increasingly elaborate productions are. I am referring to not only literal productions, involving sets and shoots, but everything that goes into making and distributing any kind of art. (That includes publicity — what do you think the carbon footprint of BTS is?) In 2006, a report conducted by UCLA found that the film and television industries contributed more to air pollution in the region than almost all of the five other sectors studied. “From the environmental impact estimates, greenhouse gas emissions are clearly an area where the motion picture industry can be considered a significant contributor,” it stated, concluding, “it is clear that very few people in the industry are actively engaged with greenhouse gas emission reduction, or even with discussions of the issue.”

The same way identity politics has taken root in the critic’s psyche, informing the writing we do, so too must climate change. Establishing a sort of cultural carbon footprint will perhaps encourage outlets not to waste time hiring fans to write outdated consumer reviews that draw no traffic in Rotten Tomatoes times. Instead of distracting readers with generic takes, they might shift their focus to the specifics of, for instance, an environmental narrative, such as the one in the lame 2004 disaster movie The Day After Tomorrow, which has since proven itself to be (if nothing else) a useful illustration of how climate change can blow cold as well as hot. While Game of Thrones also claimed a climate-driven plot, one wonders whether, like the aforementioned Jake Gyllenhaal blockbuster, the production planted $200,000 worth of trees to offset the several thousand tons of carbon dioxide it emitted. If the planet is on our minds, perhaps we will also feature Greta Thunberg in glossy magazines instead of Bari Weiss or Kellyanne Conway. Last year, The New York Times’ chief film critic, A.O. Scott, who devoted an entire book to criticism, wrote, “No reader will agree with a critic all the time, and no critic requires obedience or assent from readers. What we do hope for is trust. We try to earn it through the quality of our writing and the clarity of our thought, and by telling the truth.” And the most salient truth of all right now is that there is no art if the world doesn’t exist.

* * *

I am aware that I’m on one of the upper decks of this sinking ship. I have a contract with Longreads, which puts me somewhere in the lower middle class (that may sound unimpressive, but writers have a low bar). Perhaps even better than that, I work for a publication for which page views are not the driving force, so I can write to importance rather than trends. I am aware, also, that a number of writers do not have this luxury, but misrepresenting themselves as the vanguards of criticism not only does them a disservice but also discredits the remaining thoughtful discourse around art. A number of critics, however, are positioned better than me. Yet they personalize the existential question into one that is merely about criticism when the real question is wider: It’s about criticism in the world.

I am not saying that climate change must be shoehorned into every article (though even a non sequitur would be better than nothing), but I am saying that just as identity politics is now a consideration when we write, our planet should be too. What I am asking for is simply a widening of perspective, beyond economics, beyond race, beyond all things human, toward a cultural carbon footprint, one which becomes part of the DNA of our critiques and determines what we choose to talk about and what we say when we do. After more than 60 years of doing virtually the same thing, even nonagenarian David Attenborough knew he had to change tack; it wasn’t enough just to show the loss of natural beauty, he had to point out how it affects us directly. As he told the International Monetary Fund last month: “We are in terrible, terrible trouble and the longer we wait to do something about it the worse it is going to get.” In Our Planet, Attenborough reminds us over and over that our survival depends on the earth’s. For criticism to survive, it must remind us just as readily.

* * *

Soraya Roberts is a culture columnist at Longreads.

The Man Who’s Going to Save Your Neighborhood Grocery Store

Illustration by Vinnie Neuberg

Joe Fassler | The New Food Economy & Longreads | April 2019 | 33 minutes (8,802 words)

This story is published in partnership with The New Food Economy, with reporting supported by the 11th Hour Food and Farming Fellowship at the University of California, Berkeley.  

In 2014, Rich Niemann, president and CEO of the Midwestern grocery company Niemann Foods, made the most important phone call of his career. He dialed the Los Angeles office of Shook Kelley, an architectural design firm, and admitted he saw no future in the traditional grocery business. He was ready to put aside a century of family knowledge, throw away all his assumptions, completely rethink his brand and strategy — whatever it would take to carry Niemann Foods deep into the 21st century.

“I need a last great hope strategy,” he told Kevin Kelley, the firm’s cofounder and principal. “I need a white knight.”

Part square-jawed cattle rancher, part folksy CEO, Niemann is the last person you’d expect to ask for a fresh start. He’s spent his whole life in the business, transforming the grocery chain his grandfather founded in 1917 into a regional powerhouse with more than 100 supermarkets and convenience stores across four states. In 2014, he was elected chair of the National Grocers Association. It’s probably fair to say no one alive knows how to run a grocery store better than Rich Niemann. Yet Niemann was no longer sure the future had a place for stores like his.

He was right to be worried. The traditional American supermarket is dying. It’s not just Amazon’s purchase of Whole Foods, an acquisition that trade publication Supermarket News says marked “a new era” for the grocery business — or the fact that Amazon hopes to launch a second new grocery chain in 2019, according to a recent report from The Wall Street Journal, with a potential plan to scale quickly by buying up floundering supermarkets. Even in plush times, grocery is a classic “red ocean” industry, highly undifferentiated and intensely competitive. (The term summons the image of a sea stained with the gore of countless skirmishes.) Now, the industry’s stodgy old playbook — “buy one, get one” sales, coupons in the weekly circular — is hurtling toward obsolescence. And with new ways to sell food ascendant, legacy grocers like Rich Niemann are failing to bring back the customers they once took for granted. You no longer need grocery stores to buy groceries.

Niemann hired Kelley in the context of this imminent doom. The assignment: to conceive, design, and build the grocery store of the future. Niemann was ready to entertain any idea and invest heavily. And for Kelley, a man who’s worked for decades honing his vision for what the grocery store should do and be, it was the opportunity of a lifetime: carte blanche to build the working model he’s long envisioned, one he believes can save the neighborhood supermarket from obscurity.

Kevin Kelley, illustration by Vinnie Neuberg

Rich Niemann, illustration by Vinnie Neuberg

The store that resulted, Harvest Market, opened in 2016. It’s south of downtown Champaign, Illinois, out by the car dealerships and strip malls; 58,000 square feet of floor space mostly housed inside a huge, high-ceilinged glass barn. Its bulk calls to mind both the arch of a hayloft and the heavenward jut of a church. But you could also say it’s shaped like an ark, because it’s meant to survive an apocalypse.

Harvest Market is the anti-Amazon. It’s designed to excel at what e-commerce can’t do: convene people over the mouth-watering appeal of prize ingredients and freshly prepared food. The proportion of groceries sold online is expected to swell over the next five or six years, but Harvest is a bet that behavioral psychology, spatial design, and narrative panache can get people excited about supermarkets again. Kelley isn’t asking grocers to be more like Jeff Bezos or Sam Walton. He’s not asking them to be ruthless, race-to-the-bottom merchants. In fact, he thinks that grocery stores can be something far greater than we ever imagined: a place where farmers and their urban customers can meet, a crucial link between the city and the country.

But first, if they’re going to survive, Kelley says, grocers need to start thinking like Alfred Hitchcock.

* * *

Kevin Kelley is an athletic-looking man in his mid-50s, with a piercing hazel gaze that radiates thoughtful intensity. In the morning, he often bikes two miles to Shook Kelley’s office in Hollywood — a rehabbed former film production studio on an unremarkable stretch of Melrose Avenue, nestled between Bogie’s Liquors and a driving school. Four nights a week, he visits a boxing gym to practice Muay Thai, a form of martial arts sometimes called “the art of eight limbs” for the way it combines fist, elbow, knee, and shin attacks. “Martial arts,” Kelley tells me, “are a framework for handling the unexpected.” That’s not so different from his main mission in life: He helps grocery stores develop frameworks for the unexpected, too.

You’ve never heard of him, but then it’s his job to be invisible. Kelley calls himself a supermarket ghostwriter: His contributions are felt more than seen, and the brands that hire him get all the credit. Countless Americans have interacted with his work in intimate ways, but will never know his name. Such is the thankless lot of the supermarket architect.

A film buff equally fascinated by advertising and the psychology of religion, Kelley has radical theories about how grocery stores should be built, theories that involve terms like “emotional opportunity,” “brain activity,” “climax,” and “mise-en-scène.” But before he can talk to grocers about those concepts, he has to convince them of something far more elemental: that their businesses face near-certain annihilation and must change fundamentally to avoid going extinct.

“It is the most daunting feeling when you go to a grocery store chain, and you meet with these starched-white-shirt executives,” Kelley tells me. “When we get a new job, we sit around this table; we do it twenty, thirty times a year. Old men, generally. Don’t love food, progressive food. Just love their old food like Archie Bunkers, essentially. You meet these people and then you tour their stores. Then I’ve got to go convince Archie Bunker that there’s something called emotions, that there are these ideas about branding and feeling. It is a crazy assignment. I can’t get them to forget that they’re no longer in a situation where they’ve got plenty of customers. That it’s do-or-die time now.”

Forget branding. Forget sales. Kelley’s main challenge is redirecting the attention of older male executives, scared of the future and yet stuck in their ways, to the things that really matter.

“I make my living convincing male skeptics of the power of emotions,” he says.

Human beings, it turns out, aren’t very good at avoiding large-scale disaster. As you read this, the climate is changing, thanks to the destructively planet-altering activities of our species. The past four years have been the hottest on record. If the trend continues — and virtually all experts agree it will — we’re likely to experience mass disruptions on a scale never before seen in human history. Drought will be epidemic. The ocean will acidify. Islands will be swallowed by the sea. People could be displaced by the millions, creating a new generation of climate refugees. And all because we didn’t move quickly enough when we still had time.

You know this already. But I bet you’re not doing much about it — not enough, at least, to help avert catastrophe. I’ll bet your approach looks a lot like mine: worry too much, accomplish too little. The sheer size of the problem is paralyzing. Vast, systemic challenges tend to short-circuit our primate brains. So we go on, as the grim future bears down.

Grocers, in their own workaday way, fall prey to the same inertia. They got used to an environment of relative stability. They don’t know how to prepare for an uncertain future. And they can’t force themselves to behave as if the good times are really going to go away — even if, deep down, they know it’s true.

In the 1980s, you could still visit almost any community in the U.S. and find a thriving supermarket. Typically, it would be a dynasty family grocery store, one that had been in business for a few generations. Larger markets usually had two or three players, small chains that sorted themselves out along socioeconomic lines: fancy, middlebrow, thrifty. Competition was slack and demand — this is the beautiful thing about selling food — never waned. For decades, times were good in the grocery business. Roads and schools were named after local supermarket moguls, who often chaired their local chambers of commerce. “When you have that much demand, and not much competition, nothing gets tested. Kind of like a country with a military that really doesn’t know whether their bullets work,” Kelley says. “They’d never really been in a dogfight.”

It’s hard to believe now, but there was not a single Walmart on the West Coast until 1990. That decade saw the birth of the “hypermarket” and the beginning of the end for traditional grocery stores — Walmarts, Costcos, and Kmarts became the first aggressive competition supermarkets ever really faced, luring customers in with the promise of one-stop shopping on everything from Discmen to watermelon.

The other bright red flag: Americans started cooking at home less and eating out more. In 2010, Americans dined out more than in for the first time on record, the culmination of a slow shift away from home cooking that had been going on since at least the 1960s. That trend is likely to continue. According to a 2017 report from the USDA’s Economic Research Service, millennials shop at food stores less than any other age group, spend less time preparing food, and are more likely to eat carry-out, delivery, or fast food even when they do eat at home. But even within the shrinking market for groceries, competition has stiffened. Retailers not known for selling food increasingly specialize in it, a phenomenon called “channel blurring”; today, pharmacies like CVS sell pantry staples and packaged foods, while 99-cent stores like Dollar General are a primary source of groceries for a growing number of Americans. Then there’s e-commerce. Though only about 3 percent of groceries are currently bought online, that figure could rocket to 20 percent by 2025. From subscription meal-kit services like Blue Apron to online markets like FreshDirect and Amazon Fresh, shopping for food has become an increasingly digital endeavor — one that sidesteps traditional grocery stores entirely.

A cursory glance might suggest grocery stores are in no immediate danger. According to the data analytics company Inmar, traditional supermarkets still have a 44.6 percent market share among brick-and-mortar food retailers. And though a spate of bankruptcies has recently hit the news, there are actually more grocery stores today than there were in 2005. Compared to many industries — internet service, for example — the grocery industry is still a diverse, highly varied ecosystem. Forty-three percent of grocery companies have fewer than four stores, according to a recent USDA report. These independent stores sold 11 percent of the nation’s groceries in 2015, a larger collective market share than successful chains like Albertson’s (4.5 percent), Publix (2.25 percent), and Whole Foods (1.2 percent).

But looking at this snapshot without context is misleading — a little like saying that the earth can’t be warming because it’s snowing outside. Not long ago, grocery stores sold the vast majority of the food that was prepared and eaten at home — about 90 percent in 1988, according to Inmar. Today, their market share has fallen by more than half, even as groceries represent a diminished proportion of overall food sold. Their slice of the pie is steadily shrinking, as is the pie itself.

By 2025, the thinking goes, most Americans will rarely enter a grocery store. That’s according to a report called “Surviving the Brave New World of Food Retailing,” published by the Coca-Cola Retailing Research Council — a think tank sponsored by the soft drink giant to help retailers prepare for major changes. The report describes a retail marketplace in the throes of massive change, where supermarkets as we know them are functionally obsolete. Disposables and nonperishables, from paper towels to laundry detergent and peanut butter, will replenish themselves automatically, thanks to smart-home sensors that reorder when supplies are low. Online recipes from publishers like Epicurious will sync directly to digital shopping carts operated by e-retailers like Amazon. Impulse buys and last-minute errands will be fulfilled via Instacart and whisked over in self-driving Ubers. In other words, food — for the most part — will be controlled by a small handful of powerful tech companies.

The Coca-Cola report, written in consultation with a handful of influential grocery executives, including Rich Niemann, acknowledges that the challenges are dire. To remain relevant, it concludes, supermarkets will need to become more like tech platforms: develop a “robust set of e-commerce capabilities,” take “a mobile-first approach,” and leverage “enhanced digital assets.” They’ll need infrastructure for “click and collect” purchasing, allowing customers to order online and pick up in a jiffy. They’ll want to establish a social media presence, as well as a “chatbot strategy.” In short, they’ll need to become Amazon, and they’ll need to do it all while competing with Walmart and its e-commerce platform on convenience and price.

That’s why Amazon’s acquisition of Whole Foods Market was terrifying to so many grocers, sending the stocks of national chains like Kroger tumbling: It represents a future they can’t really compete in. Since August 2017, Amazon has masterfully integrated e-commerce and physical shopping, creating a muscular hybrid that represents an existential threat to traditional grocery stores. The acquisition was partially a real estate play: Whole Foods stores with Prime lockers now act as a convenient pickup depot for Amazon goods. But Amazon’s also doing its best to make it too expensive and inconvenient for its Prime members, who pay $129 a year for free two-day shipping and a host of other perks, to shop anywhere else. Prime members receive additional 10 percent discounts on select goods at Whole Foods, and Amazon is rolling out home grocery delivery in select areas. With the Whole Foods acquisition, then, Amazon cornered two markets: the thrift-driven world of e-commerce and the pleasure-seeking universe of high-end grocery. Order dish soap and paper towels in bulk on Amazon, and pick them up at Whole Foods with your grass-fed steak.

Traditional grocers are now expected to offer the same combination of convenience, flexibility, selection, and value. They’re understandably terrified by this scenario, which would require fundamental, complex, and very expensive changes. And Kelley is terrified of it, too, though for a different reason: He simply thinks it won’t work. In his view, supermarkets will never beat Walmart and Amazon at what they do best. If they try to succeed by that strategy alone, they’ll fail. That prospect keeps Kelley up at night because it could mean a highly consolidated marketplace overseen by just a handful of players, one in stark contrast to the regional, highly varied food retail landscape America enjoyed throughout the 20th century.

“I’m afraid of what could happen if Walmart and Amazon and Lidl are running our food system, the players trying to get everything down to the lowest price possible,” he tells me. “What gives me hope is the upstarts who will do the opposite. Who aren’t going to sell convenience or efficiency, but fidelity.”

The approach Kelley’s suggesting still means completely overhauling everything, with no guarantee of success. It’s a strategy that’s decidedly low-tech, though it’s no less radical. It’s more about people than new platforms. It means making grocery shopping more like going to the movies.

* * *

Nobody grows up daydreaming about designing grocery stores, including Kelley. As a student at the University of North Carolina at Charlotte, he was just like every other architect-in-training: He wanted to be a figure like Frank Gehry, building celebrated skyscrapers and cultural centers. But he came to feel dissatisfied with the culture of his profession. In his view, architects coldly fixate on the aesthetics of buildings and aren’t concerned enough with the people inside.

“Architecture worships objects, and Capital-A architects are object makers,” Kelley tells me. “They aren’t trying to fix social issues. People and their experience and their perceptions and behaviors don’t matter to them. They don’t even really want people in their photographs—or if they have to, they’ll blur them out.” What interested Kelley most was how people would use his buildings, not how the structures would fit into the skyline. He wanted to shape spaces in ways that could actually affect our emotions and personalities, bringing out the better angels of our nature. To his surprise, no one had really quantified a set of rules for how environment could influence behavior. Wasn’t it strange that advertising agencies spent so much time thinking about the links between storytelling, emotions, and decision-making — while commercial spaces, the places where we actually go to buy, often had no design principle beyond brute utility?

“My ultimate goal was to create a truly multidisciplinary firm that was comprised of designers, social scientists and marketing types,” he says. “It was so unorthodox and so bizarrely new in terms of approach that everyone thought I was crazy.”

In 1992, when he was 28, Kelley cofounded Shook Kelley with the Charlotte, North Carolina–based architect and urban planner Terry Shook. Their idea was to offer a suite of services that bridged social science, branding, and design, a new field they called “perception management.” They were convinced space could be used to manage emotion, just the way cinema leads us through a guided sequence of feelings, and wanted to turn that abstract idea into actionable principles. While Shook focused on bigger, community-oriented spaces like downtown centers and malls, Kelley focused on the smaller, everyday commercial spaces overlooked by fancy architecture firms: dry cleaners, convenience stores, eateries, bars. One avant-garde restaurant Kelley designed in Charlotte, called Props, was an homage to the sitcom craze of the 1990s. It was built to look like a series of living rooms, based on the apartment scenes in shows like Seinfeld and Friends, and featured couches and easy chairs instead of dining tables to encourage guests to mingle during dinner.

The shift to grocery stores didn’t happen until a few years later, almost by accident. In the mid-’90s, Americans still spent about 55 percent of their food dollars on meals eaten at home — but that share was declining quickly enough to concern top corporate brass at Harris Teeter, a Charlotte, North Carolina–based grocery chain with stores throughout the Southeastern United States. (Today, Harris Teeter is owned by Kroger, the country’s second-largest seller of groceries behind Walmart.) Harris Teeter execs reached out to Shook Kelley. “We hear you’re good with design, and you’re good with food,” Kelley remembers Harris Teeter reps saying. “Maybe you could help us.”

At first, it was Terry Shook’s account. He rebuilt each section of the store into a distinct “scene” that reinforced the themes and aesthetics of the type of food it sold. The deli counter became a mocked-up urban delicatessen, complete with awning and neon sign. The produce section resembled a roadside farmstand. The dairy cases were corrugated steel silos, emblazoned with the logo of a local milk supplier. And he introduced full-service cafés, a novelty for grocery stores at the time, with chrome siding like a vintage diner. It was pioneering work, winning that year’s Outstanding Achievement Award from the International Interior Design Association — according to Kelley, it was the first time the prestigious award had ever been given to a grocery store.

Shook backed off of grocery stores after launching the new Harris Teeter, but the experience sparked Kelley’s lifelong fascination with grocery stores, which he realized were ideal proving grounds for his ideas about design and behavior. Supermarkets contain thousands of products, and consumers make dozens of decisions inside them — decisions about health, safety, family, and tradition that get to the core of who they are. He largely took over the Harris Teeter account and redesigned nearly 100 of the chain’s stores, work that would go on to influence the way the industry saw itself and ultimately change the way stores are built and navigated.

Since then, Kelley has worked to show grocery stores that they don’t have to worship at the altar of supply-side economics. He urges grocers to appeal instead to our humanity. Kelley asks them to think more imaginatively about their stores, using physical space to evoke nostalgia, delight our senses, and appeal to the parts of us motivated by something bigger and more generous than plain old thrift. Shopping, for him, is all about navigating our personal hopes and fears, and grocery stores will only succeed when they play to those emotions.

When it works, the results are dramatic. Between 2003 and 2007, Whole Foods hired Shook Kelley for brand strategy and store design, working with the firm throughout a crucial period of the chain’s development. The fear was that as Whole Foods grew, its image would become too diffuse, harder to differentiate from other health food stores; at the same time, the company wanted to attract more mainstream shoppers. Kelley’s team was tasked with finding new ways to telegraph the brand’s singular value. Their solution was a hierarchical system of signage that would streamline the store’s crowded field of competing health and wellness claims.

Kelley’s view is that most grocery stores are “addicted” to signage, cramming their spaces with so many pricing details, promotions, navigational signs, ads, and brand assets that it “functionally shuts down [the customer’s] ability to digest the information in front of them.”

Kelley’s team stipulated that Whole Foods could only have seven layers of information, which ranged from evocative signage 60 feet away to descriptive displays six feet from customers to promotional info just six inches from their hands. Everything else was “noise,” and jettisoned from the stores entirely. If you’ve ever shopped at Whole Foods, you probably recognize the way that the store’s particular brand of feel-good, hippie sanctimony seems to permeate your consciousness at every turn. Kelley helped invent that. The system he created for pilot stores in Princeton, New Jersey, and Louisville, Kentucky, was scaled throughout the chain and is still in use today, he says. (Whole Foods did not respond to requests for comment for this story.)

With a carefully delineated set of core values guiding its purchasing and brand, Whole Foods was ripe for the kind of visual overhaul Kelley specializes in. But most regional grocery chains have a different set of problems: They don’t really have values to telegraph in the first place. Shook Kelley’s approach is about getting buttoned-down grocers to reflect on their beliefs, tapping into deeper, more primal reasons for wanting to sell food.

* * *

Today, Kelley and his team have developed a playbook for clients, a finely tuned process to get shoppers to think in terms that go beyond bargain-hunting. It embraces what he calls “the theater of retail” and draws inspiration from an unlikely place: the emotionally laden visual language of cinema. His goal is to convince grocers to stop thinking like Willy Loman — like depressed, dejected salesmen forever peddling broken-down goods, fixated on the past and losing touch with the present. In order to survive, Kelley says, grocers can’t be satisfied with providing a place to complete a chore. They’ll need to direct an experience.

Today’s successful retail brands establish what Kelley calls a “brand realm,” or what screenwriters would call a story’s “setting.” We don’t usually think consciously about them, but realms subtly shape our attitude toward shopping the same way the foggy, noirishly lit streets in a Batman movie tell us something about Gotham City. Cracker Barrel is set in a nostalgic rural house. Urban Outfitters is set on a graffitied urban street. Tommy Bahama takes place on a resort island. It’s an open secret in the industry that Costco stores are hugely expensive to construct — they’re designed to resemble fantasy versions of real-life warehouses, and the appearance of thrift doesn’t come cheap. Some realms are even more specific and fanciful: Anthropologie is an enchanted attic, complete with enticing cupboards and drawers. Trader Joe’s is a crew of carefree, hippie traders shipping bulk goods across the sea. A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.

Kelley takes this a few steps further. The Shook Kelley team, which includes a cultural anthropologist with a Ph.D., begins by conducting interviews with executives, staff, and locals, looking for the storytelling hooks they call “emotional opportunities.” These can stem from core brand values, but often revolve around the most intense, place-specific feelings locals have about food. Then Kelley finds ways to place emotional opportunities inside a larger realm with an overarching narrative, helping retailers tell those stories — not with shelves of product, but through a series of affecting “scenes.”

In Alberta, Canada, Shook Kelley redesigned a small, regional grocery chain now called Freson Bros. Fresh Market. In interviews, the team discovered that meat-smoking is a beloved pastime there, so Shook Kelley built huge, in-store smokers at each new location — a scene called “Banj’s Smokehouse” — that crank out pound after pound of the province’s signature beef, as well as elk, deer, and other kinds of meat (customers can even BYO meat to be smoked in-house). Kelley also designed stylized root cellars in each produce section, a cooler, darker corner of each store that nods to the technique Albertans use to keep vegetables fresh. These elements aren’t just novel ways to taste, touch, and buy. They reference cultural set points, triggering memories and personal associations. Kelley uses these open, aisle-less spaces, which he calls “perceptual rooms,” to draw customers through an implied sequence of actions, tempting them towards a specific purchase.

Something magical happens when you engage customers this way. Behavior changes in visible, quantifiable ways. People move differently. They browse differently. And they buy differently. Rather than progressing in a linear fashion, the way a harried customer might shoot down an aisle — Kelley hates aisles, which he says encourage rushed, menial shopping — customers zig-zag, meander, revisit. These behaviors are a sign a customer is “experimenting,” engaging with curiosity and pleasure rather than just trying to complete a task. “If I was doing a case study presentation to you, I would show you exact conditions where we don’t change the product, the price, the service. We just change the environment and we’ll change the behavior,” Kelley tells me. “That always shocks retailers. They’re like ‘Holy cow.’ They don’t realize how much environment really affects behavior.”

In the mid-2000s, Nabisco approached Kelley’s firm, complaining that sales were down 16 percent in the cookie-and-cracker aisle. In response, Shook Kelley designed “Mom’s Kitchen,” which was piloted at Buehler’s, a 15-store chain in northern Ohio. Kelley took Nabisco’s products out of the center aisles entirely and installed them in a self-contained zone: a perceptual room built out to look like a nostalgic vision of suburban childhood, all wooden countertops, tile, and hanging copper pans. Shelves of Nabisco products from Ritz Crackers to Oreos lined the walls. Miniature packs of Animal Crackers waited in a large bowl; drawers opened to reveal boxes of Saltines. The finishing touch had nothing to do with Nabisco and everything to do with childhood associations: Kelley had the retailers install fridge cases filled with milk, backlit and glowing. Who wants to eat Oreos without a refreshing glass of milk to wash them down?

The store operators weren’t sold. They found it confusing and inconvenient to stock milk in two places at once. But from a sales perspective, the experiment was a smash. Sales of Nabisco products increased by as much as 32 percent, and the entire cookie-and-cracker segment experienced a halo effect, seeing double-digit jumps. Then, the unthinkable: The stores started selling out of milk. They simply couldn’t keep it on the shelves.

You’d think that the grocery stores would be thrilled, that it would have them scrambling to knock over their aisles of goods, building suites of perceptual rooms. Instead, they retreated. Nabisco’s parent company at the time, Kraft, was excited by the results and kicked the idea over to a higher-up corporate division, where it stalled. And Buehler’s, for its part, never did anything to capitalize on its success. When Nabisco took the “Mom’s Kitchen” displays down, Kelley says, the stores didn’t replace them.

Mom’s Kitchen, fully stocked. (Photo by Tim Buchman)

“We were always asking a different question: What is the problem you’re trying to solve through food?” Kelley says. “It’s not just a refueling exercise — instead, what is the social, emotional issue that food is solving for us? We started trying to work that into grocery. But we probably did it a little too early, because they weren’t afraid enough.”

Since then, Kelley has continued to build his case to unreceptive audiences of male executives with mixed success. He tells them that when customers experiment — when the process of sampling, engaging, interacting, and evaluating an array of options becomes a source of pleasure — they tend to take more time shopping. And that the more time customers spend in-store, the more they buy. In the industry, this all-important metric is called “dwell time.” Most retail experts agree that increasing dwell without increasing frustration (say, with long checkout times) will be key to the survival of brick-and-mortar retail. Estimates vary on how much dwell time increases sales; according to Davinder Jheeta, creative brand director of the British supermarket Simply Fresh, customers spent 1.3 percent more for every 1 percent increase in dwell time in 2015.

Another way to increase dwell time? Offer prepared foods. Delis, cafes, and in-store restaurants increase dwell time and facilitate pleasure while operating with much higher profit margins and recapturing some of the dining-out dollar that grocers are now losing. “I tell my clients, ‘In five years, you’re going to be in the restaurant business,” Kelley says, “or you’re going to be out of business.’”

Kelley’s job, then, is to use design in ways that get customers to linger, touch, taste, scrutinize, explore. The stakes are high, but the ambitions are startlingly low. Kelley often asks clients what he calls a provocative question: Rather than trying to bring in new customers, would it solve their problems if 20 percent of customers increased their basket size by just two dollars? The answer, he says, is typically an enthusiastic yes.

Just two more dollars per trip for every fifth customer — that’s what victory looks like. And failure? That looks like a food marketplace dominated by Walmart and Amazon, a world where the neighborhood supermarket is a thing of the past.

* * *

When Shook Kelley started working on Niemann’s account, things began the way they always did: looking for emotional opportunities. But the team was stumped. Niemann’s stores were clean and expertly run. There was nothing wrong with them. Niemann’s problem was that he had no obvious problem. There was no there there.

Many of the regionals Kelley works with have no obvious emotional hook; all they know is that they’ve sold groceries for a long time and would like to keep on selling them. When he asks clients what they believe in, they show him grainy black-and-white photos of the stores their parents and grandparents ran, but they can articulate little beyond the universal goal of self-perpetuation. So part of Shook Kelley’s specialty is locating the distinguishing spark in brands that do nothing especially well, which isn’t always easy. At Buehler’s Fresh Foods, the chain where “Mom’s Kitchen” was piloted, the store’s Shook Kelley–supplied emotional theme is “Harnessing the Power of Nice.”

Still, Niemann Foods was an especially challenging case. “We were like, ‘Is there any core asset here?’” Kelley told me. “And we were like, ‘No. You really don’t have anything.’”

What Kelley noticed most was how depressed Niemann seemed, how gloomy about the fate of grocery stores in general. Nothing excited him — with one exception. Niemann runs a cattle ranch, a family operation in northeast Missouri. “Whenever he talked about cattle and feed and antibiotics and meat qualities, his physical body would change. We’re like, ‘My god. This guy loves ranching.’ He only had three hundred cattle or something, but he had a thousand pounds of interest in it.”

Niemann’s farm now has about 600 cattle, though it’s still more hobby farm than full-time gig — but it ended up being a revelation. During an early phase of the process, someone brought up “So God Made a Farmer” — a speech radio host Paul Harvey gave at the 1978 Future Farmers of America Convention that had been used in an ad for Ram trucks in the previous year’s Super Bowl. It’s a short poem that imagines the eighth day of the biblical creation, where God looks down from paradise and realizes his new world needs a caretaker. What kind of credentials is God looking for? Someone “willing to get up before dawn, milk cows, work all day in the fields, milk cows again, eat supper and then go to town and stay past midnight at a meeting of the school board.” God needs “somebody willing to sit up all night with a newborn colt. And watch it die. Then dry his eyes and say, ‘Maybe next year.’” God needs “somebody strong enough to clear trees and heave bales, yet gentle enough to yean lambs and wean pigs and tend the pink-combed pullets, who will stop his mower for an hour to splint the broken leg of a meadow lark.” In other words, God needs a farmer.

Part denim psalm, part Whitmanesque catalogue, it’s a quintessential piece of Americana — hokey and humbling like a Norman Rockwell painting, and a bit behind the times (of course, the archetypal farmer is male). And when Kelley’s team played the crackling audio over the speakers in a conference room in Quincy, Illinois, something completely unexpected happened. Something that convinced Kelley that his client’s stores had an emotional core after all, one strong enough to provide the thematic backbone for a new approach to the grocery store.

Rich Niemann, the jaded supermarket elder statesman, broke down and wept.

* * *

I have never been a fan of shopping. Spending money stresses me out. I worry too much to enjoy it. So I wanted to see if a Kelley store could really be what he said it was, a meaningful experience, or if it would just feel fake and hokey. You know, like the movies. When I asked if there was one store I could visit to see his full design principles in action, he told me to go to Harvest, the most interesting store in America.

Champaign is two hours south of O’Hare by car. Crossing the vast landscape of unrelenting farmland, you appreciate the sheer scale of Illinois, how far the state’s lower half is from Chicago. It’s a college town, which comes with the usual trappings — progressive politics, cafes and bars, young people lugging backpacks with their earbuds in — but you forget that fast outside the city limits. In 2016, some townships in Champaign County voted for Donald Trump over Hillary Clinton by 50 points.

I was greeted in the parking lot by Gerry Kettler, Niemann Foods’ director of consumer affairs. Vintage John Deere tractors formed a caravan outside the store. The shopping cart vestibules were adorned with images of huge combines roving across fields of commodity crops. Outside the wide-mouthed entryway, local produce waited in picket-fence crates — in-season tomatoes from Johnstonville, sweet onions from Warrensburg.

And then we stepped inside.

Everywhere, sunlight poured in through the tall glass facade, illuminating a sequence of discrete, airy, and largely aisle-less zones. Kettler bounded around the store, pointing out displays with surprised joy on his face, as if he couldn’t believe his luck. The flowers by the door came from local growers like Delight Flower Farm and Illinois Willows. “Can’t keep this shit in stock,” he said. He made me hold an enormous jackfruit to admire its heft. The produce was beautiful, he was right, with more local options than I’d ever seen in a grocery store. The Warrensburg sweet corn was eye-poppingly cheap: two bucks a dozen. There were purple broccolini and clamshells filled with squash blossoms, a delicacy so temperamental that it’s rarely sold outside of farmers’ markets. Early on, the staff had to explain to some teenage cashiers what they were — they’d never seen squash blossoms before.

I started to sense the “realm” Harvest inhabits: a distinctly red-state brand of America, local food for fans of faith and the free market. It’s hunting gear. It’s Chevys. It’s people for whom commercial-scale pig barns bring back memories of home. Everywhere, Shook Kelley signage — a hierarchy of cues like what Kelley dreamed up for Whole Foods — drives the message home. A large, evocative sign on the far wall reads Pure Farm Flavor, buttressed by the silhouettes of livestock, so large it almost feels subliminal. Folksy slogans hang on the walls, sayings like FULL OF THE MILK OF HUMAN KINDNESS and THE CREAM ALWAYS RISES TO THE TOP.

Then there are the informational placards that point out suppliers and methods.

There were at least a half dozen varieties of small-batch honey, and pastured eggs went for $3.69. The liquor section included local selections, like whiskey distilled in DeKalb and a display with cutting boards made from local wood by Niemann Foods’ HR manager. “Turns out we had some talent in our backyard,” Kettler said. Niemann’s willingness to look right under his nose, sidestepping middlemen distributors to offer reasonably priced, local goods, is a hallmark of Harvest Market.

That shortened chain of custody is only possible because of Niemann and the lifetime of supply-side know-how he brings to the table. But finding ways to offer better, more affordable food has been a long-term goal of Kelley’s — one that strained his relationship with Whole Foods CEO John Mackey. As obsessed as Kelley is with appearances, he insists to me that his work must be grounded in something “real”: that grocery stores only succeed when they really try to make the world a better place through food. In his view, Whole Foods wasn’t doing enough to address its notoriously high prices — opening itself up to be undercut by cheaper competition, and missing a kind of ethical opportunity to make better food available to more people.

“When,” Kelley remembers asking, “did you start to mistake opulence for success?”

In Kelley’s telling, demand slackened so much during the Great Recession that it nearly led to Whole Foods’ downfall, a financial setback that the company never fully recovered from — and, one could argue, ultimately led to its acquisition. Harvest Market, for its part, has none of Whole Foods’ clean-label sanctimony. It takes an “all-of-the-above” approach: There’s local produce, but there are also Oreos and Doritos and Coca-Cola; at Thanksgiving, you can buy a pastured turkey from Triple S Farms or a 20-pound Butterball. But that strong emphasis on making local food more accessible and affordable makes it an interesting counterpart to Kelley’s former client.

The most Willy Wonka–esque touch is the hulking piece of dairy processing equipment in a glass room by the cheese case. It’s a commercial-scale butter churner — the first one ever, Kettler told me, to grace the inside of a grocery store.

“So this was a Shook Kelley idea,” he said. “We said yes, without knowing how much it would cost. And the costs just kept accelerating. But we’re thrilled. People love it.”

Harvest Market isn’t just a grocery store — it’s also a federally inspected dairy plant. The store buys sweet cream from a local dairy, which it churns into house-made butter, available for purchase by the brick and used throughout Harvest’s bakery and restaurant. The butter sells out as fast as they can make it. Unlike the grocers who objected to “Mom’s Kitchen,” the staff don’t seem to mind.

As I walked through the store, I couldn’t help wondering how impressed I really was. I found Harvest to be a beautiful example of a grocery store, no doubt, and a very unusual one. What was it that made me want to encounter something more outrageous, more radical, more theatrical and bizarre? I wanted animatronic puppets. I wanted fog machines.

I should have known better — Kelley had warned me that you can’t take the theater of retail too far without breaking the dream. He’d told me that he admires stores where “you’re just not even aware of the wonder of the scene, you’re just totally engrossed in it” — stores a universe away from the overwrought, hokey feel of Disneyland. But I had Amazon’s new stores in the back of my mind as a counterpoint, with all their cashierless bells and whistles, their ability to click and collect, their ability to test-drive Alexa and play a song or switch on a fan. I guess, deep down, I was wondering if something this subtle really could work.

“Here, this is Rich Niemann,” Kettler said, and I found myself face-to-face with Niemann himself. We shook hands and he asked if I’d ever been to Illinois before. Many times, I told him. My wife is from Chicago, so we’ve visited the city often.

He grinned at me.

“That’s not Illinois,” he said.

We walked to Harvest’s restaurant, a 40-person seating area plus an adjacent bar with a row of stools, which offers standards like burgers, salads, and flatbreads. There’s an additional 80-person seating area on the second-floor mezzanine, a simulated living room complete with couches and board games. Beyond that, they pointed out the brand-new wine bar — open, like the rest of the space, until midnight. There’s a cooking classroom by the corporate offices. Through the window, I saw a classroom full of children doing something to vegetables. Adult cooking classes run two or three nights every week, plus special events for schools and other groups.

For a summer weekday at noon in a grocery store, I was amazed at how many people were eating and working on laptops. One guy had his machine hooked up to a full-sized monitor he’d lugged up the stairs — he’d made a customized wooden piece that hooked into Harvest’s wrought-iron support beams to create a platform for his plus-size screen. He comes every day, like it’s his office. He’s a dwell-time dream.

We sat down, and Kettler insisted I eat the corn first, slathering it with the house-made butter and eating it while it was hot. He reminded me that it’s grown by the Maddoxes, a family in Warrensburg, about 50 miles west of Champaign.

The corn was good, but I wanted to ask Niemann if the grocery industry was really that bad, and he told me it was. I assumed he’d want to talk about Amazon and its acquisition of Whole Foods and the way e-commerce has changed the game. He acknowledged all that, but to my surprise he said the biggest factor is something else entirely — a massive shift happening in the world of consumer packaged goods, or CPGs.

For years, grocery stores never had to advertise, because the largest companies in the world — Procter & Gamble, Coca-Cola, Nestlé — did their advertising for them, just the way Nabisco helped finance “Mom’s Kitchen” to benefit the stores. People came to supermarkets to buy the foods they saw on TV. But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.

When their sales flag, grocery sales do too — and the once-bulletproof alliance between food brands and supermarkets is splitting. Over the past two years, the Grocery Manufacturers Association, an influential trade group representing the biggest food companies in the world, has been losing members. It began with Campbell’s Soup. Dean Foods, Mars, Tyson Foods, Unilever, the Hershey Company, the Kraft Heinz Company, and others followed. That profound betrayal was a rude awakening: CPG companies don’t need grocery stores. They have Amazon. They can sell directly through their websites. They can launch their own pop-ups.

It was only then that I realized how dire the predicament of grocery stores really is, and why Niemann was so frustrated when he first called Kevin Kelley. It’s one thing when you can’t sell as cheaply and conveniently as your competitors. But it’s another thing when no one wants what you’re selling.

Harvest doesn’t feel obviously futuristic in the way an Amazon store might. If I went there as a regular shopper and not as a journalist sniffing around for a story, I’m sure I’d find it to be a lovely and transporting way to buy food. But what’s going on behind the scenes is, frankly, unheard of.

Grocery stores have two ironclad rules. First, grocers set the prices, and farmers do what they can within those mandates. And second, everyone works with distributors who oversee the aggregation and transport of all goods. Harvest has traditional relationships with companies like Coca-Cola, but it breaks those rules with local farmers and foodmakers. Suppliers — from the locally milled wheat to the local produce to the Kilgus Farms sweet cream that goes into the churner — truck their products right to the back. By avoiding middlemen and their surcharges, Harvest is able to pay suppliers more directly and charge customers less. And it keeps costs low. You can still find $4.29 pints of Halo Top ice cream in the freezer, but the produce section features stunning bargains. When the Maddox family pulls up with its latest shipment of corn, people sometimes start buying it off the back of the truck in the parking lot. At the same time, suppliers get to set their own prices: Niemann’s suppliers tell him what they need to charge, and Niemann adds a standard margin and lets customers decide if they’re willing to pay. That’s a massive change, and it’s virtually unheard of in supermarkets.

If there’s a reason Harvest matters, it’s only partly because of the aesthetics. It’s mainly because the model of what a grocery store is has been tossed out and rebuilt. And why not? The world as Rich Niemann knows it is ending.

* * *

In 2017, just months after Harvest Market’s opening, Niemann won the Thomas K. Zaucha Entrepreneurial Excellence Award — the National Grocers Association’s top honor, given for “persistence, vision, and creative entrepreneurship.” That spring, Harvest was spotlighted in a “Store of the Month” cover feature in the influential trade magazine Progressive Grocer. Characteristically, the contributions of Kelley and his firm were not mentioned in the piece.

Niemann tells me his company is currently planning to open a second Harvest Market in Springfield, Illinois, about 90 minutes west of Champaign, in 2020. Without sharing specifics about profitability or sales numbers, he says the store was everything he’d hoped it would be on the metrics that matter most — year-over-year sales growth and customer engagement. His only complaint about the store has to do with parking. For years, Niemann has relied on the same golden ratio to determine the size of parking lot his stores need — a certain number of spots for every thousand dollars of expected sales. Harvest’s lot uses the same logic, and it’s nowhere near enough space.

“In any grocery store, the customer’s first objective is pantry fill — to take care of my needs as best I can on my budget,” Niemann says. “But we created a different atmosphere. These customers want to talk. They want to know. They want to experience. They want to taste. They’re there because it’s an adventure.”

They stay so much longer than expected that the parking lot sometimes struggles to fit all their cars at once. Unlike the Amazon stores that may soon be cropping up in a neighborhood near you — reportedly, the company is considering plans to open 3,000 of them by 2021 — it’s not about getting in and out quickly without interacting with another human being. At Harvest, you stay awhile. And that’s the point.

But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.

So far, Harvest’s success hasn’t made it any easier for Kelley, who still struggles to persuade clients to make fundamental changes. They’re still as scared as they’ve always been, clinging to the same old ideas. He tells them that, above all else, they need to develop a food philosophy — a reason why they do this in the first place, something that goes beyond mere nostalgia or the need to make money. They need to build something that means something, a store people return to not just to complete a task but because it somehow sustains them. For some, that’s too tall an order. “They go, ‘I’m not going to do that.’ I’m like, ‘Then what are you going to do?’ And they literally tell me: ‘I’m going to retire.’” It’s easier to cash out. Pass the buck, and consign the fate of the world to younger people with bolder dreams.

Does it even matter? The world existed before supermarkets, and it won’t end if they vanish. And in the ongoing story of American food, the 20th-century grocery store is no great hero. A&P — the once titanic chain, now itself defunct — was a great mechanizer, undercutting the countless smaller, local businesses that used to populate the landscape. More generally, the supermarket made it easier for Americans to distance ourselves from what we eat, shrouding food production behind a veil and letting us convince ourselves that price and convenience matter above all else. We let ourselves be satisfied with the appearance of abundance — even if great stacks of unblemished fruit contribute to waste and spoilage, even if the array of brightly colored packages are all owned by the same handful of multinational corporations.

But whatever springs up to replace grocery stores will have consequences, too, and the truth is that brick-and-mortar is not going away any time soon — far from it. Instead, the most powerful retailers in the world have realized that physical spaces have advantages they want to capitalize on. It’s not just that stores in residential neighborhoods work well as distribution depots, ones that help facilitate the home delivery of packages. And it’s not just that we can’t always be home to pick up the shipments we ordered when they arrive, so stores remain useful. The world’s biggest brands are now beginning to realize what Kelley has long argued: Physical stores are a way to capture attention, to subject customers to an experience, to influence the way they feel and think. What could be more useful? And what are Amazon’s proposed cashierless stores, but an illustration of Kelley’s argument? They take a brand thesis, a set of core values — that shopping should be quick and easy and highly mechanized — and seduce us with it, letting us feel the sweep and power of that vision as we pass with our goods through the doors without paying, flushed with the thrill a thief feels.

This is where new troubles start. Only a few companies in the world will be able to compete at Amazon’s scale — the scale where building 3,000 futuristic convenience stores in three years may be a realistic proposition. Unlike in the golden age of grocery, when different family-owned chains catered to different demographics, we’ll have only a handful of players. We’ll have companies that own the whole value chain, low to high. Amazon owns the e-commerce site where you can find almost anything in the world for the cheapest price. And for when you want to feel the heft of an heirloom tomato in your hand or sample some manchego before buying, there is Whole Foods. Online retail for thrift, in-person shopping for pleasure. Except one massive company now owns them both.

If this new landscape comes to dominate, we may find there are things we miss about the past. For all its problems, the grocery industry is at least decentralized, owned by no one dominant company and carved up into more players than you could ever count. It’s run by people who often live alongside the communities they serve and share their concerns. We might miss that competition, that community. They are small. They are nimble. They are independently, sometimes even cooperatively, owned. They employ people. And if they are scrappy, and ingenious, and willing to change, there’s no telling what they might do. It is not impossible that they could use their assets — financial resources, industry connections, prime real estate — to find new ways to supply what we all want most: to be happier, to be healthier, to feel more connected. To be better people. To do the right thing.

I want to believe that, anyway. That stores — at least in theory — could be about something bigger and better than mere commerce. The way Harvest seems to want to be, with some success. But I wonder if that’s just a fantasy, too: the dream that we can buy and sell our way to a better world, that it will take no more than that.

Which one is right?

I guess it depends on how you feel about the movies.

Maybe a film is just a diversion, a way to feel briefly better about our lives, the limitations and disappointments that define us, the things we cannot change. Most of us leave the theater, after all, and just go on being ourselves.

Still, maybe something else is possible. Maybe in the moment when the music swells, and our hearts beat faster, and we feel overcome by the beauty of an image — in the instant that we feel newly brave and noble, and ready to be different, braver versions of ourselves — we are who we really are.

* * *

Joe Fassler, The New Food Economy’s deputy editor, has covered the intersection of food, policy, technology, and culture for the magazine since 2015. His food reporting has twice been a finalist for the James Beard Foundation Award in Journalism. He’s also editor of Light the Dark: Writers on Creativity, Inspiration, and the Creative Process (Penguin, 2017), a book based on “By Heart,” his ongoing series of literary conversations for The Atlantic.

Editor: Michelle Weber
Fact checker: Matt Giles
Copy editor: Jacob Z. Gross

Notes on Citizenship

Getty / Collage by Katie Kosma

Nina Li Coomes | Longreads | April 2019 | 14 minutes (3,609 words)

A month after Donald Trump is inaugurated president, my mother visits me in Boston. I have lived in the city for only a month, and my apartment is furnished, but barely. During the day, while I sit in a windowless office, my mother drags a suitcase down snowy Commonwealth Avenue to TJ Maxx, where she fills the rolling bag with comforting objects: a teal ceramic pitcher; a wire kitchen cart; a swirling, blue-and-white rug. She makes at least three trips down the hill to the store and back again.

When she is not buying knickknacks, she scrubs my buckling apartment floors. She wrings a rag in warm water, palms it over the wood, her posture and form impeccable as usual. Though I’d beg her not to do this, her actions make sense. For the 20 years we have lived in the United States, my mother has made a ritual of scrubbing the floors of all of our homes. In our first American house, in the unwelcoming cornfields of Illinois, I would know that all was well if I came through the front door to see the warm gleam of freshly scrubbed wood. In my parents’ house in Chicago, if I ever walked across the kitchen in my shoes by accident or, more likely, in a careless hurry, guilt would course down my back, the memory of her hunched by the radiator busily scrubbing flooding my mind. After college, when I lived in New York, she visited me there and insisted on getting down on her hands and knees again, though my roommate had a dog who shed constant, ungrateful clouds of black fur, making a clean floor impossible. In each place we have lived, no matter where we are, my mother has labored over the floor to make it home.

* * *

I was born in Japan to a Japanese mother and a white American father. After my birth, my parents sent an application to the U.S. consulate for my American citizenship. The application included my Japanese birth certificate and an accompanying English translation, proof of their marriage in both languages, as well as proof of my father’s U.S. citizenship. My mother’s status as an ethnically Japanese national qualified me for Japanese citizenship upon birth. I have always been a dual citizen of both the United States and Japan.

As a child, I bragged about this status to my peers. I had two countries I could claim as my own, I would crow, two places to call home. My parents often chided me for this bragging, but my willful girl-self ignored them. Though my status as mixed race was most often confusing and other times painful, this was one place I found pride, a jolt of pleasure pulsing through my hands as I touched the spines of one blue and one red passport, both with my name emblazoned on the inside. At the customs kiosk in airports, I liked the momentary juggle my parents did, swapping out our U.S. passports for Japanese ones in Tokyo, and back again in Chicago. All of the coming and going resulted in my American passport looking like an absurdist travel log, appearing as if I left the country and came back a month later without ever entering another country. Though I was only ever just shuttling between the same two nations to visit one set of grandparents or another, childishly I imagined my dual citizenship as a secret mission, a doorway into which I could walk and disappear, existing in secret for a short while. Other times, my passports felt like a double-headed key, easing the pain of leaving one home with the improbable solution of arriving at a different one. My passports — their primary-colored bindings, their grainy texture and heavy pages, these were magical tokens of my childish belief in my double-belonging.

Dual citizenship is technically legal in Japan only until the age of 22, at which point an individual is required to make a “declaration of citizenship,” effectively asking dual citizens to give up their claim on at least one of their countries of origin. There are, of course, ways around this. There are an estimated 700,000 dual citizens past the age of 22 living in Japan, though this number is probably skewed by the unwillingness of illegal dual citizens to come forward regarding their legal status. Some dual citizens choose never to declare, trusting in the inefficiencies of a labyrinthine bureaucracy to forget about legal technicalities. Others make their declaration in remote locations far from metropolises like Tokyo or Osaka with the hope that less-urban officials will not take the time to ask for a renunciation of non-Japanese passports. Some, like me, renewed their passport on the eve of their 22nd birthday, effectively buying another four years to weigh the choice, hoping that laws might shift to allow for legally sustained dual citizenship.

* * *

In Japan, a person obtains citizenship not by birthplace but by blood: This is called jus sanguinis citizenship, or citizenship as defined by the “right of blood.” It does not matter if you are born in the country or out of it. You are only a citizen if you have at least one parent whose blood can be classified as Japanese. (There are some exceptions based on naturalization and statelessness.) Requiring Japanese blood as a tenet of citizenship implies that there is such a thing; that Japaneseness can be traced back to one, biologically determined race. In 2008, conservative lawmakers proposed that DNA testing become part of the process necessary to determine Japanese citizenship, suggesting that biological markers could identify Japanese blood over foreign blood. Though the proposal was ultimately thrown out on grounds of logistical and financial impossibility, it lays bare the use of Japanese citizenship to promote a Japanese ethnostate. Simply put, to Japan, an ideal citizen is someone who is 100 percent racially Japanese.

In the United States, people become citizens through a combination of jus sanguinis, “right of blood,” and jus soli, “right of soil.” If you are born within the boundaries of the United States of America, or born to a parent who is a U.S. citizen, you are granted U.S. citizenship. This idea is introduced in the 14th Amendment of the Constitution: “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside.” It is tempting to say that the U.S. is egalitarian, that it is not founded on ethnocentrism, but the citizenship clause of the 14th Amendment was written only as a result of the Civil War. It granted citizenship to Black Americans nearly a century after the nation’s founding and in many ways did so in name only.

Though Asian Americans were granted citizenship in 1898, the Chinese Exclusion Act of 1882 ensured that immigrant laborers were not given easily accessible avenues to permanent citizenship. By the same token, Supreme Court cases in the 1920s (Ozawa v. United States and United States v. Bhagat Singh Thind) established a further precedent barring Asians from naturalizing as citizens on account of their not being “free white persons.” The “free white persons” clause of naturalization in U.S. law was dissolved in 1952, but strict immigration quotas continued to be official policy until 1965. Before 1924, Native Americans were only considered citizens if they could be taxed, if they served in a war, married a white person, or disavowed their tribal allegiance. By the time the Indian Citizenship Act of 1924 passed, most had already followed these alternate paths to citizenship, and even then, states with large Native American populations refused to grant citizenship to their population for fear of the Native American vote. It took almost 25 years for the Indian Citizenship Act to be adopted by all 50 of the United States of America.

No matter the intention of our Founding Fathers or the text of the 14th Amendment, citizenship in the United States is complicated, fraught; at once given and taken away, fickle and traitorous, seemingly color-blind and yet in service to a majority of “free white persons.”

This duplicity isn’t unique to the United States or Japan. It is the nature of citizenship to uphold humanity while simultaneously denying it. For the Roman philosopher Cicero, one of the first to consider the idea of the citizen, this duality was best explained as a trade-off between citizen and state. In return for completing certain civic responsibilities (say, paying your taxes and following road signs), citizens are offered rights: protection from the state, the ability to claim nationality, and the like. Nearly two thousand years later, German-born American philosopher and writer Hannah Arendt echoed this same sentiment by famously calling citizenship “the right to have rights.” In her view, citizenship was a necessary vehicle to deliver human rights. Simply being human didn’t give you access to things like life and liberty. One needs a state to fulfill them. Taken backwards, this implies that without a government’s acknowledgement of citizenship, a person can be stripped of the rights inherent to their existence. In other words, if you’re not a citizen, you’re not fully a person.

* * *

At the end of my mother’s Boston visit, her busy homemaking and floor-scrubbing now at an end, I take her to a donut shop for breakfast. Inside, a Cambodian family slips rings of hot fried dough glazed in honey into paper envelopes, handing them to construction workers, police officers, and university students. Behind the counter, on the other side of the kitchen door, no English exists. Instead, Cambodian wafts, punctured by laughter and sighs, tossed by the woman pouring coffee with her hand balled at her hip, the smiling man behind the counter, the surly teenager bussing half-finished plates of buttery scrambled eggs. Above the cash register proud signs hang declaring the store a “Boston Favorite,” a “Chosen Community Partner,” and the recipient of numerous other local awards.

At our sticky table, I find myself unexpectedly moved. Passing by the donut shop on my daily commute, I assumed that the curly pink neon signage, a relic from the ’50s preserved on a triangular storefront, was surely the property of a white family. Instead what I found was a family of Southeast Asian immigrants, making a classic American food and serving it in their own fashion with aplomb. The donut shop seemed unconcerned with assimilation. Months later, I’d take my sister to the same donut shop and she’d say that she was confused. The decor inside made her feel like she should be eating some sort of noodles but instead she was eating a chocolate glazed cake donut.

As a rule, I am skeptical of the American Dream. I’m suspicious of what it sells and at what cost. What does it mean to believe in “life, liberty, and the pursuit of happiness” when the state reserves the right to take it away at a moment’s notice, to inter you and your family for looking like the enemy? What is freedom if it is a specific, circumscribed kind of freedom? A labored freedom? An unfair freedom? A tilted, volatile, violent freedom?

But at the donut shop, picking apart a vanilla-and-chocolate twist, I see a glimpse of what this country might offer: a promise of evolution, integrity, and acceptance. Perhaps this is what belonging in this country might mean, at its best: that something as classically American as a 1950s corner donut store could be taken over by a family of refugees from Southeast Asia without pomp or angst. That the store and the family that run it can exist without concerning themselves with assimilating to a white American standard, but instead remain rooted in their own traditions and languages. Sitting at the corner table with my mother, I feel as if happiness, freedom, equality, these are hard to come by and elusive. But change, the potential for newness and its embrace, these might yet flourish. These prospects feel solid, somehow, steady and unconditional, vivacious in comparison to the pale two-faced promise of a passport. A hint that perhaps making a home for oneself actually has nothing to do with the cold detachment of a customs official, and more to do with the warmth of feeding your kin on a cold morning.

* * *

Here is how I once passed through customs in Tokyo:

After 14 hours of sitting in an economy class seat, the overhead bin bumping precariously along to turbulence, sleep evasive and slippery, I am greasy and dry-eyed. Everything feels dreamlike. Time moves in stilted seconds, late afternoon sunlight pouring in through pristine panels of glass when my mind is clamoring that it ought to be night. Passengers are herded like badly behaved cattle along moving walkways, the robotic woman’s voice telling us to please watch our step. The path curves, and soon the windows are replaced by gray walls and fluorescent lights. I continue to trudge forward, dragging my stubbornly lagging suitcase. On the walls are signs advertising caution about various strains of influenza.

At customs, placards hang from the ceiling, directing the flight crew to the right, followed by foreigners and tourists, with Japanese nationals and permanent residents filing to the far left. I take my place in the line to the left, feeling at once indignant and like an imposter. An anxious, scrambling feeling chases its tail under my collarbone. As I approach the sunken booth, I try to sound as local as I can, hoping that the country bumpkin slur of my words will score me a point in the invisible tally of Japaneseness I imagine each customs official keeping. I answer questions about where I am staying, why I am here. Images of the kerosene stove in my grandmother’s front room, my grandfather’s furled fists, their unruly garden — these blossom in my mind, a talisman of home to hold tightly under my breath. Believe me, I pray, believe that I belong here. Inside my backpack, I can feel my other passport, my other citizenship, pulsating like a treacherous living thing.

* * *

It is not lost on me that the language of citizenship traffics in metaphors of life and death, but delivers only promises and rumors. We are given weighty, destiny-scaled ultimatums, discussions of blood and soil evoking images of birth and death, sustenance and longevity. Identification implies belonging, our membership to a country playing on notions of larger, state-bound families. The nation is our mother. The nation is our father. In giving us the gift of citizenship, it has labored to give us life and will lay us weeping in the ground.

But in delivery, citizenship becomes elusive and hard to pin down. It is promised to us with outstretched arms, then snatched away with ease. We are assured home and kinship; we arrive to find an empty house. We are drawn to the visage of a guardian — “Give me your tired, your poor, your huddled masses yearning to breathe free” — but we are greeted by a ghost.

* * *


After finishing our breakfast at the donut shop, my mother and I take a cab to Logan Airport so she can catch her flight home to Chicago. When we arrive, I help her check in and walk her to the TSA-cordoned security area. She waves me away at the mouth of the line, the oblong maze of tangled tape empty at this apparently unpopular time to fly. “Go,” she says. I shake my head, watching her hoist her navy canvas bag over one shoulder, taking mincing steps through the open line in front of her. This shooing-and-staying, like the floor-washing, is another one of our family’s traditions. Whenever one of us leaves their home, whether it is in Japan or the U.S., whoever they are leaving staunchly refuses to leave the side of the security line until they can no longer see them. This staying put is an act of loyalty, of love, of claiming each other as our own. We are stating that no border crossing, no officialdom, no distance or space can slice its way through our bonds.

That day I watch my mother’s small body turn even smaller in the distance, and I feel a familiar animal anxiety dig its claws into my chest. Earlier that week, crowds of people poured into U.S. airports, protesting Donald Trump’s travel ban. Scenes of lobbies filled with protesters flooded televisions, mouths moving in angry unison on muted screens. Reports of families separated at customs, of loved ones canceling plans to visit their relatives in the U.S., patients unable to access American hospitals — these are the stories that dominated the news cycle.

Suddenly, as if someone had passed a transparency over my eyes, I see the TSA agent taking a closer look at my mother’s green card. I imagine his voice, meaty and rough when raised. I imagine my mother’s English, flattening as frustration creeps into her voice. I imagine what I might do if someone emerged from the wings of the security booth to grab her by the arm, roughly escorting her to a private room. I imagine whether I would shout, run, or stay rooted to the spot. At least she would be OK in Japan, a small voice, at once guilty and relieved, says inside me.

My mother passes through the security checkpoint without incident. She waves from behind the metal detector, her hand cleaving a wide, swinging arc in the air. 

* * *

Citizenship comes into sharp relief at the most important junctures of life. Two years after my mother’s visit to Boston, my now-husband and I go to the Cook County Clerk’s office, in Chicago, to obtain our marriage license. We are presented with a list of appropriate documents to prove our citizenship — driver’s licenses, passports, birth certificates. Above us, a looming sign proclaims: COOK COUNTY CLERK | BIRTH MARRIAGE DEATH. Birth, marriage, death: To be acknowledged, all these require proof of belonging to a nation. Plunking down my own driver’s license, I wonder what one does without the proper identification. A man ahead of us in line is turned away for not having the correct paperwork to claim his infant daughter’s birth certificate. Without the necessary government-issued credentials, no matter how strange it seems, he cannot receive proof that his daughter now exists outside the womb. Without citizenship, could you be born? Without it, could you die?

My wondering is of course born of a certain kind of privilege. Undocumented and stateless people know exactly what it is like to live without citizenship. People dear to me have struggled for acknowledgement in the eyes of a mercurial state that grants and revokes rights with the turn of an administration. In many ways I am lucky to be presented with the conundrum of citizenship only after 22 years of holding two passports. I have had not one but two homes.

* * *

On my most recent trip home to Japan, this time to celebrate my new marriage with my family, I exited the plane groggy and barely awake. I followed the familiar corridor, the paneled light flickering, the woman’s voice telling us to mind the gap. Passengers plodded on, all of us filing forward to customs, noting the warnings for newer, more varied strains of flu. This time, I did not take the far left lane. Instead, I entered the country for the first time on a U.S. passport, my lapsed Japanese one tucked in my backpack, safely away from questions of allegiance, loyalty, and citizenship. A small part of me was relieved to filter through the droning line of tourists, no need to prove my worthiness of entry to a stony-faced official. A larger part of me wallowed in a shallow sadness, as if a pale premonition of grief, suspecting that this might be the first step toward exile.

Why do you speak Japanese so well? the man at customs barked, suspicious. Because my mother is Japanese, I answered, the image of her running a rag over my Boston floors, the homes she has created the world over for us, blurring my vision. Is this your only passport? he jabbed a finger at my solitary blue book. Yes, I smiled, three red booklets pulsing against my back.

* * *

Nina Li Coomes is a Japanese and American writer from Nagoya and Chicago. Her work can be found in The Atlantic, EATER, Catapult and elsewhere.

Editor: Danielle A. Jackson

Copy editor: Jacob Z. Gross