Soraya Roberts | Longreads | April 2019 | 6 minutes (1,674 words)
I didn’t do my homework last weekend. Here was the assignment: Beyoncé’s Homecoming — a concert movie with a live album tie-in — the biggest thing in culture that week, which I knew I was supposed to watch, not just as a critic, but as a human being. But I didn’t. Just like I didn’t watch the premiere of Game of Thrones the week before, or immediately listen to Lizzo’s Cuz I Love You. Instead, I watched something I wanted to: RuPaul’s Drag Race. What worse place is there to hide from the demands of pop culture than a show about drag queens, a set of performance artists whose vocabulary is almost entirely populated by celebrity references? In the third episode of the latest season, Vietnamese contestant Plastique Tiara is dragged for her uneven performance in a skit about Mariah Carey, and her response shocks the judges. “I only found out about pop culture about, like, three years ago,” she says. To a comically sober audience, she then drops the biggest bomb of all: “I found out about Beyoncé legit four years ago.” I think Michelle Visage’s jaw might still be on the floor.
“This is where you all could have worked together as a group to educate each other,” RuPaul explains. It is the perfect framing of popular culture right now — as a rolling curriculum for the general populace which determines whether you make the grade as an informed citizen or not. It is reminiscent of an actual educational philosophy from the 1930s, essentialism, which was later adopted by E.D. Hirsch, the man who coined the term “cultural literacy” as “the network of information that all competent readers possess.” Essentialist education emphasizes standardized common knowledge for the entire population, which privileges the larger culture over individual creativity. Essentialist pop culture does the same thing, flattening our imaginations until we are all tied together by little more than the same vocabulary.
***
The year 1987 was when Aretha Franklin became the first woman inducted into the Rock and Roll Hall of Fame, the Simpson family arrived on television (via The Tracey Ullman Show), and Mega Man was released on Nintendo. It was also the year Hirsch published Cultural Literacy: What Every American Needs to Know. None of those three pieces of history were in it (though People published a list for the pop-culturally literate in response). At the back of Hirsch’s book, hundreds of words and quotes delineated the things Americans need to know — “Mary Had a Little Lamb (text),” for instance — which would be expanded 15 years later into a sort of CliffsNotes version of an encyclopedia for literacy signaling. “Only by piling up specific, communally shared information can children learn to participate in complex cooperative activities with other members of their community,” Hirsch wrote. He believed that allowing kids to bathe in their “ephemeral” and “confined” knowledge about The Simpsons, for instance, would result in some sort of modern Tower of Babel situation in which no one could talk to anyone about anything (other than, I guess, Krusty the Klown). This is where Hirsch becomes a bit of a cultural fascist. “Although nationalism may be regrettable in some of its worldwide political effects, a mastery of national culture is essential to mastery of the standard language in every modern nation,” he explained, later adding, “Although everyone is literate in some local, regional, or ethnic culture, the connection between mainstream culture and the national written language justifies calling mainstream culture the basic culture of the nation.”
Because I am not very well-read, the first thing I thought of when I found Hirsch’s book was that scene in Peter Weir’s 1989 coming-of-age drama Dead Poets Society. You know the one I mean, where the prep school teacher played by Robin Williams instructs his class to tear the entire introduction to Understanding Poetry (by the fictional author J. Evans Pritchard) out of their textbooks. “Excrement,” he calls it. “We’re not laying pipe, we’re talking about poetry.” As an alternative, he expects this class of teenagers to think for themselves. “Medicine, law, business, engineering, these are all noble pursuits, and necessary to sustain life,” he tells them. “But poetry, beauty, romance, love, these are what we stay alive for.” Neither Pritchard nor Hirsch appears to have subscribed to this sort of sentiment. And their approach to high culture has of late seeped into low culture. What was once a privileging of certain aspects of high taste has expanded into a privileging of certain “low” taste. Pop culture, traditionally maligned, now overcompensates, essentializing certain pieces of popular art as additional indicators of the new cultural literacy.
I’m not saying there are a bunch of professors at lecterns telling us to watch Game of Thrones, but there are a bunch of networks and streaming services that are doing that, and viewers and critics following suit, constantly telling us what we “have to” watch or “must” listen to or “should” read. Some people who are more optimistic than me have framed this prescriptive approach as a last-ditch effort to preserve shared cultural experiences. “Divided by class, politics and identity, we can at least come together to watch Game of Thrones — which averaged 32.8 million legal viewers in season seven,” wrote Judy Berman in Time. “If fantasy buffs, academics, TV critics, proponents of Strong Female Characters, the Gay of Thrones crew, Black Twitter, Barack Obama, J. Lo, Tom Brady and Beyoncé are all losing their minds over the same thing at the same time, the demise of that collective obsession is worth lamenting — or so the argument goes.” That may sound a little extreme, but then presidential-hopeful Elizabeth Warren blogs about Game of Thrones and you wonder.
Essentializing any form of art limits it, setting parameters on not only what we are supposed to receive, but how. As Wesley Morris wrote of our increasingly moralistic approach to culture, this “robs us of what is messy and tense and chaotic and extrajudicial about art.” Now, instead of approaching everything with a sense of curiosity, we approach with a set of guidelines. It’s like when you walk around a gallery with one of those audio tours held up to your ear, which is supposed to make you appreciate the art more fully, but instead tends to supplant any sort of discovery with one-size-fits-all analysis. With pop culture, the goal isn’t even that lofty. You get a bunch of white guys on Reddit dismantling the structure of a Star Wars trailer, for instance, reducing the conversation around it to mere mechanics. Or you get an exhaustive number of takes on Arya Stark’s alpha female sex scene in Game of Thrones. One of the most prestige-branded shows in recent memory, the latter in particular often occupies more web space than its storytelling deserves precisely because that is what it’s designed to do. As Berman wrote, “Game of Thrones has flourished largely because it was set up to flourish — because the people who bankroll prestige television decided before the first season even went into production that this story of battles, bastards and butts was worth an episodic budget three times as large as that of the typical cable series.” In this way, HBO — and the critics and viewers who stan HBO — have turned this show into one of the essentials even if it’s not often clear why.
Creating art to dominate this discursive landscape turns that art into a chore — in other words, cultural homework. This is where people start saying things like, “Do I HAVE to watch Captain Marvel?” and “feeling a lot of pressure to read sally rooney!” and “do i have to listen to the yeehaw album?” This kind of coercion has been known to cause an extreme side effect — reactance, a psychological phenomenon in which a person who feels their freedom being constricted adopts a combative stance, turning a piece of art we might otherwise be neutral about into an object of derision. The Guardian’s Oliver Burkeman called it “cultural cantankerousness” and used another psychological concept, optimal distinctiveness theory, to further explain it. That term describes how people try to balance feeling included and feeling distinct within a social group. Burkeman, however, favored his reactance as a form of self-protective FOMO avoidance. “My irritation at the plaudits heaped on any given book, film or play is a way of reasserting control,” he wrote. “Instead of worrying about whether I should be reading Ferrante, I’m defiantly resolving that I won’t.” (This was written in 2016; if it were written now, I’m sure he would’ve used Rooney).
***
Shortly after Beyoncé dropped Homecoming, her previous album, Lemonade, became available on streaming services. That one I have heard — a year after it came out. I didn’t write about it. I barely talked about it. No one wants to read why Beyoncé doesn’t mean much to me when there are a number of better critics who are writing about what she does mean to them and so many others (the same way there are smart, interested parties analyzing Lizzo and Game of Thrones and Avengers: Endgame and Rooney). I am not telling those people not to watch or listen to or read or find meaning there; I understand people have different tastes, that certain things are popular because they speak to us in a way other things haven’t. At the same time, I expect not to be told what to watch or listen to or read, because from what I see and hear around me, from what I read and who I talk to, I can define for myself what I need. After Lemonade came out, in a post titled “Actually,” Gawker’s Rich Juzwiak wrote, “It’s easier to explicate what something means than to illustrate what it does. If you want to know what it does, watch it or listen to it. It’s at your fingertips. … Right is right and wrong is wrong, but art at its purest defies those binaries.” In the same way, there is no art you have to experience, just as there is no art you have to not experience. There is only art — increasingly ubiquitous — and there is only you, and what happens between both of you is not for me to assign.
I have one job — building the Pepper-Crusted Beef on Brioche with Celery Root Salad, an elegant little bite to be passed during cocktail hour at the Park Avenue Armory Gala, a black-tie dinner for 760 people. In theory, it’s an easy hors d’oeuvre, a thin coin of rosy beef on bread with a tuft of salad on top. It’s 4:50 now and the doors open at 6:30, so I’ve got some time to assemble this thing. The ingredients can be served at room temperature — any temperature, really — and they were prepared earlier today by a separate team of cooks at the caterer’s kitchen on the far West Side of town, then packaged on sheet pans and in plastic deli containers for a truck ride to the venue. All I have to do is locate the ingredients in the boxes and coolers, find some space to work — my “station” — and begin marshaling a small army of beef-on-toasts so I’ve got enough of a quorum, 240 pieces or so, that when serve-out begins I’ll be able to keep pace with replenishment demand through a forty-five-minute cocktail hour.
Jhovany León Salazar, the kitchen assistant leading the hors d’oeuvre (“H.D.”) kitchen, shows me the photo the executive chef supplied that reveals the precise architecture of this bite: a slice of seared beef tenderloin, rare in the center and the size of a Kennedy half-dollar, resting on a slightly larger round of toasted brioche. On top of the beef is a tangle of rich celeriac slaw — superfine threads of shredded celery root slicked with mayo, with a sprinkling of fresh chives showered over the whole. This is New York–caliber catering intelligence at work: take a throwback classic — the beef tenderloin carving station — to a higher, more knowing plane in a single bite. Here, the colors are lively, the scale is humane, the meat perfectly rosy-rare and tender, its edge seared black with ground pepper and char, the celeriac bringing novelty, though its flavor is familiar enough. It’s a pro design that satisfies the meat-’n’-potatoes crowd without talking down to the epicures.
The kitchen tonight — like every night, no matter the venue — is as makeshift as a school bake sale, a series of folding tables covered with white tablecloths and fashioned into a fort-like U. Since there are two warm hors d’oeuvres on the menu, our crew has a hotbox standing by — the tall, aluminum cabinet on wheels that both serves as transport vehicle for food and, once it’s on-site and loaded with a few flaming cans of jellied fuel (the odor-free version of Sterno is favored), becomes the oven. Imagine the most flame-averse venues — the New York Public Library, City Hall, the Metropolitan Museum of Art — even there, the ghostly blue flames in the hotbox pass muster with the fire marshal. In fact, this one fudge, this unspoken exception to the no-open-flames rule, is the secret to restaurant-quality catering in New York City.
In 2014, Rich Niemann, president and CEO of the Midwestern grocery company Niemann Foods, made the most important phone call of his career. He dialed the Los Angeles office of Shook Kelley, an architectural design firm, and admitted he saw no future in the traditional grocery business. He was ready to put aside a century of family knowledge, throw away all his assumptions, completely rethink his brand and strategy — whatever it would take to carry Niemann Foods deep into the 21st century.
“I need a last great hope strategy,” he told Kevin Kelley, the firm’s cofounder and principal. “I need a white knight.”
Part square-jawed cattle rancher, part folksy CEO, Niemann is the last person you’d expect to ask for a fresh start. He’s spent his whole life in the business, transforming the grocery chain his grandfather founded in 1917 into a regional powerhouse with more than 100 supermarkets and convenience stores across four states. In 2014, he was elected chair of the National Grocers Association. It’s probably fair to say no one alive knows how to run a grocery store better than Rich Niemann. Yet Niemann was no longer sure the future had a place for stores like his.
He was right to be worried. The traditional American supermarket is dying. It’s not just Amazon’s purchase of Whole Foods, an acquisition that trade publication Supermarket News says marked “a new era” for the grocery business — or the fact that Amazon hopes to launch a second new grocery chain in 2019, according to a recent report from The Wall Street Journal, with a potential plan to scale quickly by buying up floundering supermarkets. Even in plush times, grocery is a classic “red ocean” industry, highly undifferentiated and intensely competitive. (The term summons the image of a sea stained with the gore of countless skirmishes.) Now, the industry’s stodgy old playbook — “buy one, get one” sales, coupons in the weekly circular — is hurtling toward obsolescence. And with new ways to sell food ascendant, legacy grocers like Rich Niemann are failing to bring back the customers they once took for granted. You no longer need grocery stores to buy groceries.
Niemann hired Kelley in the context of this imminent doom. The assignment: to conceive, design, and build the grocery store of the future. Niemann was ready to entertain any idea and invest heavily. And for Kelley, a man who’s worked for decades honing his vision for what the grocery store should do and be, it was the opportunity of a lifetime — carte blanche to build the working model he’s long envisioned, one he believes can save the neighborhood supermarket from obscurity.
Kevin Kelley, illustration by Vinnie Neuberg
Rich Niemann, illustration by Vinnie Neuberg
The store that resulted is called Harvest Market, which opened in 2016. It’s south of downtown Champaign, Illinois, out by the car dealerships and strip malls; 58,000 square feet of floor space mostly housed inside a huge, high-ceilinged glass barn. Its bulk calls to mind both the arch of a hayloft and the heavenward jut of a church. But you could also say it’s shaped like an ark, because it’s meant to survive an apocalypse.
Harvest Market is the anti-Amazon. It’s designed to excel at what e-commerce can’t do: convene people over the mouth-watering appeal of prize ingredients and freshly prepared food. The proportion of groceries sold online is expected to swell over the next five or six years, but Harvest is a bet that behavioral psychology, spatial design, and narrative panache can get people excited about supermarkets again. Kelley isn’t asking grocers to be more like Jeff Bezos or Sam Walton. He’s not asking them to be ruthless, race-to-the-bottom merchants. In fact, he thinks that grocery stores can be something far greater than we ever imagined — a place where farmers and their urban customers can meet, a crucial link between the city and the country.
But first, if they’re going to survive, Kelley says, grocers need to start thinking like Alfred Hitchcock.
* * *
Kevin Kelley is an athletic-looking man in his mid-50s, with a piercing hazel gaze that radiates thoughtful intensity. In the morning, he often bikes two miles to Shook Kelley’s office in Hollywood — a rehabbed former film production studio on an unremarkable stretch of Melrose Avenue, nestled between Bogie’s Liquors and a driving school. Four nights a week, he visits a boxing gym to practice Muay Thai, a form of martial arts sometimes called “the art of eight limbs” for the way it combines fist, elbow, knee, and shin attacks. “Martial arts,” Kelley tells me, “are a framework for handling the unexpected.” That’s not so different from his main mission in life: He helps grocery stores develop frameworks for the unexpected, too.
You’ve never heard of him, but then it’s his job to be invisible. Kelley calls himself a supermarket ghostwriter: His contributions are felt more than seen, and the brands that hire him get all the credit. Countless Americans have interacted with his work in intimate ways, but will never know his name. Such is the thankless lot of the supermarket architect.
A film buff equally fascinated by advertising and the psychology of religion, Kelley has radical theories about how grocery stores should be built, theories that involve terms like “emotional opportunity,” “brain activity,” “climax,” and “mise-en-scène.” But before he can talk to grocers about those concepts, he has to convince them of something far more elemental: that their businesses face near-certain annihilation and must change fundamentally to avoid going extinct.
“It is the most daunting feeling when you go to a grocery store chain, and you meet with these starched-white-shirt executives,” Kelley tells me. “When we get a new job, we sit around this table — we do it twenty, thirty times a year. Old men, generally. Don’t love food, progressive food. Just love their old food — like Archie Bunkers, essentially. You meet these people and then you tour their stores. Then I’ve got to go convince Archie Bunker that there’s something called emotions, that there are these ideas about branding and feeling. It is a crazy assignment. I can’t get them to forget that they’re no longer in a situation where they’ve got plenty of customers. That it’s do-or-die time now.”
Forget branding. Forget sales. Kelley’s main challenge is redirecting the attention of older male executives, scared of the future and yet stuck in their ways, to the things that really matter.
“I make my living convincing male skeptics of the power of emotions,” he says.
Human beings, it turns out, aren’t very good at avoiding large-scale disaster. As you read this, the climate is changing, thanks to the destructively planet-altering activities of our species. The past four years have been the hottest on record. If the trend continues — and virtually all experts agree it will — we’re likely to experience mass disruptions on a scale never before seen in human history. Drought will be epidemic. The ocean will acidify. Islands will be swallowed by the sea. People could be displaced by the millions, creating a new generation of climate refugees. And all because we didn’t move quickly enough when we still had time.
You know this already. But I bet you’re not doing much about it — not enough, at least, to help avert catastrophe. I’ll bet your approach looks a lot like mine: worry too much, accomplish too little. The sheer size of the problem is paralyzing. Vast, systemic challenges tend to short-circuit our primate brains. So we go on, as the grim future bears down.
Grocers, in their own workaday way, fall prey to the same inertia. They got used to an environment of relative stability. They don’t know how to prepare for an uncertain future. And they can’t force themselves to behave as if the good times are really going to go away — even if, deep down, they know it’s true.
In the 1980s, you could still visit almost any community in the U.S. and find a thriving supermarket. Typically, it would be a dynasty family grocery store, one that had been in business for a few generations. Larger markets usually had two or three players, small chains that sorted themselves out along socioeconomic lines: fancy, middlebrow, thrifty. Competition was slack and demand — this is the beautiful thing about selling food — never waned. For decades, times were good in the grocery business. Roads and schools were named after local supermarket moguls, who often chaired their local chambers of commerce. “When you have that much demand, and not much competition, nothing gets tested. Kind of like a country with a military that really doesn’t know whether their bullets work,” Kelley says. “They’d never really been in a dogfight.”
It’s hard to believe now, but there was not a single Walmart on the West Coast until 1990. That decade saw the birth of the “hypermarket” and the beginning of the end for traditional grocery stores — Walmarts, Costcos, and Kmarts became the first aggressive competition supermarkets ever really faced, luring customers in with the promise of one-stop shopping on everything from Discmen to watermelon.
The other bright red flag: Americans started cooking at home less and eating out more. In 2010, Americans dined out more than in for the first time on record, the culmination of a slow shift away from home cooking that had been going on since at least the 1960s. That trend is likely to continue. According to a 2017 report from the USDA’s Economic Research Service, millennials shop at food stores less than any other age group, spend less time preparing food, and are more likely to eat carry-out, delivery, or fast food even when they do eat at home. But even within the shrinking market for groceries, competition has stiffened. Retailers not known for selling food increasingly specialize in it, a phenomenon called “channel blurring”; today, pharmacies like CVS sell pantry staples and packaged foods, while 99-cent stores like Dollar General are a primary source of groceries for a growing number of Americans. Then there’s e-commerce. Though only about 3 percent of groceries are currently bought online, that figure could rocket to 20 percent by 2025. From subscription meal-kit services like Blue Apron to online markets like FreshDirect and Amazon Fresh, shopping for food has become an increasingly digital endeavor — one that sidesteps traditional grocery stores entirely.
A cursory glance might suggest grocery stores are in no immediate danger. According to the data analytics company Inmar, traditional supermarkets still have a 44.6 percent market share among brick-and-mortar food retailers. And though a spate of bankruptcies has recently hit the news, there are actually more grocery stores today than there were in 2005. Compared to many industries — internet service, for example — the grocery industry is still a diverse, highly varied ecosystem. Forty-three percent of grocery companies have fewer than four stores, according to a recent USDA report. These independent stores sold 11 percent of the nation’s groceries in 2015, a larger collective market share than successful chains like Albertson’s (4.5 percent), Publix (2.25 percent), and Whole Foods (1.2 percent).
But looking at this snapshot without context is misleading — a little like saying that the earth can’t be warming because it’s snowing outside. Not long ago, grocery stores sold the vast majority of the food that was prepared and eaten at home — about 90 percent in 1988, according to Inmar. Today, their market share has fallen by more than half, even as groceries represent a diminished proportion of overall food sold. Their slice of the pie is steadily shrinking, as is the pie itself.
By 2025, the thinking goes, most Americans will rarely enter a grocery store. That’s according to a report called “Surviving the Brave New World of Food Retailing,” published by the Coca-Cola Retailing Research Council — a think tank sponsored by the soft drink giant to help retailers prepare for major changes. The report describes a retail marketplace in the throes of massive change, where supermarkets as we know them are functionally obsolete. Disposables and nonperishables, from paper towels to laundry detergent and peanut butter, will replenish themselves automatically, thanks to smart-home sensors that reorder when supplies are low. Online recipes from publishers like Epicurious will sync directly to digital shopping carts operated by e-retailers like Amazon. Impulse buys and last-minute errands will be fulfilled via Instacart and whisked over in self-driving Ubers. In other words, food — for the most part — will be controlled by a small handful of powerful tech companies.
The Coca-Cola report, written in consultation with a handful of influential grocery executives, including Rich Niemann, acknowledges that the challenges are dire. To remain relevant, it concludes, supermarkets will need to become more like tech platforms: develop a “robust set of e-commerce capabilities,” take “a mobile-first approach,” and leverage “enhanced digital assets.” They’ll need infrastructure for “click and collect” purchasing, allowing customers to order online and pick up in a jiffy. They’ll want to establish a social media presence, as well as a “chatbot strategy.” In short, they’ll need to become Amazon, and they’ll need to do it all while competing with Walmart — and its e-commerce platform, Jet.com — on convenience and price.
That’s why Amazon’s acquisition of Whole Foods Market was terrifying to so many grocers, sending the stocks of national chains like Kroger tumbling: It represents a future they can’t really compete in. Since August 2017, Amazon has masterfully integrated e-commerce and physical shopping, creating a muscular hybrid that represents an existential threat to traditional grocery stores. The acquisition was partially a real estate play: Whole Foods stores with Prime lockers now act as convenient pickup depots for Amazon goods. But Amazon’s also doing its best to make it too expensive and inconvenient for its Prime members, who pay $129 a year for free two-day shipping and a host of other perks, to shop anywhere else. Prime members receive additional 10 percent discounts on select goods at Whole Foods, and Amazon is rolling out home grocery delivery in select areas. With the Whole Foods acquisition, then, Amazon cornered two markets: the thrift-driven world of e-commerce and the pleasure-seeking universe of high-end grocery. Order dish soap and paper towels in bulk on Amazon, and pick them up at Whole Foods with your grass-fed steak.
An Amazon worker wheels back a cart after loading a bag of groceries into a customer’s car at an AmazonFresh Pickup location in Seattle. (AP Photo/Elaine Thompson, File)
Ingredients from a three-meal Blue Apron box. (AP Photo/Bree Fowler)
An employee of grocery delivery service Amazon Fresh scans ordered products before putting them into a transport bag. (Monika Skolimowska/picture-alliance/dpa/AP Images)
Traditional grocers are now expected to offer the same combination of convenience, flexibility, selection, and value. They’re understandably terrified by this scenario, which would require fundamental, complex, and very expensive changes. And Kelley is terrified of it, too, though for a different reason: He simply thinks it won’t work. In his view, supermarkets will never beat Walmart and Amazon at what they do best. If they try to succeed by that strategy alone, they’ll fail. That prospect keeps Kelley up at night — because it could mean a highly consolidated marketplace overseen by just a handful of players, one in stark contrast to the regional, highly varied food retail landscape America enjoyed throughout the 20th century.
“I’m afraid of what could happen if Walmart and Amazon and Lidl are running our food system, the players trying to get everything down to the lowest price possible,” he tells me. “What gives me hope is the upstarts who will do the opposite. Who aren’t going to sell convenience or efficiency, but fidelity.”
The approach Kelley’s suggesting still means completely overhauling everything, with no guarantee of success. It’s a strategy that’s decidedly low-tech, though it’s no less radical. It’s more about people than new platforms. It means making grocery shopping more like going to the movies.
* * *
Nobody grows up daydreaming about designing grocery stores, including Kelley. As a student at the University of North Carolina at Charlotte, he was just like every other architect-in-training: He wanted to be a figure like Frank Gehry, building celebrated skyscrapers and cultural centers. But he came to feel dissatisfied with the culture of his profession. In his view, architects coldly fixate on the aesthetics of buildings and aren’t concerned enough with the people inside.
“Architecture worships objects, and Capital-A architects are object makers,” Kelley tells me. “They aren’t trying to fix social issues. People and their experience and their perceptions and behaviors don’t matter to them. They don’t even really want people in their photographs — or if they have to, they’ll blur them out.” What interested Kelley most was how people would use his buildings, not how the structures would fit into the skyline. He wanted to shape spaces in ways that could actually affect our emotions and personalities, bringing out the better angels of our nature. To his surprise, no one had really quantified a set of rules for how environment could influence behavior. Wasn’t it strange that advertising agencies spent so much time thinking about the links between storytelling, emotions, and decision-making — while commercial spaces, the places where we actually go to buy, often had no design principle beyond brute utility?
“My ultimate goal was to create a truly multidisciplinary firm that was comprised of designers, social scientists and marketing types,” he says. “It was so unorthodox and so bizarrely new in terms of approach that everyone thought I was crazy.”
In 1992, when he was 28, Kelley cofounded Shook Kelley with the Charlotte, North Carolina–based architect and urban planner Terry Shook. Their idea was to offer a suite of services that bridged social science, branding, and design, a new field they called “perception management.” They were convinced space could be used to manage emotion, just the way cinema leads us through a guided sequence of feelings, and wanted to turn that abstract idea into actionable principles. While Shook focused on bigger, community-oriented spaces like downtown centers and malls, Kelley focused on the smaller, everyday commercial spaces overlooked by fancy architecture firms: dry cleaners, convenience stores, eateries, bars. One avant-garde restaurant Kelley designed in Charlotte, called Props, was an homage to the sitcom craze of the 1990s. It was built to look like a series of living rooms, based on the apartment scenes in shows like Seinfeld and Friends, and featured couches and easy chairs instead of dining tables to encourage guests to mingle during dinner.
The shift to grocery stores didn’t happen until a few years later, almost by accident. In the mid-’90s, Americans still spent about 55 percent of their food dollars on meals eaten at home — but that share was declining quickly enough to concern top corporate brass at Harris Teeter, a Charlotte-area grocery chain with stores throughout the Southeastern United States. (Today, Harris Teeter is owned by Kroger, the country’s second-largest seller of groceries behind Walmart.) Harris Teeter execs reached out to Shook Kelley. “We hear you’re good with design, and you’re good with food,” Kelley remembers Harris Teeter reps saying. “Maybe you could help us.”
At first, it was Terry Shook’s account. He rebuilt each section of the store into a distinct “scene” that reinforced the themes and aesthetics of the type of food it sold. The deli counter became a mocked-up urban delicatessen, complete with awning and neon sign. The produce section resembled a roadside farmstand. The dairy cases were corrugated steel silos, emblazoned with the logo of a local milk supplier. And he introduced full-service cafés, a novelty for grocery stores at the time, with chrome siding like a vintage diner. It was pioneering work, winning that year’s Outstanding Achievement Award from the International Interior Design Association — according to Kelley, it was the first time the prestigious award had ever been given to a grocery store.
Shook backed off of grocery stores after launching the new Harris Teeter, but the experience sparked Kelley’s lifelong fascination with grocery stores, which he realized were ideal proving grounds for his ideas about design and behavior. Supermarkets contain thousands of products, and consumers make dozens of decisions inside them — decisions about health, safety, family, and tradition that get to the core of who they are. He largely took over the Harris Teeter account and redesigned nearly 100 of the chain’s stores, work that would go on to influence the way the industry saw itself and ultimately change the way stores are built and navigated.
Since then, Kelley has worked to show grocery stores that they don’t have to worship at the altar of supply-side economics. He urges grocers to appeal instead to our humanity. Kelley asks them to think more imaginatively about their stores, using physical space to evoke nostalgia, delight our senses, and appeal to the parts of us motivated by something bigger and more generous than plain old thrift. Shopping, for him, is all about navigating our personal hopes and fears, and grocery stores will only succeed when they play to those emotions.
When it works, the results are dramatic. Between 2003 and 2007, Whole Foods hired Shook Kelley for brand strategy and store design, working with the firm throughout a crucial period of the chain’s development. The fear was that as Whole Foods grew, its image would become too diffuse, harder to differentiate from other health food stores; at the same time, the company wanted to attract more mainstream shoppers. Kelley’s team was tasked with finding new ways to telegraph the brand’s singular value. Their solution was a hierarchical system of signage that would streamline the store’s crowded field of competing health and wellness claims.
Kelley’s view is that most grocery stores are “addicted” to signage, cramming their spaces with so many pricing details, promotions, navigational signs, ads, and brand assets that it “functionally shuts down [the customer’s] ability to digest the information in front of them.”
Kelley’s team stipulated that Whole Foods could only have seven layers of information, which ranged from evocative signage 60 feet away to descriptive displays six feet from customers to promotional info just six inches from their hands. Everything else was “noise,” and jettisoned from the stores entirely. If you’ve ever shopped at Whole Foods, you probably recognize the way that the store’s particular brand of feel-good, hippie sanctimony seems to permeate your consciousness at every turn. Kelley helped invent that. The system he created for pilot stores in Princeton, New Jersey, and Louisville, Kentucky, was scaled throughout the chain and is still in use today, he says. (Whole Foods did not respond to requests for comment for this story.)
With a carefully delineated set of core values guiding its purchasing and brand, Whole Foods was ripe for the kind of visual overhaul Kelley specializes in. But most regional grocery chains have a different set of problems: They don’t really have values to telegraph in the first place. Shook Kelley’s approach is about getting buttoned-down grocers to reflect on their beliefs, tapping into deeper, more primal reasons for wanting to sell food.
* * *
Today, Kelley and his team have developed a playbook for clients, a finely tuned process to get shoppers to think in terms that go beyond bargain-hunting. It embraces what he calls “the theater of retail” and draws inspiration from an unlikely place: the emotionally laden visual language of cinema. His goal is to convince grocers to stop thinking like Willy Loman — like depressed, dejected salesmen forever peddling broken-down goods, fixated on the past and losing touch with the present. In order to survive, Kelley says, grocers can’t be satisfied with providing a place to complete a chore. They’ll need to direct an experience.
Today’s successful retail brands establish what Kelley calls a “brand realm,” or what screenwriters would call a story’s “setting.” We don’t usually think consciously about them, but realms subtly shape our attitude toward shopping the same way the foggy, noirishly lit streets in a Batman movie tell us something about Gotham City. Cracker Barrel is set in a nostalgic rural house. Urban Outfitters is set on a graffitied urban street. Tommy Bahama takes place on a resort island. It’s a well-known industry secret that Costco stores are hugely expensive to construct — they’re designed to resemble fantasy versions of real-life warehouses, and the appearance of thrift doesn’t come cheap. Some realms are even more specific and fanciful: Anthropologie is an enchanted attic, complete with enticing cupboards and drawers. Trader Joe’s is a crew of carefree, hippie traders shipping bulk goods across the sea. A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.
Kelley takes this a few steps further. The Shook Kelley team, which includes a cultural anthropologist with a Ph.D., begins by conducting interviews with executives, staff, and locals, looking for the storytelling hooks they call “emotional opportunities.” These can stem from core brand values, but often revolve around the most intense, place-specific feelings locals have about food. Then Kelley finds ways to place emotional opportunities inside a larger realm with an overarching narrative, helping retailers tell those stories — not with shelves of product, but through a series of affecting “scenes.”
In Alberta, Canada, Shook Kelley redesigned a small, regional grocery chain now called Freson Bros. Fresh Market. In interviews, the team discovered that meat-smoking is a beloved pastime there, so Shook Kelley built huge, in-store smokers at each new location — a scene called “Banj’s Smokehouse” — that crank out pound after pound of the province’s signature beef, as well as elk, deer, and other kinds of meat (customers can even BYO meat to be smoked in-house). Kelley also designed stylized root cellars in each produce section, a cooler, darker corner of each store that nods to the technique Albertans use to keep vegetables fresh. These elements aren’t just novel ways to taste, touch, and buy. They reference cultural set points, triggering memories and personal associations. Kelley uses these open, aisle-less spaces, which he calls “perceptual rooms,” to draw customers through an implied sequence of actions, tempting them towards a specific purchase.
Something magical happens when you engage customers this way. Behavior changes in visible, quantifiable ways. People move differently. They browse differently. And they buy differently. Rather than progressing in a linear fashion, the way a harried customer might shoot down an aisle — Kelley hates aisles, which he says encourage rushed, menial shopping — customers zig-zag, meander, revisit. These behaviors are a sign a customer is “experimenting,” engaging with curiosity and pleasure rather than just trying to complete a task. “If I was doing a case study presentation to you, I would show you exact conditions where we don’t change the product, the price, the service. We just change the environment and we’ll change the behavior,” Kelley tells me. “That always shocks retailers. They’re like ‘Holy cow.’ They don’t realize how much environment really affects behavior.”
In the mid-2000s, Nabisco approached Kelley’s firm, complaining that sales were down 16 percent in the cookie-and-cracker aisle. In response, Shook Kelley designed “Mom’s Kitchen,” which was piloted at Buehler’s, a 15-store chain in northern Ohio. Kelley took Nabisco’s products out of the center aisles entirely and installed them in a self-contained zone: a perceptual room built out to look like a nostalgic vision of suburban childhood, all wooden countertops, tile, and hanging copper pans. Shelves of Nabisco products from Ritz Crackers to Oreos lined the walls. Miniature packs of Animal Crackers sat out in a large bowl; drawers opened to reveal boxes of Saltines. The finishing touch had nothing to do with Nabisco and everything to do with childhood associations: Kelley had the retailers install fridge cases filled with milk, backlit and glowing. Who wants to eat Oreos without a refreshing glass of milk to wash them down?
The store operators weren’t sold. They found it confusing and inconvenient to stock milk in two places at once. But from a sales perspective, the experiment was a smash. Sales of Nabisco products increased by as much as 32 percent, and the entire cookie-and-cracker segment experienced a halo effect, seeing double-digit jumps. Then, the unthinkable: The stores started selling out of milk. They simply couldn’t keep it on the shelves.
You’d think that the grocery stores would be thrilled, that it would have them scrambling to knock over their aisles of goods, building suites of perceptual rooms. Instead, they retreated. Nabisco’s parent company at the time, Kraft, was excited by the results and kicked the idea over to a higher-up corporate division, where it stalled. And Buehler’s, for its part, never did anything to capitalize on its success. When Nabisco took the “Mom’s Kitchen” displays down, Kelley says, the stores didn’t replace them.
Mom’s Kitchen, fully stocked. (Photo by Tim Buchman)
“We were always asking a different question: What is the problem you’re trying to solve through food?” Kelley says. “It’s not just a refueling exercise — instead, what is the social, emotional issue that food is solving for us? We started trying to work that into grocery. But we probably did it a little too early, because they weren’t afraid enough.”
Since then, Kelley has continued to build his case to unreceptive audiences of male executives, with mixed success. He tells them that when customers experiment — when the process of sampling, engaging, interacting, and evaluating an array of options becomes a source of pleasure — they tend to take more time shopping. And that the more time customers spend in-store, the more they buy. In the industry, this all-important metric is called “dwell time.” Most retail experts agree that increasing dwell without increasing frustration (say, with long checkout times) will be key to the survival of brick-and-mortar retail. Estimates vary on how much dwell time increases sales; according to Davinder Jheeta, creative brand director of the British supermarket Simply Fresh, customers spent 1.3 percent more for every 1 percent increase in dwell time in 2015.
Another way to increase dwell time? Offer prepared foods. Delis, cafes, and in-store restaurants increase dwell time and facilitate pleasure while operating with much higher profit margins and recapturing some of the dining-out dollar that grocers are now losing. “I tell my clients, ‘In five years, you’re going to be in the restaurant business,’” Kelley says, “‘or you’re going to be out of business.’”
Kelley’s job, then, is to use design in ways that get customers to linger, touch, taste, scrutinize, explore. The stakes are high, but the ambitions are startlingly low. Kelley often asks clients what he calls a provocative question: Rather than trying to bring in new customers, would it solve their problems if 20 percent of customers increased their basket size by just two dollars? The answer, he says, is typically an enthusiastic yes.
Just two more dollars per trip for every fifth customer — that’s what victory looks like. And failure? That looks like a food marketplace dominated by Walmart and Amazon, a world where the neighborhood supermarket is a thing of the past.
* * *
When Shook Kelley started working on Niemann’s account, things began the way they always did: looking for emotional opportunities. But the team was stumped. Niemann’s stores were clean and expertly run. There was nothing wrong with them. Niemann’s problem was that he had no obvious problem. There was no there there.
Many of the regionals Kelley works with have no obvious emotional hook; all they know is that they’ve sold groceries for a long time and would like to keep on selling them. When he asks clients what they believe in, they show him grainy black-and-white photos of the stores their parents and grandparents ran, but they can articulate little beyond the universal goal of self-perpetuation. So part of Shook Kelley’s specialty is locating the distinguishing spark in brands that do nothing especially well, which isn’t always easy. At Buehler’s Fresh Foods, the chain where “Mom’s Kitchen” was piloted, the store’s Shook Kelley–supplied emotional theme is “Harnessing the Power of Nice.”
Still, Niemann Foods was an especially challenging case. “We were like, ‘Is there any core asset here?’” Kelley told me. “And we were like, ‘No. You really don’t have anything.’”
What Kelley noticed most was how depressed Niemann seemed, how gloomy about the fate of grocery stores in general. Nothing excited him — with one exception. Niemann runs a cattle ranch, a family operation in northeast Missouri. “Whenever he talked about cattle and feed and antibiotics and meat qualities, his physical body would change. We’re like, ‘My god. This guy loves ranching.’ He only had three hundred cattle or something, but he had a thousand pounds of interest in it.”
Niemann’s farm now has about 600 cattle, though it’s still more hobby farm than full-time gig — but it ended up being a revelation. During an early phase of the process, someone brought up “So God Made a Farmer” — a speech radio host Paul Harvey gave at the 1978 Future Farmers of America Convention that had been used in a Ram trucks ad during the 2013 Super Bowl. It’s a short poem that imagines the eighth day of the biblical creation, where God looks down from paradise and realizes his new world needs a caretaker. What kind of credentials is God looking for? Someone “willing to get up before dawn, milk cows, work all day in the fields, milk cows again, eat supper and then go to town and stay past midnight at a meeting of the school board.” God needs “somebody willing to sit up all night with a newborn colt. And watch it die. Then dry his eyes and say, ‘Maybe next year.’” God needs “somebody strong enough to clear trees and heave bales, yet gentle enough to yean lambs and wean pigs and tend the pink-combed pullets, who will stop his mower for an hour to splint the broken leg of a meadow lark.” In other words, God needs a farmer.
Part denim psalm, part Whitmanesque catalogue, it’s a quintessential piece of Americana — hokey and humbling like a Norman Rockwell painting, and a bit behind the times (of course, the archetypal farmer is male). And when Kelley’s team played the crackling audio over the speakers in a conference room in Quincy, Illinois, something completely unexpected happened. Something that convinced Kelley that his client’s stores had an emotional core after all, one strong enough to provide the thematic backbone for a new approach to the grocery store.
Rich Niemann, the jaded supermarket elder statesman, broke down and wept.
* * *
I have never been a fan of shopping. Spending money stresses me out. I worry too much to enjoy it. So I wanted to see if a Kelley store could really be what he said it was, a meaningful experience, or if it would just feel fake and hokey. You know, like the movies. When I asked if there was one store I could visit to see his full design principles in action, he told me to go to Harvest, “the most interesting store in America.”
Champaign is two hours south of O’Hare by car. Crossing Illinois’s vast landscape of unrelenting farmland, you appreciate the sheer scale of the state, how far its lower half is from Chicago. Champaign is a college town, which comes with the usual trappings — progressive politics, cafes and bars, young people lugging backpacks with their earbuds in — but you forget that fast outside the city limits. In 2016, some townships in Champaign County voted for Donald Trump over Hillary Clinton by 50 points.
I was greeted in the parking lot by Gerry Kettler, Niemann Foods’ director of consumer affairs. Vintage John Deere tractors formed a caravan outside the store. The shopping cart vestibules were adorned with images of huge combines roving across fields of commodity crops. Outside the wide-mouthed entryway, local produce waited in picket-fence crates — in-season tomatoes from Johnstonville, sweet onions from Warrensburg.
And then we stepped inside.
Everywhere, sunlight poured in through the tall, glass facade, illuminating a sequence of discrete, airy, and largely aisle-less zones. Kettler bounded around the store, pointing out displays with surprised joy on his face, as if he couldn’t believe his luck. The flowers by the door came from local growers like Delight Flower Farm and Illinois Willows. “Can’t keep this shit in stock,” he said. He made me hold an enormous jackfruit to admire its heft. The produce was beautiful — he was right — with more local options than I’ve ever seen in a grocery store. The Warrensburg sweet corn was eye-poppingly cheap: two bucks a dozen. There were purple broccolini and clamshells filled with squash blossoms, a delicacy so temperamental that they’re rarely sold outside of farmers’ markets. Early on, staff had to explain to some teenage cashiers what they were — the cashiers had never seen squash blossoms before.
I started to sense the “realm” Harvest inhabits: a distinctly red-state brand of America, local food for fans of faith and the free market. It’s hunting gear. It’s Chevys. It’s people for whom commercial-scale pig barns bring back memories of home. Everywhere, Shook Kelley signage — a hierarchy of cues like what Kelley dreamed up for Whole Foods — drives the message home. A large, evocative sign on the far wall reads Pure Farm Flavor, buttressed by the silhouettes of livestock, so large it almost feels subliminal. Folksy slogans hang on the walls, sayings like FULL OF THE MILK OF HUMAN KINDNESS and THE CREAM ALWAYS RISES TO THE TOP.
Then there are the informational placards that point out suppliers and methods.
There are at least a half dozen varieties of small-batch honey; you can find pastured eggs for $3.69. The liquor section includes local selections, like whiskey distilled in DeKalb, along with a display of cutting boards made from local wood by Niemann Foods’ HR manager. “Turns out we had some talent in our backyard,” Kettler said. Niemann’s willingness to look right under his nose, sidestepping middlemen distributors to offer reasonably priced, local goods, is a hallmark of Harvest Market.
Champaign, IL’s Harvest Market is styled like Whole Foods for the Heartland—complete with a John Deere tractor stationed outside. (Photo courtesy of the author.)
Unlike most large-format grocery stores, Harvest Market buys some produce directly from farmers (like these sweet candy onions from Warrensburg, IL, about 50 miles away). (Photo courtesy of the author.)
Interior of Harvest Market from the upper mezzanine, where shoppers gather for lunch and board games during the day and glasses of wine at night. (Photo courtesy of the author.)
By the cheese section, a glassed-in contraption works night and day: a butter churner, which transforms local sweet cream into yellow, briskly selling bricks of fat. (Photo courtesy of the author.)
Harvest Market executives Gerry Kettler, left, and Rich Niemann chat with a salsa vendor visiting to do demos. (Photo courtesy of the author.)
That shortened chain of custody is only possible because of Niemann and the lifetime of supply-side know-how he brings to the table. But finding ways to offer better, more affordable food has been a long-term goal of Kelley’s — one that strained his relationship with Whole Foods CEO John Mackey over the issue. As obsessed as Kelley is with appearances, he insists to me that his work must be grounded in something “real”: that grocery stores only succeed when they really try to make the world a better place through food. In his view, Whole Foods wasn’t doing enough to address its notoriously high prices — opening itself up to be undercut by cheaper competition, and missing a kind of ethical opportunity to make better food available to more people.
“When,” Kelley remembers asking, “did you start to mistake opulence for success?”
In Kelley’s telling, demand slackened so much during the Great Recession that it nearly led to Whole Foods’ downfall, a financial setback that the company never fully recovered from — and, one could argue, ultimately led to its acquisition. Harvest Market, for its part, has none of Whole Foods’ clean-label sanctimony. It takes an “all-of-the-above” approach: There’s local produce, but there are also Oreos and Doritos and Coca-Cola; at Thanksgiving, you can buy a pastured turkey from Triple S Farms or a 20-pound Butterball. But that strong emphasis on making local food more accessible and affordable makes it an interesting counterpart to Kelley’s former client.
The most Willy Wonka–esque touch is the hulking piece of dairy processing equipment in a glass room by the cheese case. It’s a commercial-scale butter churner — the first one ever, Kettler told me, to grace the inside of a grocery store.
“So this was a Shook Kelley idea,” he said. “We said yes, without knowing how much it would cost. And the costs just kept accelerating. But we’re thrilled. People love it.”
Harvest Market isn’t just a grocery store — it’s also a federally inspected dairy plant. The store buys sweet cream from a local dairy, which it churns into house-made butter, available for purchase by the brick and used throughout Harvest’s bakery and restaurant. The butter sells out as fast as they can make it. Unlike the grocers who objected to “Mom’s Kitchen,” the staff don’t seem to mind.
As I walked through the store, I couldn’t help wondering how impressed I really was. I found Harvest to be a beautiful example of a grocery store, no doubt, and a very unusual one. What was it that made me want to encounter something more outrageous, more radical, more theatrical and bizarre? I wanted animatronic puppets. I wanted fog machines.
I should have known better — Kelley had warned me that you can’t take the theater of retail too far without breaking the dream. He’d told me that he admires stores where “you’re just not even aware of the wonder of the scene, you’re just totally engrossed in it” — stores a universe away from the overwrought, hokey feel of Disneyland. But I had Amazon’s new stores in the back of my mind as a counterpoint, with all their cashierless bells and whistles, their ability to click and collect, to test-drive Alexa and play a song or switch on a fan. I guess, deep down, I was wondering if something this subtle really could work.
“Here, this is Rich Niemann,” Kettler said, and I found myself face-to-face with Niemann himself. We shook hands and he asked if I’d ever been to Illinois before. Many times, I told him. My wife is from Chicago, so we’ve visited the city often.
He grinned at me.
“That’s not Illinois,” he said.
We walked to Harvest’s restaurant, a 40-person seating area plus an adjacent bar with a row of stools, which offers standards like burgers, salads, and flatbreads. There’s an additional 80-person seating area on the second-floor mezzanine, a simulated living room complete with couches and board games. Beyond that, they pointed out the brand-new wine bar — open, like the rest of the space, until midnight. There’s a cooking classroom by the corporate offices; through the window, I saw a classroom full of children doing something to vegetables. Adult cooking classes run two or three nights every week, plus special events for schools and other groups.
For a summer weekday at noon in a grocery store, I’m amazed at how many people are eating and working on laptops. One guy has his machine hooked up to a full-sized monitor he lugged up the stairs — he’s made a customized wooden piece that hooks into Harvest’s wrought-iron support beams to create a platform for his plus-size screen. He comes every day, like it’s his office. He’s a dwell-time dream.
We sit down, and Kettler insists I eat the corn first, slathering it with the house-made butter and eating it while it’s hot. He reminds me that it’s grown by the Maddoxes, a family in Warrensburg, about 50 miles west of Champaign.
The corn is good, but I want to ask Niemann whether the grocery industry is really that bad, and he tells me it is. I assume he’ll want to talk about Amazon and its acquisition of Whole Foods and the way e-commerce has changed the game. He acknowledges all that, but to my surprise he says the biggest factor is something else entirely — a massive shift happening in the world of consumer packaged goods, or CPGs.
For years, grocery stores never had to advertise, because the largest companies in the world — Procter & Gamble, Coca-Cola, Nestlé — did their advertising for them, just the way Nabisco helped finance “Mom’s Kitchen” to benefit the stores. People came to supermarkets to buy the foods they saw on TV. But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.
When their sales flag, grocery sales do too — and the once-bulletproof alliance between food brands and supermarkets is splitting. Over the past two years, the Grocery Manufacturers Association, an influential trade group representing the biggest food companies in the world, has been losing members. It began with Campbell Soup. Dean Foods, Mars, Tyson Foods, Unilever, the Hershey Company, the Kraft Heinz Company, and others followed. That profound betrayal was a rude awakening: CPG companies don’t need grocery stores. They have Amazon. They can sell directly through their websites. They can launch their own pop-ups.
It’s only then that I realize how dire the predicament of grocery stores really is, and why Niemann was so frustrated when he first called Kevin Kelley. It’s one thing when you can’t sell as cheaply and conveniently as your competitors. But it’s another thing when no one wants what you’re selling.
Harvest doesn’t feel obviously futuristic in the way an Amazon store might. If I went there as a regular shopper and not as a journalist sniffing around for a story, I’m sure I’d find it to be a lovely and transporting way to buy food. But what’s going on behind the scenes is, frankly, unheard of.
Grocery stores have two ironclad rules. First, grocers set the prices, and farmers do what they can within those mandates. Second, everyone works with distributors, who oversee the aggregation and transport of all goods. Harvest has traditional relationships with companies like Coca-Cola, but it breaks those rules with local farmers and foodmakers. The suppliers of everything from the locally milled wheat to the produce to the Kilgus Farms sweet cream that goes into the churner truck their products right to the back. At the same time, those suppliers get to set their own prices: they tell Niemann what they need to charge; Niemann adds a standard margin and lets customers decide if they’re willing to pay. That’s a massive change, virtually unheard of in supermarkets. And by avoiding middlemen and their surcharges, Harvest is able to pay suppliers more while charging customers less. You can still find $4.29 pints of Halo Top ice cream in the freezer, but the produce section features stunning bargains. When the Maddox family pulls up with its latest shipment of corn, people sometimes start buying it off the back of the truck in the parking lot.
If there’s a reason Harvest matters, it’s only partly because of the aesthetics. It’s mainly because the model of what a grocery store is has been tossed out and rebuilt. And why not? The world as Rich Niemann knows it is ending.
* * *
In 2017, just months after Harvest Market’s opening, Niemann won the Thomas K. Zaucha Entrepreneurial Excellence Award — the National Grocers Association’s top honor, given for “persistence, vision, and creative entrepreneurship.” That spring, Harvest was spotlighted in a “Store of the Month” cover feature in the influential trade magazine Progressive Grocer. Characteristically, the contributions of Kelley and his firm were not mentioned in the piece.
Niemann tells me his company is currently planning to open a second Harvest Market in Springfield, Illinois, about 90 minutes west of Champaign, in 2020. Without sharing specifics about profitability or sales numbers, he says the store has been everything he hoped it would be on the metrics that matter most — year-over-year sales growth and customer engagement. His only complaint about the store has to do with parking. For years, Niemann has relied on the same golden ratio to determine the size of parking lot his stores need — a certain number of spots for every thousand dollars of expected sales. Harvest’s lot uses the same logic, and it’s nowhere near enough space.
“In any grocery store, the customer’s first objective is pantry fill — to take care of my needs as best I can on my budget,” Niemann says. “But we created a different atmosphere. These customers want to talk. They want to know. They want to experience. They want to taste. They’re there because it’s an adventure.”
They stay so much longer than expected that the parking lot sometimes struggles to fit all their cars at once. Unlike the Amazon stores that may soon be cropping up in a neighborhood near you — reportedly, the company is considering plans to open 3,000 of them by 2021 — Harvest isn’t about getting in and out quickly without interacting with another human being. You stay awhile. And that’s the point.
So far, Harvest’s success hasn’t made it any easier for Kelley, who still struggles to persuade clients to make fundamental changes. They’re still as scared as they’ve always been, clinging to the same old ideas. He tells them that, above all else, they need to develop a food philosophy — a reason why they do this in the first place, something that goes beyond mere nostalgia or the need to make money. They need to build something that means something, a store people return to not just to complete a task but because it somehow sustains them. For some, that’s too tall an order. “They go, ‘I’m not going to do that.’ I’m like, ‘Then what are you going to do?’ And they literally tell me: ‘I’m going to retire.’” It’s easier to cash out. Pass the buck, and consign the fate of the world to younger people with bolder dreams.
Does it even matter? The world existed before supermarkets, and it won’t end if they vanish. And in the ongoing story of American food, the 20th-century grocery store is no great hero. A&P — the once titanic chain, now itself defunct — was a great mechanizer, undercutting the countless smaller, local businesses that used to populate the landscape. More generally, the supermarket made it easier for Americans to distance ourselves from what we eat, shrouding food production behind a veil and letting us convince ourselves that price and convenience matter above all else. We let ourselves be satisfied with the appearance of abundance — even if great stacks of unblemished fruit contribute to waste and spoilage, even if the brightly colored packages on the shelves are all owned by the same handful of multinational corporations.
But whatever springs up to replace grocery stores will have consequences, too, and the truth is that brick-and-mortar is not going away any time soon — far from it. Instead, the most powerful retailers in the world have realized that physical spaces have advantages they want to capitalize on. It’s not just that stores in residential neighborhoods work well as distribution depots, ones that help facilitate the home delivery of packages. And it’s not just that we can’t always be home to pick up the shipments we ordered when they arrive, so stores remain useful. The world’s biggest brands are now beginning to realize what Kelley has long argued: Physical stores are a way to capture attention, to subject customers to an experience, to influence the way they feel and think. What could be more useful? And what are Amazon’s proposed cashierless stores, but an illustration of Kelley’s argument? They take a brand thesis, a set of core values — that shopping should be quick and easy and highly mechanized — and seduce us with it, letting us feel the sweep and power of that vision as we pass with our goods through the doors without paying, flushed with the thrill a thief feels.
This is where new troubles start. Only a few companies in the world will be able to compete at Amazon’s scale — the scale at which building 3,000 futuristic convenience stores in three years may be a realistic proposition. Unlike in the golden age of grocery, when different family-owned chains catered to different demographics, we’ll have only a handful of players. We’ll have companies that own the whole value chain, low to high. Amazon owns the e-commerce site where you can find almost anything in the world for the cheapest price. And for when you want to feel the heft of an heirloom tomato in your hand or sample some manchego before buying, there is Whole Foods. Online retail for thrift, in-person shopping for pleasure. Except one massive company now owns them both.
If this new landscape comes to dominate, we may find there are things we miss about the past. For all its problems, the grocery industry is at least decentralized, owned by no one dominant company and carved up into more players than you could ever count. It’s run by people who often live alongside the communities they serve and share their concerns. We might miss that competition, that community. They are small. They are nimble. They are independently, sometimes even cooperatively, owned. They employ people. And if they are scrappy, and ingenious, and willing to change, there’s no telling what they might do. It is not impossible that they could use their assets — financial resources, industry connections, prime real estate — to find new ways to supply what we all want most: to be happier, to be healthier, to feel more connected. To be better people. To do the right thing.
I want to believe that, anyway. That stores — at least in theory — could be about something bigger, and better than mere commerce. The way Harvest seems to want to be, with some success. But I wonder if that’s just a fantasy, too: the dream that we can buy and sell our way to a better world, that it will take no more than that.
Which one is right?
I guess it depends on how you feel about the movies.
Maybe a film is just a diversion, a way to feel briefly better about our lives, the limitations and disappointments that define us, the things we cannot change. Most of us leave the theater, after all, and just go on being ourselves.
Still, maybe something else is possible. Maybe in the moment when the music swells, and our hearts beat faster, and we feel overcome by the beauty of an image — in the instant that we feel newly brave and noble, and ready to be different, braver versions of ourselves — that we are who we really are.
* * *
Joe Fassler, The Counter’s deputy editor, has covered the intersection of food, policy, technology, and culture for the magazine since 2015. His food reporting has twice been a finalist for the James Beard Foundation Award in Journalism. He’s also editor of Light the Dark: Writers on Creativity, Inspiration, and the Creative Process (Penguin, 2017), a book based on “By Heart,” his ongoing series of literary conversations for The Atlantic.
Editor: Michelle Weber
Fact checker: Matt Giles
Copy editor: Jacob Z. Gross
The American Midwest is hard to define. Even which states can be considered “Midwestern” depends on who you ask; is it what lies between Ohio and Iowa? Or does the Midwest stretch further west across the Great Plains; north into Wisconsin, Minnesota, and the Dakotas; or east into parts of Pennsylvania and New York state? Perhaps part of the confusion over the term is rooted in the idea that the Midwest represents far more than a geographic space — it represents a vision of the country as a whole, and is a stand-in for nostalgia, despite the fact that the reality of the nation, and the Midwest along with it, has always been far messier than any myth.
In her new book, The Heartland: An American History, University of Illinois professor Kristin L. Hoganson tells the story of the region through its links to the rest of the world. Arguing that the Midwest, centered here on Illinois, has long been misunderstood as far more provincial and isolated than it actually is, Hoganson lays out the ways in which international relationships have shaped the economy and identity of the region. She also examines part of the region’s complicated history with race, and the way some stories have been obscured, giving everyone — outsiders and locals alike — a warped idea of who has a claim to the most all-American of places.
There was the decision in ’46 by the Brevard Mosquito Control District to slather the Merritt Island salt marshes in DDT dropped aerially from a No. 2 diesel–fuel carrier.
Then, because the mosquitoes grew resistant to DDT, there was the application of BHC and Dieldrin and Malathion.
Soraya Roberts | Longreads | April 2019 | 10 minutes (2,422 words)
INT. COFFEE SHOP – DAY
SORAYA sits down at her laptop with a cookie or some cake or that weirdly oversize banana bread. As she starts working on a column like this one, the woman next to her, working on a spreadsheet, glances at Soraya’s desktop and turns to her.
WOMAN: What do you do?
SORAYA: I’m a columnist.
WOMAN: Holy shit, that’s cool.
I starred in this scene two weeks ago, and again just this past week at a party. The women don’t have to tell me why they think it’s cool, I know why: Carrie Bradshaw. An apartment in New York, a photo on the side of a bus, Louboutins, tutus, and a column at the top of each week. Which is why I qualify it every time: “I don’t make as much as Carrie Bradshaw.” Yes, the job is cool, and it is holy-shit-worthy because so few journalists are able to actually work as journalists. But I’m freelance: I can cover my rent but can’t buy a house, I don’t get benefits, and I might be out of a job next week. Not to mention that I might not be so lucky next time. The women usually turn back to their admin after that — admin looks a lot cooler than journalism these days. But only if you’re not going by Sex and the City or basically every other journalism movie or series that has come after, all of which romanticize an industry which has a knack for playing into that.
“This is the end of an era, everything’s changing,” Gina Rodriguez tells her friends in the trailer for Someone Great, a new Netflix rom-com in which she, a music journalist, gets a job. At a magazine. In San Francisco. This is not a sci-fi movie in which the character has time traveled back to, I don’t know, 1975. It is only one recent example of the obfuscation of what journalism actually means now. There’s also the Hulu series Shrill, which presents itself as if it were current-day but is based on the life of Lindy West, who had a staff job at the Seattle alt-weekly The Stranger when you could still have a staff job and make a name for yourself with first-person essays, i.e., 2009. Special (another Netflix show) also harkens back to that time, and though it’s more overt about how exploitative online media can be — the hero is an intern with cerebral palsy who writes about his disability (which he claims is from a car accident) for clicks — the star is still hired straight out of an internship. (What’s an internship?)
Hollywood romanticizes everything, you say? Perhaps, but this is a case where the media itself seems to be actively engaging in a certain kind of deception about how bad its own situation actually is. In February, The Washington Post, which is no doubt still benefiting from the press around the still-gold-standard journalism movie — 1976’s All the President’s Men — ran a Super Bowl ad narrated by Tom Hanks, applauding late journalists Marie Colvin and Jamal Khashoggi, who, in its words, brought the story “no matter the cost.” The spot highlighted what we already know, which is that we need journalism to be a functioning democracy and that many journalists risk their lives to guarantee it. What it kept in darkness (ha), however, was that to do their job properly, those journalists need protection and they need resources — provided by their editors and by their publishers. Hanks, of course, starred in The Post, Steven Spielberg’s 2017 film based on the journalists who reported on the Pentagon Papers in 1971. The ad was using the past to promote the future, rather than dealing with a present in which more than 2,400 people lost media jobs in the first three months of the year and journalists are trying to unionize en masse. But that’s not particularly telegenic, is it?
* * *
The romanticized idea of the journalist — dogged, trenchcoated — really took off at the movies. In 1928, ex-reporters Ben Hecht and Charles MacArthur wrote a play which was adapted into The Front Page, a 1931 screwball that became the journalism movie prototype, with fast dialogue and faster morals. My favorite part is that not only is the star reporter trying to quit the paper (in this economy?), but his editor will do anything — including harboring an accused murderer — to keep him on staff. Matt Ehrlich, coauthor of Heroes and Scoundrels: The Image of the Journalist in Popular Culture, once told me for Maclean’s that The Front Page came out of the “love-hate relationship” the writers had with the industry even back then. “The reporters are absolute sleazebags, they do horrible things,” he said. “At the same time The Front Page makes journalism seem very exciting, and they do get the big scoop.” Ehrlich also told me that some initially thought All the President’s Men, which would eventually become the genre’s modern touchstone, was reminiscent of that earlier era. In case you are not a journalist and so haven’t seen it, Robert Redford and Dustin Hoffman starred as Bob Woodward and Carl Bernstein, The Washington Post reporters whose stories on the Watergate burglary and subsequent cover-up helped lead to President Nixon’s resignation. While the film also played fast and loose with the truth, it had a veneer of rumpled repetitious reality — not to mention a strong moral core that made taking down the president with a typewriter seem, if implausible, at least not impossible.
In February, Education Week reported that a survey of 500 high school journalism teachers across 45 states found that, in the past two years, 44 percent of those teachers had seen a rise in journalism enrollment, along with a 30 percent increase in interest in journalism higher education. “This is this generation’s Watergate,” the executive director of the National Scholastic Press Association said. “With President Trump, everyone is really in tune to the importance of a free press.” Sure. But this isn’t 1976. No doubt there are scores of WoodSteins out there, but not only do a number of journalists no longer have the resources or the time to follow stories of any kind, they rarely have the salaried staff positions to finance them, nor the editors and publishers to support them doing the job they were hired to do. In All the President’s Men, executive editor Ben Bradlee asks WoodStein if they trust their source, before muttering “I can’t do the reporting for my reporters, which means I have to trust them. And I hate trusting anybody.” Then he tells them to “Run that baby.” These days there is little trust in anything beyond the bottom line.
The myth is that All the President’s Men led to a surge of interest in journalism as a career. But in reality it was women, increasingly educated post-liberation, whose interest explained the surge. (My editor is asking: “Is it an accident that shitting on journalism as a worthy profession coincided with women moving into journalism?” My reply is: “I think not.”) Still, women remain underrepresented in the field to this day, a fact reflected by the paucity of movies about the work of female journalists. While there were scores of ’70s and ’80s thrillers built around male reporters with too much hair taking down the man, for the women … there was The China Syndrome, with Jane Fonda as a television reporter named Kimberly covering a nuclear power plant conspiracy. And, um, Absence of Malice? Sally Field is a newspaper reporter who sleeps with her subject (I mean, it is Paul Newman). I guess I could include Broadcast News, which stars Holly Hunter as a neurotic-but-formidable producer and personified the pull between delivering the news and delivering ratings (the analog version of clicks). But Network did that first and more memorably, with its suicidal anchorman lamenting the demise of media that matters. “I’m a human being, GODDAMN IT!!!” he shouts into the void. “My life has value!!!” You don’t hear female journalists saying that on-screen, though you do hear them saying “I do” a whole lot.
The quintessential journalism film and the quintessential rom-com are in fact connected. Nora Ephron, who was briefly married to Carl Bernstein, actually cowrote an early script for All the President’s Men. While it was chucked in favor of William Goldman’s, she went on to write When Harry Met Sally, and I’ll forgive you for not remembering that Sally was a journalist. She probably only mentions it twice because this was 1989, an era in which you decided to be a journalist and then you became one — the end. The movie treats reporting like it’s so stable it’s not even worth mentioning, like being a bureaucrat. Sally could afford a nice apartment, she had plenty of time to hang out with Harry, so what was there to gripe about (Good Girls Revolt would suggest Ephron’s trajectory was less smooth, but that’s another story)? Four years later, in Sleepless in Seattle, Meg Ryan is another journalist in another Ephron movie, equally comfortable, so comfortable in fact that her editor pays her to fly across the country to stalk Tom Hanks. This newspaper editor literally assigns a reporter to take a plane to Seattle from Chicago to “look into” a possible lifestyle story about a single white guy. (Am I doing something wrong?!?!)
Journalism and rom-coms were fused from almost the start, around the ’30s and ’40s. The Front Page went from being a journalism movie to being a rom-com when it turned its hero into a heroine for His Girl Friday. The reporter repartee and the secretive nature of the job appeared to lend themselves well to Hays-era screwballs, though they also indelibly imprinted a lack of seriousness onto their on-screen female journalists. After a brief moment in the 1970s when The Mary Tyler Moore Show embodied the viability of a woman journalist who puts work first, the post-Ephron rom-coms of the 2000s were basically glossy romances in “offices” that were really showrooms for a pink-frosted fantasy girl-reporter gig no doubt thought up by male executives who almost certainly saw All the President’s Men and almost certainly decided a woman couldn’t do that and who cares anyway because the real story is how you’re going to get Matthew McConaughey to pop the question. I can’t with the number of women who recently announced that 13 Going on 30 — the movie in which Jennifer Garner plays a literal child successfully running a fashion magazine — made them want to be journalists. But the real death knell of the aughts journo-rom-com, according to rom-com columnist Caroline Siede, came in 2003 with How to Lose a Guy in 10 Days. In that caper, Kate Hudson has a job as a columnist despite thinking it is completely rational to write a piece called “How to Bring Peace to Tajikistan” for her Cosmo-type fashion magazine.
* * *
In 2016, the Oscar for Best Picture went to Spotlight, which follows The Boston Globe’s titular investigative team — three men, one woman — as it uncovers the Catholic Church abuse scandal. The film earned comparisons to All the President’s Men for its focus on journalistic drudgery, but it also illustrated the growing precariousness of the newsroom with the arrival of the web. In one scene, executive editor Marty Baron expresses shock when he is told it takes a couple of months for the team to settle on a story and then a year or more to investigate it. At the same time, Baron and two other editors are heavily involved and supportive of the three reporters, who went on to win the Pulitzer in 2003 and remained on the team for years after. Released only 12 years after the fact, the film suggested that journalists who win Pulitzers have some kind of security, which, you know, makes sense, and is maybe true at The Boston Globe. But two years after Spotlight came out, David Wood, who had won HuffPost its only Pulitzer, was laid off. As one of BuzzFeed’s reporters told The Columbia Journalism Review after BuzzFeed shed 15 percent of its staff, “It’s this sense that your job security isn’t tied to the quality of your work.”
“We have so much to learn from these early media companies and in many ways it feels like we’re at the start of another formative era of media history where iconic companies will emerge and thrive for many decades,” BuzzFeed founder and CEO Jonah Peretti blew hard in a memo in 2014, referring to traditional outfits like Time and The New York Times. But both those publications have unions, something Peretti has been clear he doesn’t think “is right” for his company. “A lot of the best new-economy companies are environments where there’s an alliance between managers and employees,” he said in 2015. “People have shared goals.” In this case the shared goals seem to be that Peretti profits (his company was valued at more than $1 billion in 2016) while his staff is disposable.
Which brings us back to the Globe in 2019. That is to say the real one, not the romanticized one. This version of the Globe hires a Gonzo-esque leftist political writer named Luke O’Neil as a freelancer and publishes his “controversial” op-ed about the Secretary of Homeland Security’s resignation titled “Keep Kirstjen Nielsen unemployed and eating Grubhub over her kitchen sink.” “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon,” it opened, and it concluded with, “As for the waiters out there, I’m not saying you should tamper with anyone’s food, as that could get you into trouble. You might lose your serving job. But you’d be serving America. And you won’t have any regrets years later.” The article was gone by Friday, pulled upon the request of the paper’s owners (O’Neil sent me the original). According to WGBH, a now-deleted note on the opinion page stated that the article “did not receive sufficient editorial oversight and did not meet Globe standards. The Globe regrets its lack of vigilance on the matter. O’Neil is not on staff.” And, oh, man, that last line. It says everything there is to say about modern journalism that is unspoken not only on-screen but by the culture at large and the media in it. It says you serve us but we provide no security, no benefits, no loyalty. It says, unlike Spotlight or All the President’s Men or even The Front Page, we do not have your back. Because if they did, you better believe it would have a good chance of ending up on-screen.
Nina Li Coomes | Longreads | April 2019 | 14 minutes (3,609 words)
A month after Donald Trump is inaugurated president, my mother visits me in Boston. I have lived in the city for only a month, and my apartment is furnished, but barely. During the day, while I sit in a windowless office, my mother drags a suitcase down snowy Commonwealth Avenue to TJ Maxx, where she fills the rolling bag with comforting objects: a teal ceramic pitcher; a wire kitchen cart; a swirling, blue-and-white rug. She makes at least three trips down the hill to the store and back again.
When she is not buying knickknacks, she scrubs my buckling apartment floors. She wrings a rag in warm water, palms it over the wood, her posture and form impeccable as usual. Though I’d beg her not to do this, her actions make sense. For the 20 years we have lived in the United States, my mother has made a ritual of scrubbing the floors of all of our homes. In our first American house, in the unwelcoming cornfields of Illinois, I would know that all was well if I came through the front door to see the warm gleam of freshly scrubbed wood. In my parents’ house in Chicago, if I ever walked across the kitchen in my shoes by accident or, more likely, in a careless hurry, guilt would course down my back, the memory of her hunched by the radiator busily scrubbing flooding my mind. After college, when I lived in New York, she visited me there and insisted on getting down on her hands and knees again, though my roommate had a dog who shed constant, ungrateful clouds of black fur, making a clean floor impossible. In each place we have lived, no matter where we are, my mother has labored over the floor to make it home.
* * *
I was born in Japan to a Japanese mother and a white American father. After my birth, my parents sent an application to the U.S. consulate for my American citizenship. The application included my Japanese birth certificate and an accompanying English translation, proof of their marriage in both languages, as well as proof of my father’s U.S. citizenship. My mother’s status as an ethnically Japanese national qualified me for Japanese citizenship upon birth. I have always been a dual citizen of both the United States and Japan.
As a child, I bragged about this status to my peers. I had two countries I could claim as my own, I would crow, two places to call home. My parents often chided me for this bragging, but my willful girl-self ignored them. Though my status as mixed race was most often confusing and other times painful, this was one place I found pride, a jolt of pleasure pulsing through my hands as I touched the spines of one blue and one red passport, both with my name emblazoned on the inside. At the customs kiosk in airports, I liked the momentary juggle my parents did, swapping out our U.S. passports for Japanese ones in Tokyo, and back again in Chicago. All of the coming and going resulted in my American passport looking like an absurdist travel log, appearing as if I left the country and came back a month later without ever entering another country. Though I was only ever just shuttling between the same two nations to visit one set of grandparents or another, childishly I imagined my dual citizenship as a secret mission, a doorway into which I could walk and disappear, existing in secret for a short while. Other times, my passports felt like a double-headed key, easing the pain of leaving one home with the improbable solution of arriving at a different one. My passports — their primary-colored bindings, their grainy texture and heavy pages, these were magical tokens of my childish belief in my double-belonging.
Dual citizenship is technically legal in Japan only until the age of 22, at which point an individual is required to make a “declaration of citizenship,” effectively asking dual citizens to give up their claim on at least one of their countries of origin. There are, of course, ways around this. An estimated 700,000 dual citizens past the age of 22 live in Japan, though this number is probably skewed by how few of them are willing to come forward about their legal status. Some dual citizens choose never to declare, trusting the inefficiencies of a labyrinthine bureaucracy to forget about legal technicalities. Others make their declaration in remote locations far from metropolises like Tokyo or Osaka, hoping that less-urban officials will not take the time to ask for a renunciation of non-Japanese passports. Some, like me, renew their passport on the eve of their 22nd birthday, effectively buying another four years to weigh the choice, hoping that the laws might shift to allow for legally sustained dual citizenship.
* * *
In Japan, a person obtains citizenship not by birthplace but by blood: This is called jus sanguinis citizenship, or citizenship as defined by the “right of blood.” It does not matter if you are born in the country or out of it. You are only a citizen if you have at least one parent whose blood can be classified as Japanese. (There are some exceptions based on naturalization and statelessness.) Requiring Japanese blood as a tenet of citizenship implies that there is such a thing; that Japaneseness can be traced back to one, biologically determined race. In 2008, conservative lawmakers proposed that DNA testing become part of the process necessary to determine Japanese citizenship, suggesting that biological markers could identify Japanese blood over foreign blood. Though the proposal was ultimately thrown out on grounds of logistical and financial impossibility, it lays bare the use of Japanese citizenship to promote a Japanese ethnostate. Simply put, to Japan, an ideal citizen is someone who is 100 percent racially Japanese.
In the United States, people become citizens through a combination of jus sanguinis, “right of blood,” and jus soli, “right of soil.” If you are born within the boundaries of the United States of America, or born to a parent who is a U.S. citizen, you are granted U.S. citizenship. This idea is introduced in the 14th Amendment of the Constitution: “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside.” It is tempting to say that the U.S. is egalitarian, that it is not founded on ethnocentrism, but the citizenship clause of the 14th Amendment was written only as a result of the Civil War. It granted citizenship to Black Americans nearly a century after the nation’s founding and in many ways did so in name only.
Though Asian Americans were granted citizenship in 1898, the Chinese Exclusion Act of 1882 ensured that immigrant laborers were not given easily accessible avenues to permanent citizenship. By the same token, Supreme Court cases in the 1920s (Ozawa v. United States and United States v. Bhagat Singh Thind) established a further precedent barring Asians from naturalizing as citizens on account of their not being “free white persons.” The “free white persons” clause of naturalization in U.S. law was dissolved in 1952, but strict immigration quotas continued to be official policy until 1965. Before 1924, Native Americans were considered citizens only if they could be taxed, had served in a war, had married a white person, or had disavowed their tribal allegiance. By the time the Indian Citizenship Act of 1924 passed, most had already followed these alternate paths to citizenship, and even then, states with large Native American populations refused to grant citizenship to those populations for fear of the Native American vote. It took almost 25 years for the Indian Citizenship Act to be adopted by all 50 of the United States of America.
No matter the intention of our Founding Fathers or the text of the 14th Amendment, citizenship in the United States is complicated, fraught; at once given and taken away, fickle and traitorous, seemingly color-blind and yet in service to a majority of “free white persons.”
This duplicity isn’t unique to the United States or Japan. It is the nature of citizenship to uphold humanity while simultaneously denying it. For the Roman philosopher Cicero, one of the first to consider the idea of the citizen, this duality was best explained as a trade-off between citizen and state. In return for completing certain civic responsibilities (say, paying your taxes and following road signs), citizens are offered rights: protection from the state, the ability to claim nationality, and the like. Nearly two thousand years later, German-born American philosopher and writer Hannah Arendt echoed this same sentiment by famously calling citizenship “the right to have rights.” In her view, citizenship was a necessary vehicle to deliver human rights. Simply being human didn’t give you access to things like life and liberty. One needs a state to fulfill them. Taken backwards, this implies that without a government’s acknowledgement of citizenship, a person can be stripped of the rights inherent to their existence. In other words, if you’re not a citizen, you’re not fully a person.
* * *
At the end of my mother’s Boston visit, her busy homemaking and floor-scrubbing now complete, I take her to a donut shop for breakfast. Inside, a Cambodian family slips rings of hot fried dough glazed in honey into paper envelopes, handing them to construction workers, police officers, and university students. Behind the counter, on the other side of the kitchen door, no English exists. Instead, Cambodian wafts, punctuated by laughter and sighs, tossed by the woman pouring coffee with her hand balled at her hip, the smiling man behind the counter, the surly teenager bussing half-finished plates of buttery scrambled eggs. Above the cash register proud signs hang declaring the store a “Boston Favorite,” a “Chosen Community Partner,” and the recipient of numerous other local awards.
At our sticky table, I find myself unexpectedly moved. Passing by the donut shop on my daily commute, I had assumed that the curly pink neon signage, a relic from the ’50s preserved on a triangular storefront, was surely the property of a white family. Instead what I found was a family of Southeast Asian immigrants, making a classic American food and serving it in their own fashion with aplomb. The donut shop seemed unconcerned with assimilation. Months later, I’d take my sister to the same donut shop and she’d say that she was confused: the decor inside made her feel like she should be eating some sort of noodles, but instead she was eating a chocolate glazed cake donut.
As a rule, I am skeptical of the American Dream. I’m suspicious of what it sells and at what cost. What does it mean to believe in “life, liberty, and the pursuit of happiness” when the state reserves the right to take it away at a moment’s notice, to inter you and your family for looking like the enemy? What is freedom if it is a specific, circumscribed kind of freedom? A labored freedom? An unfair freedom? A tilted, volatile, violent freedom?
But at the donut shop, picking apart a vanilla-and-chocolate twist, I see a glimpse of what this country might offer: a promise of evolution, integrity, and acceptance. Perhaps this is what belonging in this country might mean, at its best: that something as classically American as a 1950s corner donut store could be taken over by a family of refugees from Southeast Asia without pomp or angst. That the store and the family that run it can exist without concerning themselves with assimilating to a white American standard, but instead remain rooted in their own traditions and languages. Sitting at the corner table with my mother, I feel as if happiness, freedom, equality, these are hard to come by and elusive. But change, the potential for newness and its embrace, these might yet flourish. These prospects feel solid, somehow, steady and unconditional, vivacious in comparison to the pale two-faced promise of a passport. A hint that perhaps making a home for oneself actually has nothing to do with the cold detachment of a customs official, and more to do with the warmth of feeding your kin on a cold morning.
* * *
Here is how I once passed through customs in Tokyo:
After 14 hours of sitting in an economy class seat, the overhead bin bumping precariously along to turbulence, sleep evasive and slippery, I am greasy and dry-eyed. Everything feels dreamlike. Time moves in stilted seconds, late afternoon sunlight pouring in through pristine panels of glass when my mind is clamoring that it ought to be night. Passengers are herded like badly behaved cattle along moving walkways, the robotic woman’s voice telling us to please watch our step. The path curves, and soon the windows are replaced by gray walls and fluorescent lights. I continue to trudge forward, dragging my stubbornly lagging suitcase. On the walls are signs advertising caution about various strains of influenza.
At customs, placards hang from the ceiling, directing the flight crew to the right, followed by foreigners and tourists, with Japanese nationals and permanent residents filing to the far left. I take my place in the line to the left, feeling at once indignant and like an imposter. An anxious, scrambling feeling chases its tail under my collarbone. As I approach the sunken booth, I try to sound as local as I can, hoping that the country-bumpkin slur of my words will score me a point in the invisible tally of Japaneseness I imagine each customs official keeping. I answer questions about where I am staying, why I am here. Images of the kerosene stove in my grandmother’s front room, my grandfather’s furled fists, their unruly garden — these blossom in my mind, a talisman of home to hold tightly under my breath. Believe me, I pray, believe that I belong here. Inside my backpack, I can feel my other passport, my other citizenship, pulsating like a treacherous living thing.
* * *
It is not lost on me that the language of citizenship traffics in metaphors of life and death, but delivers only promises and rumors. We are given weighty, destiny-scaled ultimatums, discussions of blood and soil evoking images of birth and death, sustenance and longevity. Identification implies belonging, our membership in a country playing on notions of larger, state-bound families. The nation is our mother. The nation is our father. In giving us the gift of citizenship, it has labored to give us life and will lay us weeping in the ground.
But in delivery, citizenship becomes elusive and hard to pin down. It is promised to us with outstretched arms, then snatched away with ease. We are assured home and kinship; we arrive to find an empty house. We are drawn to the visage of a guardian — “Give me your tired, your poor, your huddled masses yearning to breathe free” — but we are greeted by a ghost.
* * *
After finishing our breakfast at the donut shop, my mother and I take a cab to Logan Airport so she can catch her flight home to Chicago. When we arrive, I help her check in and walk her to the TSA-cordoned security area. She waves me away at the mouth of the line, the oblong maze of tangled tape empty at this apparently unpopular time to fly. “Go,” she says. I shake my head, watching her hoist her navy canvas bag over one shoulder, taking mincing steps through the open line in front of her. This shooing-and-staying, like the floor-washing, is another one of our family’s traditions. Whenever one of us leaves their home, whether it is in Japan or the U.S., whoever they are leaving staunchly refuses to leave the side of the security line until they can no longer see them. This staying put is an act of loyalty, of love, of claiming each other as our own. We are stating that no border crossing, no officialdom, no distance or space can slice its way through our bonds.
That day I watch my mother’s small body turn even smaller in the distance, and I feel a familiar animal anxiety dig its claws into my chest. Earlier that week, crowds of people poured into U.S. airports, protesting Donald Trump’s travel ban. Scenes of lobbies filled with protesters flooded televisions, mouths moving in angry unison on muted screens. Reports of families separated at customs, of loved ones canceling plans to visit their relatives in the U.S., patients unable to access American hospitals — these are the stories that dominated the news cycle.
Suddenly, as if someone had passed a transparency over my eyes, I see the TSA agent taking a closer look at my mother’s green card. I imagine his voice, meaty and rough when raised. I imagine my mother’s English, flattening as frustration creeps into her voice. I imagine what I might do if someone emerged from the wings of the security booth to grab her by the arm, roughly escorting her to a private room. I imagine whether I would shout, run, or stay rooted to the spot. At least she would be OK in Japan, a small voice, at once guilty and relieved, says inside me.
My mother passes through the security checkpoint without incident. She waves from behind the metal detector, her hand cleaving a wide, swinging arc in the air.
* * *
Citizenship comes into sharp relief at the most important junctures of life. Two years after my mother’s visit to Boston, my now-husband and I go to the Cook County Clerk’s office, in Chicago, to obtain our marriage license. We are presented with a list of appropriate documents to prove our citizenship — driver’s licenses, passports, birth certificates. Above us, a looming sign proclaims: COOK COUNTY CLERK | BIRTH MARRIAGE DEATH. Birth, marriage, death: To be acknowledged, all these require proof of belonging to a nation. Plunking down my own driver’s license, I wonder what one does without the proper identification. A man ahead of us in line is turned away for not having the correct paperwork to claim his infant daughter’s birth certificate. Without the necessary government-issued credentials, no matter how strange it seemed, he could not receive proof that his daughter now existed outside the womb. Without citizenship, could you be born? Without it, could you die?
My wondering is of course born of a certain kind of privilege. Undocumented and stateless people know exactly what it is like to live without citizenship. People dear to me have struggled for acknowledgement in the eyes of a mercurial state, which grants and revokes rights with the turn of an administration. In many ways I am lucky to be presented with the conundrum of citizenship only after 22 years of dual citizenship. I have had not one but two homes.
* * *
On my most recent trip home to Japan, this time to celebrate my new marriage with my family, I exited the plane groggy and barely awake. I followed the familiar corridor, the paneled light flickering, the woman’s voice telling us to mind the gap. Passengers plodded on, all of us filing forward to customs, noting the warnings for newer, more varied strains of flu. This time, I did not take the far left lane. Instead, I entered the country for the first time on a U.S. passport, my lapsed Japanese one tucked in my backpack, safely away from questions of allegiance, loyalty, and citizenship. A small part of me was relieved to filter through the droning line of tourists, no need to prove my worthiness of entry to a stony-faced official. A larger part of me wallowed in a shallow sadness, as if a pale premonition of grief, suspecting that this might be the first step toward exile.
Why do you speak Japanese so well? the man at customs barked, suspicious. Because my mother is Japanese, I answered, the image of her running a rag over my Boston floors, the homes she has created the world over for us, blurring my vision. Is this your only passport? he jabbed a finger at my solitary blue book. Yes, I smiled, three red booklets pulsing against my back.
* * *
Nina Li Coomes is a Japanese and American writer from Nagoya and Chicago. Her work can be found in The Atlantic, EATER, Catapult and elsewhere.
The Stelton colony in central New Jersey was founded in 1915. Humble cottages (some little more than shacks) and a smattering of public buildings ranged over a 140-acre tract of scrubland a few miles north of New Brunswick. Unlike America’s better-known experimental settlements of the nineteenth century, Stelton was not a refuge for a devout religious sect but a hive of political radicals, where federal agents came snooping during the Red Scare of 1919–1920. But it was also a suburb, a community of people who moved out of the city for the sake of their children’s education and to enjoy a little land and peace. They were not even the first people to come to the area with the same idea: There was already a German socialist enclave nearby, called Fellowship Farm.
The founders of Stelton were anarchists. In the twenty-first century, the word “anarchism” evokes images of masked antifa facing off against neo-Nazis. What it meant in the early twentieth century was different, and not easily defined. The anarchist movement emerged in the mid-nineteenth century alongside Marxism, and the two were allied for a time before a decisive split in 1872. Anarchist leader Mikhail Bakunin rejected the authority of any state — even a worker-led state, as Marx envisioned — and therefore urged abstention from political engagement. Engels railed against this as a “swindle.”
But anarchism was less a coherent, unified ideology than a spectrum of overlapping beliefs, especially in the United States. Although some anarchists used violence to achieve their ends, like Leon Czolgosz, who assassinated President William McKinley in 1901, others opposed it. Many of the colonists at Stelton were influenced by the anarcho-pacifism of Leo Tolstoy and by the land-tax theory of Henry George. The most venerated hero was probably the Russian scientist-philosopher Peter Kropotkin, who argued that voluntary cooperation (“mutual aid”) was a fundamental drive of animals and humans, and opposed centralized government and state laws in favor of small, self-governing, voluntary associations such as communes and co-ops.
Since the move to Douglas, Arizona, Jennifer had spent less and less time at home. She was distant and irritable. Her anger encompassed her mother, her mother’s abusive boyfriend Saul, American schools, and the whole United States. At the nadir, she started lashing out at her sisters Aida and Cynthia. And then, in 1998 or 1999, she left for good.
The morning Jennifer ran away, Aida was the only other person home. She watched her sister dump schoolbooks from her backpack and replace them with clothes. She knew what was happening without having to ask and figured it was for the best. On the way out, Jennifer said that a friend would drive her across the border. After that, she’d see what happened.
Chris Outcalt | Longreads | March 2019 | 13 minutes (3,723 words)
The helicopter takes off from a narrow patch of grass off the side of Route 2, about 30 miles southeast of Fairbanks, Alaska. The two-lane highway runs like an artery through the heart of the Alaskan interior, connecting the state’s third-most populous city to the outer reaches of North America. I’m riding shotgun in the lightweight, four-passenger chopper; Colorado State University (CSU) archeologist Julie Esdale is seated behind me. Esdale, who earned her Ph.D. in anthropology at Brown University, has spent more than a decade in this part of the state, exploring centuries of soil with a community of other social scientists whose aim is to weave together the tangled origins of humanity.
Fifty feet up, as the booming whop-whop of the propeller blades cuts through the air overhead, we crest a row of trees along the edge of the road, revealing a spectacular view: a massive, tree-lined valley framed to the west by the peaks of the Alaska Range, one of the highest stretches of mountains in the world. These jagged hills formed millions of years ago; shifting tectonic plates collided along the Denali and Hines Creek Faults, pushing the earth 20,000 feet into the air. Our destination lies about 10 miles into this lowland known as the Tanana Flats. Esdale and her colleagues believe the spot, a vestige of a 14,000-year-old hunter-gatherer encampment hidden deep in the earth, could hold important clues to better understanding the behavior of North America’s earliest inhabitants.
Esdale helped discover and excavate this important ground known as McDonald Creek, which turned out to be one of the oldest archeological sites in the country. Field crews found fragments of stone tools, charcoal dust left behind by ancient firepits, and remains of bison, mammoth, elk, and waterfowl. Admittedly, I hadn’t spent much time thinking about those who pioneered the landmass I’d lived on my entire life, let alone the particulars of their livelihood; but my interest was piqued at the thought of these scientists dedicating their professional lives to better understanding those who came before us, like a detective unit attempting to solve one of the first mysteries of mankind.
Esdale, who’s in her mid-40s and has straight, shoulder-length blond hair she often tucks under a ball cap out in the field, explained that Alaska is a hot spot for this research — a matter of both history and geography. The last ice age took hold about 2.6 million years ago. When the glaciers began to melt around 12,000 years ago, water levels in the Bering Strait rose, submerging the well-documented land bridge between what is now Russia and Alaska, an area known as Beringia. But before the thaw, early humans wandered across that bridge from Asia into North America. They were the first people to set foot in the New World, and they walked straight into what is today central Alaska.
“Early sites are hit and miss in the lower forty-eight,” Esdale told me. “But in the interior, we’ve got lots and lots of them.” Still, she said, Alaskan archeology, perhaps too far-flung to have slipped into the mainstream, was often overlooked in favor of research in the continental United States. Esdale’s husband, Jeff Rasic, also an Alaskan archeologist, told me he’d attended numerous national meetings of top researchers in the field and had often been struck by how little they tracked new findings in Alaska. “These are full-time academic archeologists,” Rasic said, “and they’re behind.” When I first contacted Esdale by phone last year, she said she’d be happy to show me around if I ever wanted to have a look up close.
By chance, I flew into Fairbanks two days ahead of the summer solstice, which brings nearly 24 hours of daylight to the region. When I landed close to midnight, the sky was bright enough that it could’ve easily been noon. (Later, I overheard that a popular American Legion baseball game was scheduled for the following night. First pitch: 12:01 a.m.) I met Esdale early the next morning. We stopped at the local Safeway for a coffee and to pack a lunch, then headed to the helicopter launch site. After about 15 minutes in the air, Esdale pointed to our landing spot, a prominent mound that jutted above the flat, wooded landscape.
As we approached, she explained that the scenery would’ve looked a lot different 14,000 years ago; the ground was still recovering from the ice age’s deep freeze and the trees hadn’t grown in yet. Nevertheless, I could see what the people who camped here back then were thinking. The high point of an otherwise flat area would’ve been a good place to look out for predators, scout prey for their next meal, or simply rest their legs and enjoy the view after a long walk. At least that last part, I thought, we had in common.
***
In Alaska, a state known for its expansive territory, the federal government is the largest landowner, controlling about 61 percent of the terrain. Most of that is allocated for public use and managed by the National Park Service and the Fish and Wildlife Service. There are other operators, however; notably, the United States Army oversees the use of about 1.5 million acres in the central part of the state.
Drawn to the open, undeveloped land and distinct climate, the military has maintained a presence in interior Alaska since the 1930s. Today, the local base is known as Fort Wainwright, “home of the Arctic Warriors.” During the frigid Alaskan winters, soldiers test gear, vehicles, and the limits of their own bodies in extreme cold. What’s more, with ample space, units can spread out and simulate wartime drills and construct practice bombing ranges. But although there are few neighbors to disturb, federal law — the National Historic Preservation Act and the Archeological Resources Protection Act — requires the military pay close attention to what might lie beneath the surface. In fact, given that the area is archaeologically rich, the Army funds a team of about half a dozen people who make sure it doesn’t trample any sensitive material — anything from stone tools or rock carvings to portions of structures or grave sites at least a century old. For the past eight years, Esdale has run the team.
Esdale first moved to Alaska in 2002 as a student, several years before getting the gig with the Army. She’d been conducting research for her Ph.D. in the far reaches of northwest Alaska when she met her husband out in the field. Not long after, Rasic got a job with the National Park Service based in Fairbanks; they made the move north together, two scientists in love headed for the Last Frontier. That first year they got a dog, a big, goofy lab who demanded a lot of time outside — even when it was 50 below and felt like your eyelids would freeze shut after a few minutes. Eventually, Esdale and Rasic had two boys and she got the contract with the Army. By then Fairbanks felt like home.
Although sharpshooting members of the armed forces and a crew of erudite scientists studying human history might seem like strange bedfellows, the partnership has identified hundreds of significant sites hidden in the Alaskan tundra. Take McDonald Creek, for example. Several years ago, the brass at Fort Wainwright proposed building a road through the Tanana Flats. A team headed by Colorado State’s Ned Gaines, which included Esdale, dug a few test pits while surveying in advance of the development. “Everywhere we put a shovel, we found artifacts,” Esdale said. The Army rerouted the planned road, and excavation of the site was turned over to Texas A&M researcher Kelly Graf.
I met Graf and her team of mostly graduate students last summer. From the clearing where our helicopter landed, Esdale and I walked a well-worn path to a sort of base camp — an area among the trees about 80 feet in diameter. The camp was surrounded by a small, pop-up electric fence designed to keep animals away, and there were dozens of water jugs and large plastic bear-proof storage containers that resembled beer kegs. About 10 people sat around in fold-out camping chairs and on tree stumps finishing their lunch. This was Graf’s fourth year digging at the remote location. One highlight, she said, was that they’d recently found what appeared to be a bone from a dog. Graf said the discovery could amount to evidence of the earliest known domesticated canine in North America. While we were talking, she wondered aloud whether these early people would have traversed Beringia via some sort of dogsled or used the animals to help shoulder the weight of their belongings.
After lunch, the group migrated to the nearby dig location, a large pit that looked as if someone had pressed a massive rectangular cookie cutter into the ground and discarded all the dirt in the middle. Excavating an archeological site is tedious work, a far cry from the escapades of the world’s most famous member of the trade, the fictional character Indiana Jones. Rather, it consists mainly of carefully scraping away layers of dirt with a trowel and cataloging any items for further examination and analysis. “Our goal as anthropologists — it’s not just about treasures, not just about finding stuff,” Esdale told me. “It’s to understand people.”
Scientists have learned a lot about the founding populations of Indigenous peoples who lived in this area, particularly about how they subsisted. These people were mobile, resourceful, and skilled — unquestionably successful big-game hunters who preyed on bison, elk, and maybe even mammoth. They used spears and a throwing device called an atlatl, a curved tool made from wood, bone, or ivory not unlike the plastic tennis ball throwers popular at dog parks today. Hunters used it to launch darts fashioned with a pointed stone tip. (The bow and arrow didn’t show up for another 12,000 years.) Flakes discarded during the sharpening of these points are often found in the soil at sites like McDonald Creek.
For her part, though, Graf hoped to find more than flakes. Carbon dating of charcoal left behind by campfires and preserved 10 feet underground suggested that people occupied this location three different times throughout history — 7,000, 13,000, and nearly 14,000 years ago — making it one of the oldest sites in Alaska. “It’s an interesting place,” Graf told me. “We’ve always been looking for the base camp of these people. There are a lot of hunting camps around, shorter-term sites, but somewhere they had to be hunkering down, where grandma and grandpa and the kids and the mom, where everyone was hanging out. That’s kind of what we’re wondering, because this is a nice, fixed spot.”
“So, this could be that type of place?” I asked.
“Could be,” she said. “Could be.”
***
On my second day in Fairbanks, Esdale introduced me to an archeologist in his mid-70s named Chuck Holmes. He had a full head of neatly parted gray hair and a trimmed white beard. Before we met, Esdale outlined Holmes’s long resume. He’d taught at multiple universities, enlightening undergrads and guiding Ph.D. candidates, and had held senior-level science jobs with both the state and federal governments. It all amounted to decades of research and discoveries in the region. Listening to Esdale, I got the impression she was describing a sort of grandfather of Alaskan archeology.
Holmes first came to Alaska from Florida, about as far away as you can get in the United States — a fact his mother made sure to note when Holmes told her he’d decided to enroll at the University of Alaska Fairbanks in 1970. Holmes had fallen for the state’s wide-open territory the year before. Thanks to a friend’s father who worked for one of the railroad companies, Holmes and his hometown pal landed summer jobs laying train track across the tundra. “My friend was a little less interested in doing that kind of work; I just saw it as an adventure,” Holmes said. “I got in good shape and got to see quite a bit of the state.” From that moment on, aside from brief stopovers in Calgary, Canada, and Washington state, Holmes has spent his life in Alaska.
Holmes told me that as a kid he’d always had a penchant for finding things, so it was perhaps no surprise that during his undergrad years in Fairbanks he found archeology. “I was hooked on Alaska at that point,” Holmes said. But it was something he discovered two decades later that Esdale wanted me to learn more about: another archeological site not too far from McDonald Creek. The spot was known as Swan Point, and it happened to be the oldest site with evidence of human activity not just in Alaska but in the entire United States.
Back then, in the early 1990s, Holmes worked for the Office of History and Archeology in Alaska’s Department of Natural Resources. One summer, he led a group of students digging at an already well-established site in the Tanana Valley. A couple of the kids involved in the excavation wanted to venture out to look for something new, so Holmes pulled out a couple of maps and a compass, essential tools for an archeologist in the days before Google Earth. He identified what looked like a promising topographic feature: a hill off in the woods that appeared high enough to function as a lookout point, but not so high that it would’ve deterred a group of hunter-gatherers from climbing to the top. Holmes told the students to check it out, dig a few holes, and see what they found.
On their first attempt, the kids had trouble pinpointing the right location. Holmes sent them back the next day with additional instructions, and this time they returned with wide grins. First, they handed Holmes a couple of small plastic bags containing flakes likely cleaved from a stone tool. Not bad, Holmes thought. That was enough to suggest the site was worthy of further exploration. The students, however, had one more bag to show off. This one contained a scrap of ivory. The hard, white material, typically part of a tooth or tusk, is much more difficult to find in the wild, particularly in a shallow test pit dug at a somewhat hastily selected point on a map. It was like plucking a needle you didn’t know existed from a haystack the size of Delaware.
Holmes and other researchers excavated Swan Point on and off for the next two decades. Carbon dating placed it at about 14,200 years old. Scientists uncovered all kinds of gems, including stone tools, bones from a baby mammoth, food-storage pits, and hearths where campfires once burned. The findings from Swan Point have been documented and published in numerous scientific papers, and in 2008 the government listed the site on the National Register of Historic Places. As it turned out, Holmes explained, much of the Swan Point technology was similar to what had been commonly found by scientists on the other side of the land bridge in Siberia, suggesting these people were related in some way. “These guys, we’re not really sure who the heck they are,” Holmes said, referring to whoever camped at Swan Point so long ago.
“They’re basically Asian; they are ancient folk,” he said. “But their genes carried into the New World.”
***
Later that day, after meeting Holmes, Esdale and I bumped along an overgrown, two-lane Jeep road that ran deep into the woods. We were headed toward another archeological site on Army lands, this one dating back about 13,000 years. The road dead-ended at a clearing atop a ridge with a view of a river and an open forest below. Esdale explained this location, aptly named Delta River Overlook, marked the first time that archeologists had found a Beringian site that humans appeared to have occupied in the winter. They could tell, she said, based on the existence of a specific tooth that had belonged to a baby bison — a molar that only erupts in the cold season.
Winters were lean times for humans 13,000 years ago. In addition to tracking larger animals and storing the frozen meat under rocks, hunters in these tribes set snares to trap small game for times when the weather made it challenging to venture too far from camp; at Delta River Overlook, for example, there’s evidence of grouse and ground squirrel. Staying warm was another challenge. Furs from big-game animals helped, but scientists are still piecing together the picture of what their shelters might’ve looked like that long ago. The best guess, based on ethnographic evidence, Esdale told me, is that families constructed dwellings by draping animal skins over a dome of flexible branches and packing the outside with snow for additional insulation.
The excavation of the Delta River site was led by a professor of archeology at the University of Alaska Fairbanks named Ben Potter. Potter was in China on a research trip when I visited Alaska, but I spoke with him on the phone later. Like Holmes, he’s made a number of important contributions to the Alaskan archeological canon. Potter’s body of work, however, contains one particularly notable entry: He uncovered the oldest human remains found to date at an archeological site in Alaska. The first finding occurred in 2010, after years of work at an 11,500-year-old site known as Upward Sun River.
Potter and his team were contracted in 2005 to conduct a survey ahead of a proposed railway expansion through Army lands 40 miles from Fairbanks. His crew dug a few test pits and found evidence of human activity. The rail project was eventually rerouted, and in 2009 Potter received a grant from the National Science Foundation to continue excavating and investigating the site. He made the startling discovery the following year. About a meter down, Potter’s crew found parts of a human skull; later analysis determined the bones had come from a cremated 3-year-old child. In 2013, the team went deeper into the site and found the remains of two infants. Extracting human remains from the ground in Alaska necessitates consulting with local Indigenous tribes, which maintain a notable presence in their ancestral lands in the state — about 100,000 people spread across at least four groups. With the support and cooperation of local tribal leaders, his team removed the bones and sent out a sample for genetic analysis. They published the results last year.
The DNA makeup revealed an entirely new population of Native peoples, a group Potter labeled “Ancient Beringians.” There were other important findings at Upward Sun River. For example, they discovered fish bones buried in a hearth, where hunters would’ve cooked their meat, which helped Potter and his team establish the earliest known human consumption of salmon in the Americas. Previously, scientists had thought this occurred near the ocean. “It wasn’t on the coast, it was in the deep interior rivers,” Potter said. “That’s pretty exciting.” But the conclusions drawn from the DNA analysis were by far the most significant: a previously unknown branch of ancient humans.
It was a substantial addition to the archeological record of the period. Although the general narrative about the early migration of people from Siberia to the Americas is mostly agreed upon, the specifics are subject to ongoing debate among social scientists. When exactly did these ancient people first arrive in Alaska? Did they settle down? If so, for how long? When did they colonize the rest of America? Did they travel inland or along the coast? What the DNA from Potter’s discovery and other analyses showed was that for a period of several thousand years the genetic code of early Indigenous people evolved in isolation, no longer mixing with the DNA of those who lived in eastern Asia. It also appeared that these Ancient Beringians were eventually separated from those who went on to colonize the rest of the Americas.
Two other groups of scientists have since published genetic evidence that Potter felt buttressed his work. The findings included, in part, a human DNA sample from a 12,600-year-old cave in Montana and a single tooth preserved from a 1949 dig at a 10,000-year-old site in western Alaska, hundreds of miles from Fairbanks. The tooth had long been forgotten, stashed away on a dusty shelf at a museum in Copenhagen, Denmark. It was found by, of all people, Esdale’s husband, Rasic. It turned out that the genetic makeup of the tooth matched that of the children from Upward Sun River.
“This actually clarifies quite a bit,” Potter told me when I followed up with him after the new papers were released. He walked me through the scenario he saw taking shape: People were likely living in Asia around 16,000 years ago. The glaciers began to melt and tribes migrated from western Beringia to Alaska around, say, 15,000 years ago. Then you have a split: Ancient Beringians sticking around Alaska and another group traveling south, either inland, along the coast, or both, entering the rest of the Americas. That second group, he said, looked to be a single population that spread quickly and later split into many lineages.
Talking with Potter about the DNA results and migration theories reminded me of a conversation Esdale and I had on our drive out to Delta River Overlook, the day before I left Alaska and flew back to the rest of the United States. We’d been talking about how, given the antiquity of their subject matter, archeologists are necessarily adept at spinning complex abstractions from limited evidence, whether it’s the shape of a microblade point or a scrap of an animal bone. It seemed to me, however, that this meant there was no endgame to this work — that it could go on forever, like trying to solve a massive jigsaw puzzle in which an untold number of pieces were destroyed eons ago. When I floated this thought to Esdale, she laughed. “Yeah, no, there’s never an endgame. The goal is just knowing more — to keep understanding.”
We continued along the Jeep road into the forest.
“I never really thought about it like that,” she said.
***
Chris Outcalt is a writer and editor based in Colorado.