When I was a kid, in the pre-internet days of the 1980s, my screen time was all about Nickelodeon. My favorite show was “You Can’t Do That on Television.” It was a kind of sketch show; the most common punchline was a bucket of green slime being dropped on characters’ heads. It was pretty dumb. It was also created by professional writers, actors, and crew, who were decently paid; many of them belonged to unions.
Today, my kids don’t have much interest in that sort of show. For them, TV mostly means YouTube. Their preferred channels collect memes and jokes from various corners of the internet. In a typical show, a host puts on goofy voices to read posts from r/ChoosingBeggars, a Reddit message board devoted to customers who make absurd demands of Etsy vendors. It’s significantly funnier than “You Can’t Do That on Television,” I admit. It also involves no unionized professionals.
In 2014, Rich Niemann, president and CEO of the Midwestern grocery company Niemann Foods, made the most important phone call of his career. He dialed the Los Angeles office of Shook Kelley, an architectural design firm, and admitted he saw no future in the traditional grocery business. He was ready to put aside a century of family knowledge, throw away all his assumptions, completely rethink his brand and strategy — whatever it would take to carry Niemann Foods deep into the 21st century.
“I need a last great hope strategy,” he told Kevin Kelley, the firm’s cofounder and principal. “I need a white knight.”
Part square-jawed cattle rancher, part folksy CEO, Niemann is the last person you’d expect to ask for a fresh start. He’s spent his whole life in the business, transforming the grocery chain his grandfather founded in 1917 into a regional powerhouse with more than 100 supermarkets and convenience stores across four states. In 2014, he was elected chair of the National Grocers Association. It’s probably fair to say no one alive knows how to run a grocery store better than Rich Niemann. Yet Niemann was no longer sure the future had a place for stores like his.
He was right to be worried. The traditional American supermarket is dying. It’s not just Amazon’s purchase of Whole Foods, an acquisition that trade publication Supermarket News says marked “a new era” for the grocery business — or the fact that Amazon hopes to launch a second grocery chain in 2019, according to a recent report from The Wall Street Journal, with a potential plan to scale quickly by buying up floundering supermarkets. Even in plush times, grocery is a classic “red ocean” industry, highly undifferentiated and intensely competitive. (The term summons the image of a sea stained with the gore of countless skirmishes.) Now, the industry’s stodgy old playbook — “buy one, get one” sales, coupons in the weekly circular — is hurtling toward obsolescence. And with new ways to sell food ascendant, legacy grocers like Rich Niemann are failing to bring back the customers they once took for granted. You no longer need grocery stores to buy groceries.
Niemann hired Kelley in the context of this imminent doom. The assignment: to conceive, design, and build the grocery store of the future. Niemann was ready to entertain any idea and invest heavily. And for Kelley, a man who’s worked for decades honing his vision for what the grocery store should do and be, it was the opportunity of a lifetime — carte blanche to build the working model he’s long envisioned, one he believes can save the neighborhood supermarket from obscurity.
Kevin Kelley, illustration by Vinnie Neuberg
Rich Niemann, illustration by Vinnie Neuberg
The store that resulted, Harvest Market, opened in 2016. It’s south of downtown Champaign, Illinois, out by the car dealerships and strip malls: 58,000 square feet of floor space mostly housed inside a huge, high-ceilinged glass barn. Its bulk calls to mind both the arch of a hayloft and the heavenward jut of a church. But you could also say it’s shaped like an ark, because it’s meant to survive an apocalypse.
Harvest Market is the anti-Amazon. It’s designed to excel at what e-commerce can’t do: convene people over the mouth-watering appeal of prize ingredients and freshly prepared food. The proportion of groceries sold online is expected to swell over the next five or six years, but Harvest is a bet that behavioral psychology, spatial design, and narrative panache can get people excited about supermarkets again. Kelley isn’t asking grocers to be more like Jeff Bezos or Sam Walton. He’s not asking them to be ruthless, race-to-the-bottom merchants. In fact, he thinks that grocery stores can be something far greater than we ever imagined — a place where farmers and their urban customers can meet, a crucial link between the city and the country.
But first, if they’re going to survive, Kelley says, grocers need to start thinking like Alfred Hitchcock.
* * *
Kevin Kelley is an athletic-looking man in his mid-50s, with a piercing hazel gaze that radiates thoughtful intensity. In the morning, he often bikes two miles to Shook Kelley’s office in Hollywood — a rehabbed former film production studio on an unremarkable stretch of Melrose Avenue, nestled between Bogie’s Liquors and a driving school. Four nights a week, he visits a boxing gym to practice Muay Thai, a form of martial arts sometimes called “the art of eight limbs” for the way it combines fist, elbow, knee, and shin attacks. “Martial arts,” Kelley tells me, “are a framework for handling the unexpected.” That’s not so different from his main mission in life: helping grocery stores develop frameworks for the unexpected, too.
You’ve never heard of him, but then it’s his job to be invisible. Kelley calls himself a supermarket ghostwriter: His contributions are felt more than seen, and the brands that hire him get all the credit. Countless Americans have interacted with his work in intimate ways, but will never know his name. Such is the thankless lot of the supermarket architect.
A film buff equally fascinated by advertising and the psychology of religion, Kelley has radical theories about how grocery stores should be built, theories that involve terms like “emotional opportunity,” “brain activity,” “climax,” and “mise-en-scène.” But before he can talk to grocers about those concepts, he has to convince them of something far more elemental: that their businesses face near-certain annihilation and must change fundamentally to avoid going extinct.
“It is the most daunting feeling when you go to a grocery store chain, and you meet with these starched-white-shirt executives,” Kelley tells me. “When we get a new job, we sit around this table — we do it twenty, thirty times a year. Old men, generally. Don’t love food, progressive food. Just love their old food — like Archie Bunkers, essentially. You meet these people and then you tour their stores. Then I’ve got to go convince Archie Bunker that there’s something called emotions, that there are these ideas about branding and feeling. It is a crazy assignment. I can’t get them to forget that they’re no longer in a situation where they’ve got plenty of customers. That it’s do-or-die time now.”
Forget branding. Forget sales. Kelley’s main challenge is redirecting the attention of older male executives, scared of the future and yet stuck in their ways, to the things that really matter.
“I make my living convincing male skeptics of the power of emotions,” he says.
Human beings, it turns out, aren’t very good at avoiding large-scale disaster. As you read this, the climate is changing, thanks to the destructively planet-altering activities of our species. The past four years have been the hottest on record. If the trend continues — and virtually all experts agree it will — we’re likely to experience mass disruptions on a scale never before seen in human history. Drought will be epidemic. The ocean will acidify. Islands will be swallowed by the sea. People could be displaced by the millions, creating a new generation of climate refugees. And all because we didn’t move quickly enough when we still had time.
You know this already. But I bet you’re not doing much about it — not enough, at least, to help avert catastrophe. I’ll bet your approach looks a lot like mine: worry too much, accomplish too little. The sheer size of the problem is paralyzing. Vast, systemic challenges tend to short-circuit our primate brains. So we go on, as the grim future bears down.
Grocers, in their own workaday way, fall prey to the same inertia. They got used to an environment of relative stability. They don’t know how to prepare for an uncertain future. And they can’t force themselves to behave as if the good times are really going to go away — even if, deep down, they know it’s true.
In the 1980s, you could still visit almost any community in the U.S. and find a thriving supermarket. Typically, it would be a dynasty family grocery store, one that had been in business for a few generations. Larger markets usually had two or three players, small chains that sorted themselves out along socioeconomic lines: fancy, middlebrow, thrifty. Competition was slack and demand — this is the beautiful thing about selling food — never waned. For decades, times were good in the grocery business. Roads and schools were named after local supermarket moguls, who often chaired their local chambers of commerce. “When you have that much demand, and not much competition, nothing gets tested. Kind of like a country with a military that really doesn’t know whether their bullets work,” Kelley says. “They’d never really been in a dogfight.”
It’s hard to believe now, but there was not a single Walmart on the West Coast until 1990. That decade saw the birth of the “hypermarket” and the beginning of the end for traditional grocery stores — Walmarts, Costcos, and Kmarts became the first aggressive competition supermarkets ever really faced, luring customers in with the promise of one-stop shopping on everything from Discmen to watermelon.
The other bright red flag: Americans started cooking at home less and eating out more. In 2010, Americans dined out more than in for the first time on record, the culmination of a slow shift away from home cooking that had been going on since at least the 1960s. That trend is likely to continue. According to a 2017 report from the USDA’s Economic Research Service, millennials shop at food stores less than any other age group, spend less time preparing food, and are more likely to eat carry-out, delivery, or fast food even when they do eat at home. But even within the shrinking market for groceries, competition has stiffened. Retailers not known for selling food increasingly specialize in it, a phenomenon called “channel blurring”; today, pharmacies like CVS sell pantry staples and packaged foods, while dollar stores like Dollar General are a primary source of groceries for a growing number of Americans. Then there’s e-commerce. Though only about 3 percent of groceries are currently bought online, that figure could rocket to 20 percent by 2025. From subscription meal-kit services like Blue Apron to online markets like FreshDirect and Amazon Fresh, shopping for food has become an increasingly digital endeavor — one that sidesteps traditional grocery stores entirely.
A cursory glance might suggest grocery stores are in no immediate danger. According to the data analytics company Inmar, traditional supermarkets still have a 44.6 percent market share among brick-and-mortar food retailers. And though a spate of bankruptcies has recently hit the news, there are actually more grocery stores today than there were in 2005. Compared to many industries — internet service, for example — the grocery industry is still a diverse, highly varied ecosystem. Forty-three percent of grocery companies have fewer than four stores, according to a recent USDA report. These independent stores sold 11 percent of the nation’s groceries in 2015, a larger collective market share than successful chains like Albertson’s (4.5 percent), Publix (2.25 percent), and Whole Foods (1.2 percent).
But looking at this snapshot without context is misleading — a little like saying that the earth can’t be warming because it’s snowing outside. Not long ago, grocery stores sold the vast majority of the food that was prepared and eaten at home — about 90 percent in 1988, according to Inmar. Today, their market share has fallen by more than half, even as groceries represent a diminished proportion of overall food sold. Their slice of the pie is steadily shrinking, as is the pie itself.
By 2025, the thinking goes, most Americans will rarely enter a grocery store. That’s according to a report called “Surviving the Brave New World of Food Retailing,” published by the Coca-Cola Retailing Research Council — a think tank sponsored by the soft drink giant to help retailers prepare for major changes. The report describes a retail marketplace in the throes of massive change, where supermarkets as we know them are functionally obsolete. Disposables and nonperishables, from paper towels to laundry detergent and peanut butter, will replenish themselves automatically, thanks to smart-home sensors that reorder when supplies are low. Online recipes from publishers like Epicurious will sync directly to digital shopping carts operated by e-retailers like Amazon. Impulse buys and last-minute errands will be fulfilled via Instacart and whisked over in self-driving Ubers. In other words, food — for the most part — will be controlled by a small handful of powerful tech companies.
The Coca-Cola report, written in consultation with a handful of influential grocery executives, including Rich Niemann, acknowledges that the challenges are dire. To remain relevant, it concludes, supermarkets will need to become more like tech platforms: develop a “robust set of e-commerce capabilities,” take “a mobile-first approach,” and leverage “enhanced digital assets.” They’ll need infrastructure for “click and collect” purchasing, allowing customers to order online and pick up in a jiffy. They’ll want to establish a social media presence, as well as a “chatbot strategy.” In short, they’ll need to become Amazon, and they’ll need to do it all while competing with Walmart — and its e-commerce platform, Jet.com — on convenience and price.
That’s why Amazon’s acquisition of Whole Foods Market was terrifying to so many grocers, sending the stocks of national chains like Kroger tumbling: It represents a future they can’t really compete in. Since August 2017, Amazon has masterfully integrated e-commerce and physical shopping, creating a muscular hybrid that represents an existential threat to traditional grocery stores. The acquisition was partially a real estate play: Whole Foods stores with Prime lockers now act as convenient pickup depots for Amazon goods. But Amazon’s also doing its best to make it too expensive and inconvenient for its Prime members, who pay $129 a year for free two-day shipping and a host of other perks, to shop anywhere else. Prime members receive additional 10 percent discounts on select goods at Whole Foods, and Amazon is rolling out home grocery delivery in select areas. With the Whole Foods acquisition, then, Amazon cornered two markets: the thrift-driven world of e-commerce and the pleasure-seeking universe of high-end grocery. Order dish soap and paper towels in bulk on Amazon, and pick them up at Whole Foods with your grass-fed steak.
An Amazon worker wheels back a cart after loading a bag of groceries into a customer’s car at an AmazonFresh Pickup location in Seattle. (AP Photo/Elaine Thompson, File)
Ingredients from a three-meal Blue Apron box. (AP Photo/Bree Fowler)
An employee of grocery delivery service Amazon Fresh scans ordered products before putting them into a transport bag. (Monika Skolimowska/picture-alliance/dpa/AP Images)
Traditional grocers are now expected to offer the same combination of convenience, flexibility, selection, and value. They’re understandably terrified by this scenario, which would require fundamental, complex, and very expensive changes. And Kelley is terrified of it, too, though for a different reason: He simply thinks it won’t work. In his view, supermarkets will never beat Walmart and Amazon at what they do best. If they try to succeed by that strategy alone, they’ll fail. That prospect keeps Kelley up at night — because it could mean a highly consolidated marketplace overseen by just a handful of players, one in stark contrast to the regional, highly varied food retail landscape America enjoyed throughout the 20th century.
“I’m afraid of what could happen if Walmart and Amazon and Lidl are running our food system, the players trying to get everything down to the lowest price possible,” he tells me. “What gives me hope is the upstarts who will do the opposite. Who aren’t going to sell convenience or efficiency, but fidelity.”
The approach Kelley’s suggesting still means completely overhauling everything, with no guarantee of success. It’s a strategy that’s decidedly low-tech, though it’s no less radical. It’s more about people than new platforms. It means making grocery shopping more like going to the movies.
* * *
Nobody grows up daydreaming about designing grocery stores, including Kelley. As a student at the University of North Carolina at Charlotte, he was just like every other architect-in-training: He wanted to be a figure like Frank Gehry, building celebrated skyscrapers and cultural centers. But he came to feel dissatisfied with the culture of his profession. In his view, architects coldly fixate on the aesthetics of buildings and aren’t concerned enough with the people inside.
“Architecture worships objects, and Capital-A architects are object makers,” Kelley tells me. “They aren’t trying to fix social issues. People and their experience and their perceptions and behaviors don’t matter to them. They don’t even really want people in their photographs — or if they have to, they’ll blur them out.” What interested Kelley most was how people would use his buildings, not how the structures would fit into the skyline. He wanted to shape spaces in ways that could actually affect our emotions and personalities, bringing out the better angels of our nature. To his surprise, no one had really quantified a set of rules for how environment could influence behavior. Wasn’t it strange that advertising agencies spent so much time thinking about the links between storytelling, emotions, and decision-making — while commercial spaces, the places where we actually go to buy, often had no design principle beyond brute utility?
“My ultimate goal was to create a truly multidisciplinary firm that was comprised of designers, social scientists and marketing types,” he says. “It was so unorthodox and so bizarrely new in terms of approach that everyone thought I was crazy.”
In 1992, when he was 28, Kelley cofounded Shook Kelley with the Charlotte, North Carolina–based architect and urban planner Terry Shook. Their idea was to offer a suite of services that bridged social science, branding, and design, a new field they called “perception management.” They were convinced space could be used to manage emotion, just the way cinema leads us through a guided sequence of feelings, and wanted to turn that abstract idea into actionable principles. While Shook focused on bigger, community-oriented spaces like downtown centers and malls, Kelley focused on the smaller, everyday commercial spaces overlooked by fancy architecture firms: dry cleaners, convenience stores, eateries, bars. One avant-garde restaurant Kelley designed in Charlotte, called Props, was an homage to the sitcom craze of the 1990s. It was built to look like a series of living rooms, based on the apartment scenes in shows like Seinfeld and Friends and featured couches and easy chairs instead of dining tables to encourage guests to mingle during dinner.
The shift to grocery stores didn’t happen until a few years later, almost by accident. In the mid-’90s, Americans still spent about 55 percent of their food dollars on meals eaten at home — but that share was declining quickly enough to concern top corporate brass at Harris Teeter, a Charlotte, North Carolina–based grocery chain with stores throughout the Southeastern United States. (Today, Harris Teeter is owned by Kroger, the country’s second-largest seller of groceries behind Walmart.) Harris Teeter execs reached out to Shook Kelley. “We hear you’re good with design, and you’re good with food,” Kelley remembers Harris Teeter reps saying. “Maybe you could help us.”
At first, it was Terry Shook’s account. He rebuilt each section of the store into a distinct “scene” that reinforced the themes and aesthetics of the type of food it sold. The deli counter became a mocked-up urban delicatessen, complete with awning and neon sign. The produce section resembled a roadside farmstand. The dairy cases were corrugated steel silos, emblazoned with the logo of a local milk supplier. And he introduced full-service cafés, a novelty for grocery stores at the time, with chrome siding like a vintage diner. It was pioneering work, winning that year’s Outstanding Achievement Award from the International Interior Design Association — according to Kelley, it was the first time the prestigious award had ever been given to a grocery store.
Shook backed off of grocery stores after launching the new Harris Teeter, but the experience sparked Kelley’s lifelong fascination with grocery stores, which he realized were ideal proving grounds for his ideas about design and behavior. Supermarkets contain thousands of products, and consumers make dozens of decisions inside them — decisions about health, safety, family, and tradition that get to the core of who they are. He largely took over the Harris Teeter account and redesigned nearly 100 of the chain’s stores, work that would go on to influence the way the industry saw itself and ultimately change the way stores are built and navigated.
Since then, Kelley has worked to show grocery stores that they don’t have to worship at the altar of supply-side economics. He urges grocers to appeal instead to our humanity. Kelley asks them to think more imaginatively about their stores, using physical space to evoke nostalgia, delight our senses, and appeal to the parts of us motivated by something bigger and more generous than plain old thrift. Shopping, for him, is all about navigating our personal hopes and fears, and grocery stores will only succeed when they play to those emotions.
When it works, the results are dramatic. Between 2003 and 2007, Whole Foods hired Shook Kelley for brand strategy and store design, working with the firm throughout a crucial period of the chain’s development. The fear was that as Whole Foods grew, its image would become too diffuse, harder to differentiate from other health food stores; at the same time, the company wanted to attract more mainstream shoppers. Kelley’s team was tasked with finding new ways to telegraph the brand’s singular value. Their solution was a hierarchical system of signage that would streamline the store’s crowded field of competing health and wellness claims.
Kelley’s view is that most grocery stores are “addicted” to signage, cramming their spaces with so many pricing details, promotions, navigational signs, ads, and brand assets that it “functionally shuts down [the customer’s] ability to digest the information in front of them.”
Kelley’s team stipulated that Whole Foods could have only seven layers of information, ranging from evocative signage 60 feet away to descriptive displays six feet from customers to promotional info just six inches from their hands. Everything else was “noise,” and jettisoned from the stores entirely. If you’ve ever shopped at Whole Foods, you probably recognize the way the store’s particular brand of feel-good, hippie sanctimony seems to permeate your consciousness at every turn. Kelley helped invent that. The system he created for pilot stores in Princeton, New Jersey, and Louisville, Kentucky, was scaled throughout the chain and is still in use today, he says. (Whole Foods did not respond to requests for comment for this story.)
With a carefully delineated set of core values guiding its purchasing and brand, Whole Foods was ripe for the kind of visual overhaul Kelley specializes in. But most regional grocery chains have a different set of problems: They don’t really have values to telegraph in the first place. Shook Kelley’s approach is about getting buttoned-down grocers to reflect on their beliefs, tapping into deeper, more primal reasons for wanting to sell food.
* * *
Today, Kelley and his team have developed a playbook for clients, a finely tuned process to get shoppers to think in terms that go beyond bargain-hunting. It embraces what he calls “the theater of retail” and draws inspiration from an unlikely place: the emotionally laden visual language of cinema. His goal is to convince grocers to stop thinking like Willy Loman — like depressed, dejected salesmen forever peddling broken-down goods, fixated on the past and losing touch with the present. In order to survive, Kelley says, grocers can’t be satisfied with providing a place to complete a chore. They’ll need to direct an experience.
Today’s successful retail brands establish what Kelley calls a “brand realm,” or what screenwriters would call a story’s “setting.” We don’t usually think consciously about them, but realms subtly shape our attitude toward shopping the same way the foggy, noirishly lit streets in a Batman movie tell us something about Gotham City. Cracker Barrel is set in a nostalgic rural house. Urban Outfitters is set on a graffitied urban street. Tommy Bahama takes place on a resort island. It’s a well-known industry secret that Costco stores are hugely expensive to construct — they’re designed to resemble fantasy versions of real-life warehouses, and the appearance of thrift doesn’t come cheap. Some realms are even more specific and fanciful: Anthropologie is an enchanted attic, complete with enticing cupboards and drawers. Trader Joe’s is a crew of carefree, hippie traders shipping bulk goods across the sea. A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.
Kelley takes this a few steps further. The Shook Kelley team, which includes a cultural anthropologist with a Ph.D., begins by conducting interviews with executives, staff, and locals, looking for the storytelling hooks they call “emotional opportunities.” These can stem from core brand values, but often revolve around the most intense, place-specific feelings locals have about food. Then Kelley finds ways to place emotional opportunities inside a larger realm with an overarching narrative, helping retailers tell those stories — not with shelves of product, but through a series of affecting “scenes.”
In Alberta, Canada, Shook Kelley redesigned a small, regional grocery chain now called Freson Bros. Fresh Market. In interviews, the team discovered that meat-smoking is a beloved pastime there, so Shook Kelley built huge, in-store smokers at each new location — a scene called “Banj’s Smokehouse” — that crank out pound after pound of the province’s signature beef, as well as elk, deer, and other kinds of meat (customers can even BYO meat to be smoked in-house). Kelley also designed stylized root cellars in each produce section, a cooler, darker corner of each store that nods to the technique Albertans use to keep vegetables fresh. These elements aren’t just novel ways to taste, touch, and buy. They reference cultural set points, triggering memories and personal associations. Kelley uses these open, aisle-less spaces, which he calls “perceptual rooms,” to draw customers through an implied sequence of actions, tempting them towards a specific purchase.
Something magical happens when you engage customers this way. Behavior changes in visible, quantifiable ways. People move differently. They browse differently. And they buy differently. Rather than progressing in a linear fashion, the way a harried customer might shoot down an aisle — Kelley hates aisles, which he says encourage rushed, menial shopping — customers zig-zag, meander, revisit. These behaviors are a sign a customer is “experimenting,” engaging with curiosity and pleasure rather than just trying to complete a task. “If I was doing a case study presentation to you, I would show you exact conditions where we don’t change the product, the price, the service. We just change the environment and we’ll change the behavior,” Kelley tells me. “That always shocks retailers. They’re like ‘Holy cow.’ They don’t realize how much environment really affects behavior.”
In the mid-2000s, Nabisco approached Kelley’s firm, complaining that sales were down 16 percent in the cookie-and-cracker aisle. In response, Shook Kelley designed “Mom’s Kitchen,” which was piloted at Buehler’s, a 15-store chain in northern Ohio. Kelley took Nabisco’s products out of the center aisles entirely and installed them in a self-contained zone: a perceptual room built out to look like a nostalgic vision of suburban childhood, all wooden countertops, tile, and hanging copper pans. Shelves of Nabisco products from Ritz Crackers to Oreos lined the walls. Miniature packs of Animal Crackers waited in a large bowl; drawers opened to reveal boxes of Saltines. The finishing touch had nothing to do with Nabisco and everything to do with childhood associations: Kelley had the retailers install fridge cases filled with milk, backlit and glowing. Who wants to eat Oreos without a refreshing glass of milk to wash them down?
The store operators weren’t sold. They found it confusing and inconvenient to stock milk in two places at once. But from a sales perspective, the experiment was a smash. Sales of Nabisco products increased by as much as 32 percent, and the entire cookie-and-cracker segment experienced a halo effect, seeing double-digit jumps. Then, the unthinkable: The stores started selling out of milk. They simply couldn’t keep it on the shelves.
You’d think that the grocery stores would be thrilled, that it would have them scrambling to knock over their aisles of goods, building suites of perceptual rooms. Instead, they retreated. Nabisco’s parent company at the time, Kraft, was excited by the results and kicked the idea over to a higher-up corporate division, where it stalled. And Buehler’s, for its part, never did anything to capitalize on its success. When Nabisco took the “Mom’s Kitchen” displays down, Kelley says, the stores didn’t replace them.
Mom’s Kitchen, fully stocked. (Photo by Tim Buchman)
“We were always asking a different question: What is the problem you’re trying to solve through food?” Kelley says. “It’s not just a refueling exercise — instead, what is the social, emotional issue that food is solving for us? We started trying to work that into grocery. But we probably did it a little too early, because they weren’t afraid enough.”
Since then, Kelley has continued to build his case to unreceptive audiences of male executives, with mixed success. He tells them that when customers experiment — when the process of sampling, engaging, interacting, and evaluating an array of options becomes a source of pleasure — they tend to take more time shopping. And that the more time customers spend in-store, the more they buy. In the industry, this all-important metric is called “dwell time.” Most retail experts agree that increasing dwell without increasing frustration (say, with long checkout times) will be key to the survival of brick-and-mortar retail. Estimates vary on how much dwell time increases sales; according to Davinder Jheeta, creative brand director of the British supermarket Simply Fresh, customers in 2015 spent 1.3 percent more for every 1 percent increase in dwell time.
Another way to increase dwell time? Offer prepared foods. Delis, cafes, and in-store restaurants increase dwell time and facilitate pleasure while operating with much higher profit margins — and they recapture some of the dining-out dollar that grocers are now losing. “I tell my clients, ‘In five years, you’re going to be in the restaurant business,’” Kelley says, “‘or you’re going to be out of business.’”
Kelley’s job, then, is to use design in ways that get customers to linger, touch, taste, scrutinize, explore. The stakes are high, but the ambitions are startlingly low. Kelley often asks clients what he calls a provocative question: Rather than trying to bring in new customers, would it solve their problems if 20 percent of customers increased their basket size by just two dollars? The answer, he says, is typically an enthusiastic yes.
Just two more dollars per trip for every fifth customer — that’s what victory looks like. And failure? That looks like a food marketplace dominated by Walmart and Amazon, a world where the neighborhood supermarket is a thing of the past.
* * *
When Shook Kelley started working on Niemann’s account, things began the way they always did: looking for emotional opportunities. But the team was stumped. Niemann’s stores were clean and expertly run. There was nothing wrong with them. Niemann’s problem was that he had no obvious problem. There was no there there.
Many of the regionals Kelley works with have no obvious emotional hook; all they know is that they’ve sold groceries for a long time and would like to keep on selling them. When he asks clients what they believe in, they show him grainy black-and-white photos of the stores their parents and grandparents ran, but they can articulate little beyond the universal goal of self-perpetuation. So part of Shook Kelley’s specialty is locating the distinguishing spark in brands that do nothing especially well, which isn’t always easy. At Buehler’s Fresh Foods, the chain where “Mom’s Kitchen” was piloted, the store’s Shook Kelley–supplied emotional theme is “Harnessing the Power of Nice.”
Still, Niemann Foods was an especially challenging case. “We were like, ‘Is there any core asset here?’” Kelley told me. “And we were like, ‘No. You really don’t have anything.’”
What Kelley noticed most was how depressed Niemann seemed, how gloomy about the fate of grocery stores in general. Nothing excited him — with one exception. Niemann runs a cattle ranch, a family operation in northeast Missouri. “Whenever he talked about cattle and feed and antibiotics and meat qualities, his physical body would change. We’re like, ‘My god. This guy loves ranching.’ He only had three hundred cattle or something, but he had a thousand pounds of interest in it.”
Niemann’s farm now has about 600 cattle, though it’s still more hobby farm than full-time gig — but it ended up being a revelation. During an early phase of the process, someone brought up “So God Made a Farmer” — a speech radio host Paul Harvey gave at the 1978 Future Farmers of America Convention that had been used in an ad for Ram trucks in the previous year’s Super Bowl. It’s a short poem that imagines the eighth day of the biblical creation, where God looks down from paradise and realizes his new world needs a caretaker. What kind of credentials is God looking for? Someone “willing to get up before dawn, milk cows, work all day in the fields, milk cows again, eat supper and then go to town and stay past midnight at a meeting of the school board.” God needs “somebody willing to sit up all night with a newborn colt. And watch it die. Then dry his eyes and say, ‘Maybe next year.’” God needs “somebody strong enough to clear trees and heave bales, yet gentle enough to yean lambs and wean pigs and tend the pink-combed pullets, who will stop his mower for an hour to splint the broken leg of a meadow lark.” In other words, God needs a farmer.
Part denim psalm, part Whitmanesque catalogue, it’s a quintessential piece of Americana — hokey and humbling like a Norman Rockwell painting, and a bit behind the times (of course, the archetypal farmer is male). And when Kelley’s team played the crackling audio over the speakers in a conference room in Quincy, Illinois, something completely unexpected happened. Something that convinced Kelley that his client’s stores had an emotional core after all, one strong enough to provide the thematic backbone for a new approach to the grocery store.
Rich Niemann, the jaded supermarket elder statesman, broke down and wept.
* * *
I have never been a fan of shopping. Spending money stresses me out. I worry too much to enjoy it. So I wanted to see if a Kelley store could really be what he said it was, a meaningful experience, or if it would just feel fake and hokey. You know, like the movies. When I asked if there was one store I could visit to see his full design principles in action, he told me to go to Harvest, “the most interesting store in America.”
Champaign is two hours south of O’Hare by car. Crossing the state’s vast landscape of unrelenting farmland, you appreciate the sheer scale of Illinois, how far its lower half is from Chicago. Champaign is a college town, which comes with the usual trappings — progressive politics, cafes and bars, young people lugging backpacks with their earbuds in — but you forget that fast outside the city limits. In 2016, some townships in Champaign County voted for Donald Trump over Hillary Clinton by 50 points.
I was greeted in the parking lot by Gerry Kettler, Niemann Foods’ director of consumer affairs. Vintage John Deere tractors formed a caravan outside the store. The shopping cart vestibules were adorned with images of huge combines roving across fields of commodity crops. Outside the wide-mouthed entryway, local produce waited in picket-fence crates — in-season tomatoes from Johnstonville, sweet onions from Warrensburg.
And then we stepped inside.
Everywhere, sunlight poured in through the tall glass facade, illuminating a sequence of discrete, airy, and largely aisle-less zones. Kettler bounded around the store, pointing out displays with surprised joy on his face, as if he couldn’t believe his luck. The flowers by the door came from local growers like Delight Flower Farm and Illinois Willows. “Can’t keep this shit in stock,” he said. He made me hold an enormous jackfruit to admire its heft. The produce was beautiful, he was right, with more local options than I’d ever seen in a grocery store. The Warrensburg sweet corn was eye-poppingly cheap: two bucks a dozen. There were purple broccolini and clamshells filled with squash blossoms, a delicacy so temperamental it’s rarely sold outside of farmers’ markets. Early on, staff had to explain to some teenage cashiers what the blossoms were — they’d never seen squash blossoms before.
I started to sense the “realm” Harvest inhabits: a distinctly red-state brand of America, local food for fans of faith and the free market. It’s hunting gear. It’s Chevys. It’s people for whom commercial-scale pig barns bring back memories of home. Everywhere, Shook Kelley signage — a hierarchy of cues like what Kelley dreamed up for Whole Foods — drives the message home. A large, evocative sign on the far wall reads Pure Farm Flavor, buttressed by the silhouettes of livestock, so large it almost feels subliminal. Folksy slogans hang on the walls, sayings like FULL OF THE MILK OF HUMAN KINDNESS and THE CREAM ALWAYS RISES TO THE TOP.
Then there are the informational placards that point out suppliers and methods.
There are at least a half dozen varieties of small-batch honey; you can find pastured eggs for $3.69. The liquor section includes local selections, like whiskey distilled in DeKalb and a display with cutting boards made from local wood by Niemann Foods’ HR Manager. “Turns out we had some talent in our backyard,” Kettler said. Niemann’s willingness to look right under his nose, sidestepping middlemen distributors to offer reasonably priced, local goods, is a hallmark of Harvest Market.
Champaign, IL’s Harvest Market is styled like Whole Foods for the Heartland—complete with a John Deere tractor stationed outside. (Photo courtesy of the author.)
Unlike most large-format grocery stores, Harvest Market buys some produce directly from farmers (like these sweet candy onions from Warrensburg, IL, about 50 miles away). (Photo courtesy of the author.)
Interior of Harvest Market from the upper mezzanine, where shoppers gather for lunch and board games during the day and glasses of wine at night. (Photo courtesy of the author.)
By the cheese section, a glassed-in contraption works night and day: a butter churner, which transforms local sweet cream into yellow, briskly selling bricks of fat. (Photo courtesy of the author.)
Harvest Market executives Gerry Kettler, left, and Rich Niemann chat with a salsa vendor visiting to do demos. (Photo courtesy of the author.)
That shortened chain of custody is only possible because of Niemann and the lifetime of supply-side know-how he brings to the table. But finding ways to offer better, more affordable food has been a long-term goal of Kelley’s — one that strained his relationship with Whole Foods CEO John Mackey. As obsessed as Kelley is with appearances, he insists to me that his work must be grounded in something “real”: that grocery stores only succeed when they really try to make the world a better place through food. In his view, Whole Foods wasn’t doing enough to address its notoriously high prices — opening itself up to be undercut by cheaper competition, and missing a kind of ethical opportunity to make better food available to more people.
“When,” Kelley remembers asking, “did you start to mistake opulence for success?”
In Kelley’s telling, demand slackened so much during the Great Recession that it nearly led to Whole Foods’ downfall — a financial setback the company never fully recovered from and, one could argue, one that ultimately led to its acquisition. Harvest Market, for its part, has none of Whole Foods’ clean-label sanctimony. It takes an “all-of-the-above” approach: There’s local produce, but there are also Oreos and Doritos and Coca-Cola; at Thanksgiving, you can buy a pastured turkey from Triple S Farms or a 20-pound Butterball. But its strong emphasis on making local food more accessible and affordable makes it an interesting counterpart to Kelley’s former client.
The most Willy Wonka–esque touch is the hulking piece of dairy processing equipment in a glass room by the cheese case. It’s a commercial-scale butter churner — the first one ever, Kettler told me, to grace the inside of a grocery store.
“So this was a Shook Kelley idea,” he said. “We said yes, without knowing how much it would cost. And the costs just kept accelerating. But we’re thrilled. People love it.”

Harvest Market isn’t just a grocery store — it’s also a federally inspected dairy plant. The store buys sweet cream from a local dairy and churns it into house-made butter, available for purchase by the brick and used throughout Harvest’s bakery and restaurant. The butter sells out as fast as they can make it. Unlike the grocers who objected to “Mom’s Kitchen,” the staff don’t seem to mind.
As I walked through the store, I couldn’t help wondering how impressed I really was. I found Harvest to be a beautiful example of a grocery store, no doubt, and a very unusual one. What was it that made me want to encounter something more outrageous, more radical, more theatrical and bizarre? I wanted animatronic puppets. I wanted fog machines.
I should have known better — Kelley had warned me that you can’t take the theater of retail too far without breaking the dream. He’d told me that he admires stores where “you’re just not even aware of the wonder of the scene, you’re just totally engrossed in it” — stores a universe away from the overwrought, hokey feel of Disneyland. But I had Amazon’s new stores in the back of my mind as a counterpoint, with all their cashierless bells and whistles, their ability to click and collect, their ability to test-drive Alexa and play a song or switch on a fan. I guess, deep down, I was wondering if something this subtle really could work.
“Here, this is Rich Niemann,” Kettler said, and I found myself face-to-face with Niemann himself. We shook hands and he asked if I’d ever been to Illinois before. Many times, I told him. My wife is from Chicago, so we’ve visited the city often.
He grinned at me.
“That’s not Illinois,” he said.
We walked to Harvest’s restaurant, a 40-person seating area with an adjacent bar and a row of stools, which offers standards like burgers, salads, and flatbreads. There’s an additional 80-person seating area on the second-floor mezzanine, a simulated living room complete with couches and board games. Beyond that, they pointed out the brand-new wine bar — open, like the rest of the space, until midnight. There’s a cooking classroom by the corporate offices; through the window, I saw a classroom full of children doing something to vegetables. Adult cooking classes run two or three nights every week, plus special events for schools and other groups.
For a summer weekday at noon in a grocery store, I was amazed how many people were eating and working on laptops. One guy had his machine hooked up to a full-sized monitor he’d lugged up the stairs; he’d even made a customized wooden piece that hooks into Harvest’s wrought-iron support beams to create a platform for his plus-size screen. He comes every day, like it’s his office. He’s a dwell-time dream.

We sat down, and Kettler insisted I eat the corn first, slathering it with the house-made butter while it was still hot. He reminded me that it was grown by the Maddoxes, a family in Warrensburg, about 50 miles west of Champaign.

The corn was good, but I wanted to ask Niemann whether the grocery industry was really in such dire shape. It is, he told me. I assumed he’d want to talk about Amazon, its acquisition of Whole Foods, and the way e-commerce has changed the game. He acknowledged all that, but to my surprise, he said the biggest factor was something else entirely — a massive shift happening in the world of consumer packaged goods, or CPGs.
For years, grocery stores never had to advertise, because the largest companies in the world — Procter & Gamble, Coca-Cola, Nestlé — did their advertising for them, just the way Nabisco helped finance “Mom’s Kitchen” to benefit the stores. People came to supermarkets to buy the foods they saw on TV. But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.
When their sales flag, grocery sales do too — and the once-bulletproof alliance between food brands and supermarkets is splitting. Over the past two years, the Grocery Manufacturers Association, an influential trade group representing the biggest food companies in the world, has been losing members. It began with Campbell Soup. Dean Foods, Mars, Tyson Foods, Unilever, the Hershey Company, the Kraft Heinz Company, and others followed. That profound betrayal was a rude awakening: CPG companies don’t need grocery stores. They have Amazon. They can sell directly through their own websites. They can launch their own pop-ups.
Only then did I realize how dire the predicament of grocery stores really is, and why Niemann was so frustrated when he first called Kevin Kelley. It’s one thing when you can’t sell as cheaply and conveniently as your competitors. It’s another when no one wants what you’re selling.
Harvest doesn’t feel obviously futuristic in the way an Amazon store might. If I went there as a regular shopper and not as a journalist sniffing around for a story, I’m sure I’d find it to be a lovely and transporting way to buy food. But what’s going on behind the scenes is, frankly, unheard of.
Grocery stores have two ironclad rules. First, grocers set the prices, and farmers do what they can within those mandates. Second, everyone works with distributors, who oversee the aggregation and transport of all goods. Harvest has traditional relationships with companies like Coca-Cola, but it breaks both rules with local farmers and foodmakers. Suppliers — from the locally milled wheat to the local produce to the Kilgus Farms sweet cream that goes into the churner — truck their products right to the back. By avoiding middlemen and their surcharges, Harvest keeps costs low: you can still find $4.29 pints of Halo Top ice cream in the freezer, but the produce section features stunning bargains. When the Maddox family pulls up with its latest shipment of corn, people sometimes start buying it off the back of the truck in the parking lot. At the same time, suppliers get to set their own prices — they tell Niemann what they need to charge; Niemann adds a standard margin and lets customers decide if they’re willing to pay. That’s a massive change, virtually unheard of in supermarkets.
If there’s a reason Harvest matters, it’s only partly because of the aesthetics. It’s mainly because the model of what a grocery store is has been tossed out and rebuilt. And why not? The world as Rich Niemann knows it is ending.
* * *
In 2017, just months after Harvest Market’s opening, Niemann won the Thomas K. Zaucha Entrepreneurial Excellence Award — the National Grocers Association’s top honor, given for “persistence, vision, and creative entrepreneurship.” That spring, Harvest was spotlighted in a “Store of the Month” cover feature in the influential trade magazine Progressive Grocer. Characteristically, the contributions of Kelley and his firm were not mentioned in the piece.
Niemann tells me his company is planning to open a second Harvest Market in Springfield, Illinois, about 90 minutes west of Champaign, in 2020. Without sharing specifics about profitability or sales numbers, he says the store has been everything he hoped it would be by the metrics that matter most — year-over-year sales growth and customer engagement. His only complaint about the store has to do with parking. For years, Niemann has relied on the same golden ratio to determine the size of parking lot his stores need — a certain number of spots for every thousand dollars of expected sales. Harvest’s lot uses the same logic, and it’s nowhere near enough space.
“In any grocery store, the customer’s first objective is pantry fill — to take care of my needs as best I can on my budget,” Niemann says. “But we created a different atmosphere. These customers want to talk. They want to know. They want to experience. They want to taste. They’re there because it’s an adventure.”
They stay so much longer than expected that the parking lot sometimes struggles to fit all their cars at once. Unlike the Amazon stores that may soon be cropping up in a neighborhood near you — reportedly, the company is considering plans to open 3,000 of them by 2021 — Harvest isn’t about getting in and out quickly without interacting with another human being. At Harvest, you stay awhile. And that’s the point.
So far, Harvest’s success hasn’t made it any easier for Kelley, who still struggles to persuade clients to make fundamental changes. They’re still as scared as they’ve always been, clinging to the same old ideas. He tells them that, above all else, they need to develop a food philosophy — a reason why they do this in the first place, something that goes beyond mere nostalgia or the need to make money. They need to build something that means something, a store people return to not just to complete a task but because it somehow sustains them. For some, that’s too tall an order. “They go, ‘I’m not going to do that.’ I’m like, ‘Then what are you going to do?’ And they literally tell me: ‘I’m going to retire.’” It’s easier to cash out. Pass the buck, and consign the fate of the world to younger people with bolder dreams.
Does it even matter? The world existed before supermarkets, and it won’t end if they vanish. And in the ongoing story of American food, the 20th-century grocery store is no great hero. A&P — the once-titanic chain, now itself defunct — was a great mechanizer, undercutting the countless smaller, local businesses that used to populate the landscape. More broadly, the supermarket made it easier for us to distance ourselves from what we eat, shrouding food production behind a veil and letting us convince ourselves that price and convenience matter above all else. We let ourselves be satisfied with the appearance of abundance — even if great stacks of unblemished fruit contribute to waste and spoilage, even if the array of brightly colored packages is owned by the same handful of multinational corporations.
But whatever springs up to replace grocery stores will have consequences, too, and the truth is that brick-and-mortar is not going away any time soon — far from it. Instead, the most powerful retailers in the world have realized that physical spaces have advantages they want to capitalize on. It’s not just that stores in residential neighborhoods work well as distribution depots, ones that help facilitate the home delivery of packages. And it’s not just that we can’t always be home to pick up the shipments we ordered when they arrive, so stores remain useful. The world’s biggest brands are now beginning to realize what Kelley has long argued: Physical stores are a way to capture attention, to subject customers to an experience, to influence the way they feel and think. What could be more useful? And what are Amazon’s proposed cashierless stores, but an illustration of Kelley’s argument? They take a brand thesis, a set of core values — that shopping should be quick and easy and highly mechanized — and seduce us with it, letting us feel the sweep and power of that vision as we pass with our goods through the doors without paying, flushed with the thrill a thief feels.
This is where new troubles start. Only a few companies in the world will be able to compete at Amazon’s scale — the scale where building 3,000 futuristic convenience stores in three years may be a realistic proposition. Unlike in the golden age of grocery, when different family-owned chains catered to different demographics, we’ll have only a handful of players — companies that own the whole value chain, low to high. Amazon owns the e-commerce site where you can find almost anything in the world at the cheapest price. And for when you want to feel the heft of an heirloom tomato in your hand or sample some manchego before buying, there is Whole Foods. Online retail for thrift, in-person shopping for pleasure. Except one massive company now owns them both.
If this new landscape comes to dominate, we may find there are things we miss about the past. For all its problems, the grocery industry is at least decentralized, dominated by no single company and carved up among more players than you could ever count. It’s run by people who often live alongside the communities they serve and share their concerns. We might miss that competition, that community. These stores are small. They are nimble. They are independently, sometimes even cooperatively, owned. They employ people. And if they are scrappy, and ingenious, and willing to change, there’s no telling what they might do. It is not impossible that they could use their assets — financial resources, industry connections, prime real estate — to find new ways to supply what we all want most: to be happier, to be healthier, to feel more connected. To be better people. To do the right thing.
I want to believe that, anyway. That stores — at least in theory — could be about something bigger, and better than mere commerce. The way Harvest seems to want to be, with some success. But I wonder if that’s just a fantasy, too: the dream that we can buy and sell our way to a better world, that it will take no more than that.
Which one is right?
I guess it depends on how you feel about the movies.
Maybe a film is just a diversion, a way to feel briefly better about our lives, the limitations and disappointments that define us, the things we cannot change. Most of us leave the theater, after all, and just go on being ourselves.
Still, maybe something else is possible. Maybe it’s in the moment when the music swells, and our hearts beat faster, and we feel overcome by the beauty of an image — in the instant that we feel newly brave and noble, ready to be different, braver versions of ourselves — that we are who we really are.
* * *
Joe Fassler, The Counter’s deputy editor, has covered the intersection of food, policy, technology, and culture for the magazine since 2015. His food reporting has twice been a finalist for the James Beard Foundation Award in Journalism. He’s also editor of Light the Dark: Writers on Creativity, Inspiration, and the Creative Process (Penguin, 2017), a book based on “By Heart,” his ongoing series of literary conversations for The Atlantic.
Editor: Michelle Weber
Fact checker: Matt Giles
Copy editor: Jacob Z. Gross
Soraya Roberts | Longreads | April 2019 | 10 minutes (2,422 words)
INT. COFFEE SHOP – DAY
SORAYA sits down at her laptop with a cookie or some cake or that weirdly oversize banana bread. As she starts working on a column like this one, the woman next to her, working on a spreadsheet, glances at Soraya’s desktop and turns to her.
WOMAN: What do you do?
SORAYA: I’m a columnist.
WOMAN: Holy shit, that’s cool.
I starred in this scene two weeks ago, and again just this past week at a party. The women don’t have to tell me why they think it’s cool, I know why: Carrie Bradshaw. An apartment in New York, a photo on the side of a bus, Louboutins, tutus, and a column at the top of each week. Which is why I qualify it every time: “I don’t make as much as Carrie Bradshaw.” Yes, the job is cool, and it is holy-shit-worthy because so few journalists are able to actually work as journalists. But I’m freelance: I can cover my rent but can’t buy a house, I don’t get benefits, and I might be out of a job next week. Not to mention that I might not be so lucky next time. The women usually turn back to their admin after that — admin looks a lot cooler than journalism these days. But only if you’re not going by Sex and the City or basically every other journalism movie or series that has come after, all of which romanticize an industry which has a knack for playing into that.
“This is the end of an era, everything’s changing,” Gina Rodriguez tells her friends in the trailer for Someone Great, a new Netflix rom-com in which she, a music journalist, gets a job. At a magazine. In San Francisco. This is not a sci-fi movie in which the character has time traveled back to, I don’t know, 1975. It is only one recent example of the obfuscation of what journalism actually means now. There’s also the Hulu series Shrill, which presents itself as if it were current-day but is based on the life of Lindy West, who had a staff job at the Seattle alt-weekly The Stranger when you could still have a staff job and make a name for yourself with first-person essays, i.e., 2009. Special (another Netflix show) also harkens back to that time, and though it’s more overt about how exploitative online media can be — the hero is an intern with cerebral palsy who writes about his disability (which he claims is from a car accident) for clicks — the star is still hired straight out of an internship. (What’s an internship?)
Hollywood romanticizes everything, you say? Perhaps, but this is a case where the media itself seems to be actively engaging in a certain kind of deception about how bad its own situation actually is. In February, The Washington Post — no doubt still benefiting from the press generated by the still-gold-standard journalism movie, 1976’s All the President’s Men — ran a Super Bowl ad narrated by Tom Hanks, applauding the late journalists Marie Colvin and Jamal Khashoggi, who, in the ad’s words, brought the story “no matter the cost.” The spot highlighted what we already know: that we need journalism for a functioning democracy, and that many journalists risk their lives to guarantee it. What it kept in darkness (ha), however, was that to do their jobs properly, those journalists need protection and resources — provided by their editors and their publishers. Hanks, of course, starred in The Post, Steven Spielberg’s 2017 film based on the journalists who reported on the Pentagon Papers in 1971. The ad was using the past to promote the future, rather than dealing with a present in which more than 2,400 people lost media jobs in the first three months of the year and journalists are trying to unionize en masse. But that’s not particularly telegenic, is it?
* * *
The romanticized idea of the journalist — dogged, trenchcoated — really took off at the movies. In 1928, ex-reporters Ben Hecht and Charles MacArthur wrote The Front Page, a play adapted into a 1931 screwball film that became the journalism movie prototype, with fast dialogue and faster morals. My favorite part is that not only is the star reporter trying to quit the paper (in this economy?), but his editor will do anything — including harboring an accused murderer — to keep him on staff. Matt Ehrlich, coauthor of Heroes and Scoundrels: The Image of the Journalist in Popular Culture, once told me for Maclean’s that The Front Page came out of the “love-hate relationship” the writers had with the industry even back then. “The reporters are absolute sleazebags, they do horrible things,” he said. “At the same time The Front Page makes journalism seem very exciting, and they do get the big scoop.” Ehrlich also told me that some initially thought All the President’s Men — which eventually replaced The Front Page as the prototype of the journalism movie — was reminiscent of that earlier era of the genre. In case you are not a journalist and so haven’t seen it: Robert Redford and Dustin Hoffman starred as Bob Woodward and Carl Bernstein, The Washington Post reporters whose stories on the Watergate burglary and subsequent cover-up helped lead to President Nixon’s resignation. While the film also played fast and loose with the truth, it had a veneer of rumpled, repetitious reality — not to mention a strong moral core that made taking down the president with a typewriter seem, if implausible, at least not impossible.
In February, Education Week reported on a survey of 500 high school journalism teachers across 45 states: in the past two years, 44 percent of them had seen a rise in journalism enrollment, along with a 30 percent increase in interest in studying journalism in college. “This is this generation’s Watergate,” the executive director of the National Scholastic Press Association said. “With President Trump, everyone is really in tune to the importance of a free press.” Sure. But this isn’t 1976. No doubt there are scores of WoodSteins out there, but many journalists no longer have the resources or the time to follow stories of any kind, rarely have the salaried staff positions to finance them, and lack the editors and publishers to support them in doing the job they were hired to do. In All the President’s Men, executive editor Ben Bradlee asks WoodStein if they trust their source before muttering, “I can’t do the reporting for my reporters, which means I have to trust them. And I hate trusting anybody.” Then he tells them to “run that baby.” These days there is little trust in anything beyond the bottom line.
The myth is that All the President’s Men led to a surge of interest in journalism as a career. But in reality it was women, increasingly educated post-liberation, whose interest explained the surge. (My editor is asking: “Is it an accident that shitting on journalism as a worthy profession coincided with women moving into journalism?” My reply is: “I think not.”) Still, women remain underrepresented in the field to this day, a fact reflected by the paucity of movies about the work of female journalists. While there were scores of ’70s and ’80s thrillers built around male reporters with too much hair taking down the man, for the women … there was The China Syndrome, with Jane Fonda as a television reporter named Kimberly covering a nuclear power plant conspiracy. And, um, Absence of Malice? Sally Field is a newspaper reporter who sleeps with her subject (I mean, it is Paul Newman). I guess I could include Broadcast News, which stars Holly Hunter as a neurotic-but-formidable producer and personified the pull between delivering the news and delivering ratings (the analog version of clicks). But Network did that first and more memorably, with its suicidal anchorman lamenting the demise of media that matters. “I’m a human being, GODDAMN IT!!!” he shouts into the void. “My life has value!!!” You don’t hear female journalists saying that on-screen, though you do hear them saying “I do” a whole lot.
The quintessential journalism film and the quintessential rom-com are in fact connected. Nora Ephron, who was briefly married to Carl Bernstein, actually cowrote an early script for All the President’s Men. While it was chucked in favor of William Goldman’s, she went on to write When Harry Met Sally, and I’ll forgive you for not remembering that Sally was a journalist. She probably only mentions it twice because this was 1989, an era in which you decided to be a journalist and then you became one — the end. The movie treats reporting like it’s so stable it’s not even worth mentioning, like being a bureaucrat. Sally could afford a nice apartment, she had plenty of time to hang out with Harry, so what was there to gripe about (Good Girls Revolt would suggest Ephron’s trajectory was less smooth, but that’s another story)? Four years later, in Sleepless in Seattle, Meg Ryan is another journalist in another Ephron movie, equally comfortable, so comfortable in fact that her editor pays her to fly across the country to stalk Tom Hanks. This newspaper editor literally assigns a reporter to take a plane to Seattle from Chicago to “look into” a possible lifestyle story about a single white guy. (Am I doing something wrong?!?!)
Journalism and rom-coms were fused almost from the start, in the ’30s and ’40s. The Front Page went from being a journalism movie to being a rom-com when it turned its hero into a heroine for His Girl Friday. The reporter repartee and the secretive nature of the job appeared to lend themselves well to Hays-era screwballs, though they also indelibly imprinted a lack of seriousness onto their on-screen female journalists. After a brief moment in the 1970s when The Mary Tyler Moore Show embodied the viability of a woman journalist who puts work first, the post-Ephron rom-coms of the 2000s were basically glossy romances in “offices” that were really showrooms for a pink-frosted fantasy girl-reporter gig no doubt thought up by male executives who almost certainly saw All the President’s Men and almost certainly decided a woman couldn’t do that and who cares anyway because the real story is how you’re going to get Matthew McConaughey to pop the question. I can’t with the number of women who recently announced that 13 Going on 30 — the movie in which Jennifer Garner plays a literal child successfully running a fashion magazine — made them want to be journalists. But the real death knell of the aughts journo-rom-com, according to rom-com columnist Caroline Siede, came in 2003, with How to Lose a Guy in 10 Days. In that caper, Kate Hudson has a job as a columnist despite thinking it is completely rational to write a piece called “How to Bring Peace to Tajikistan” for her Cosmo-type fashion magazine.
* * *
In 2016, the Oscar for Best Picture went to Spotlight, which follows The Boston Globe’s titular investigative team — three men, one woman — as it uncovers the Catholic Church abuse scandal. The film earned comparisons to All the President’s Men for its focus on journalistic drudgery, but it also illustrated the growing precariousness of the newsroom with the arrival of the web. In one scene, executive editor Marty Baron expresses shock when he is told it takes a couple of months for the team to settle on a story and then a year or more to investigate it. At the same time, Baron and two other editors are heavily involved with and supportive of the three reporters, who went on to win the Pulitzer in 2003 and remained on the team for years after. Released only 12 years after the fact, the film suggested that journalists who win Pulitzers have some kind of security, which, you know, makes sense, and is maybe true at The Boston Globe. But two years after Spotlight came out, David Wood, who had won HuffPost its only Pulitzer, was laid off. As one of BuzzFeed’s reporters told The Columbia Journalism Review after BuzzFeed shed 15 percent of its staff, “It’s this sense that your job security isn’t tied to the quality of your work.”
“We have so much to learn from these early media companies and in many ways it feels like we’re at the start of another formative era of media history where iconic companies will emerge and thrive for many decades,” BuzzFeed founder and CEO Jonah Peretti blew hard in a memo in 2014, referring to traditional outfits like Time and The New York Times. But both those publications have unions, which Peretti has been clear he doesn’t think “is right” for his company. “A lot of the best new-economy companies are environments where there’s an alliance between managers and employees,” he said in 2015. “People have shared goals.” In this case the shared goals seem to be that Peretti profits (his company was valued at more than $1 billion in 2016) while his staff is disposable.
Which brings us back to the Globe in 2019. That is to say the real one, not the romanticized one. This version of the Globe hires a Gonzo-esque leftist political writer named Luke O’Neil as a freelancer and publishes his “controversial” op-ed about the Secretary of Homeland Security’s resignation titled “Keep Kirstjen Nielsen unemployed and eating Grubhub over her kitchen sink.” “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon,” it opened, and it concluded with, “As for the waiters out there, I’m not saying you should tamper with anyone’s food, as that could get you into trouble. You might lose your serving job. But you’d be serving America. And you won’t have any regrets years later.” The article was gone by Friday, pulled upon the request of the paper’s owners (O’Neil sent me the original). According to WGBH, a now-deleted note on the opinion page stated that the article “did not receive sufficient editorial oversight and did not meet Globe standards. The Globe regrets its lack of vigilance on the matter. O’Neil is not on staff.” And, oh, man, that last line. It says everything there is to say about modern journalism that is unspoken not only on-screen but by the culture at large and the media in it. It says you serve us but we provide no security, no benefits, no loyalty. It says, unlike Spotlight or All the President’s Men or even The Front Page, we do not have your back. Because if they did, you better believe it would have a good chance of ending up on-screen.
Nina Li Coomes | Longreads | April 2019 | 14 minutes (3,609 words)
A month after Donald Trump is inaugurated president, my mother visits me in Boston. I have lived in the city for only a month, and my apartment is furnished, but barely. During the day, while I sit in a windowless office, my mother drags a suitcase down snowy Commonwealth Avenue to TJ Maxx, where she fills the rolling bag with comforting objects: a teal ceramic pitcher; a wire kitchen cart; a swirling, blue-and-white rug. She makes at least three trips down the hill to the store and back again.
When she is not buying knickknacks, she scrubs my buckling apartment floors. She wrings a rag in warm water, palms it over the wood, her posture and form impeccable as usual. Though I beg her not to do this, her actions make sense. For the 20 years we have lived in the United States, my mother has made a ritual of scrubbing the floors of all of our homes. In our first American house, in the unwelcoming cornfields of Illinois, I would know that all was well if I came through the front door to see the warm gleam of freshly scrubbed wood. In my parents’ house in Chicago, if I ever walked across the kitchen in my shoes by accident or, more likely, in a careless hurry, guilt would course down my back, the memory of her hunched by the radiator busily scrubbing flooding my mind. After college, when I lived in New York, she visited me there and insisted on getting down on her hands and knees again, though my roommate had a dog who shed constant, ungrateful clouds of black fur, making a clean floor impossible. In each place we have lived, no matter where we are, my mother has labored over the floor to make it home.
* * *
I was born in Japan to a Japanese mother and a white American father. After my birth, my parents sent an application to the U.S. consulate for my American citizenship. The application included my Japanese birth certificate and an accompanying English translation, proof of their marriage in both languages, as well as proof of my father’s U.S. citizenship. My mother’s status as an ethnically Japanese national qualified me for Japanese citizenship upon birth. I have always been a dual citizen of the United States and Japan.
As a child, I bragged about this status to my peers. I had two countries I could claim as my own, I would crow, two places to call home. My parents often chided me for this bragging, but my willful girl-self ignored them. Though my status as mixed race was most often confusing and other times painful, this was one place I found pride, a jolt of pleasure pulsing through my hands as I touched the spines of one blue and one red passport, both with my name emblazoned on the inside. At the customs kiosk in airports, I liked the momentary juggle my parents did, swapping out our U.S. passports for Japanese ones in Tokyo, and back again in Chicago. All of the coming and going resulted in my American passport looking like an absurdist travel log, appearing as if I left the country and came back a month later without ever entering another country. Though I was only ever just shuttling between the same two nations to visit one set of grandparents or another, childishly I imagined my dual citizenship as a secret mission, a doorway into which I could walk and disappear, existing in secret for a short while. Other times, my passports felt like a double-headed key, easing the pain of leaving one home with the improbable solution of arriving at a different one. My passports — their primary-colored bindings, their grainy texture and heavy pages, these were magical tokens of my childish belief in my double-belonging.
Dual citizenship is technically only legal in Japan until the age of 22, at which point an individual is required to make a “declaration of citizenship,” effectively asking dual citizens to give up their claim on at least one of their countries of origin. There are, of course, ways around this. There are an estimated 700,000 dual citizens past the age of 22 living in Japan, though this number is probably an undercount, given how few illegal dual citizens are willing to come forward regarding their legal status. Some dual citizens choose never to declare, trusting in the inefficiencies of a labyrinthine bureaucracy to forget about legal technicalities. Others make their declaration in remote locations far from metropolises like Tokyo or Osaka with the hopes that less-urban officials will not take the time to ask for a renunciation of non-Japanese passports. Some, like me, renewed their passport on the eve of their 22nd birthday, effectively buying another four years to weigh the choice, hoping that laws might shift to allow for legally sustained dual citizenship.
* * *
In Japan, a person obtains citizenship not by birthplace but by blood: This is called jus sanguinis citizenship, or citizenship as defined by the “right of blood.” It does not matter if you are born in the country or out of it. You are only a citizen if you have at least one parent whose blood can be classified as Japanese. (There are some exceptions based on naturalization and statelessness.) Requiring Japanese blood as a tenet of citizenship implies that there is such a thing; that Japaneseness can be traced back to one biologically determined race. In 2008, conservative lawmakers proposed that DNA testing become part of the process necessary to determine Japanese citizenship, suggesting that biological markers could identify Japanese blood over foreign blood. Though the proposal was ultimately thrown out on grounds of logistical and financial impossibility, it lays bare the use of Japanese citizenship to promote a Japanese ethnostate. Simply put, to Japan, an ideal citizen is someone who is 100 percent racially Japanese.
In the United States, people become citizens through a combination of jus sanguinis, “right of blood,” and jus soli, “right of soil.” If you are born within the boundaries of the United States of America, or born to a parent who is a U.S. citizen, you are granted U.S. citizenship. This idea is introduced in the 14th Amendment of the Constitution: “All persons born or naturalized in the United States, and subject to the jurisdiction thereof, are citizens of the United States and of the State wherein they reside.” It is tempting to say that the U.S. is egalitarian, that it is not founded on ethnocentrism, but the citizenship clause of the 14th Amendment was written only as a result of the Civil War. It granted citizenship to Black Americans nearly a century after the nation’s founding and in many ways did so in name only.
Though Asian Americans were granted citizenship in 1898, the Chinese Exclusion Act of 1882 ensured that immigrant laborers were not given easily accessible avenues to permanent citizenship. By the same token, Supreme Court cases in the 1920s (Ozawa v. United States and United States v. Bhagat Singh Thind) established a further precedent barring Asians from naturalizing as citizens on account of their not being “free white persons.” The “free white persons” clause of naturalization in U.S. law was dissolved in 1952, but strict immigration quotas continued to be official policy until 1965. Before 1924, Native Americans were only considered citizens if they could be taxed, if they served in a war, married a white person, or disavowed their tribal allegiance. By the time the Indian Citizenship Act of 1924 passed, most had already followed these alternate paths to citizenship, and even then, states with large Native American populations refused to grant citizenship to their population for fear of the Native American vote. It took almost 25 years for the Indian Citizenship Act to be adopted by every state.
No matter the intention of our Founding Fathers or the text of the 14th Amendment, citizenship in the United States is complicated, fraught; at once given and taken away, fickle and traitorous, seemingly color-blind and yet in service to a majority of “free white persons.”
This duplicity isn’t unique to the United States or Japan. It is the nature of citizenship to uphold humanity while simultaneously denying it. For the Roman philosopher Cicero, one of the first to consider the idea of the citizen, this duality was best explained as a trade-off between citizen and state. In return for completing certain civic responsibilities (say, paying your taxes and following road signs), citizens are offered rights: protection from the state, the ability to claim nationality, and the like. Nearly two thousand years later, German-born American philosopher and writer Hannah Arendt echoed this same sentiment by famously calling citizenship “the right to have rights.” In her view, citizenship was a necessary vehicle to deliver human rights. Simply being human didn’t give you access to things like life and liberty. One needs a state to fulfill them. Taken in reverse, this implies that without a government’s acknowledgement of citizenship, a person can be stripped of the rights inherent to their existence. In other words, if you’re not a citizen, you’re not fully a person.
* * *
At the end of my mother’s Boston visit, her busy homemaking and floor-scrubbing now at an end, I take her to a donut shop for breakfast. Inside, a Cambodian family slips rings of hot fried dough glazed in honey into paper envelopes, handing them to construction workers, police officers, and university students. Behind the counter, on the other side of the kitchen door, no English exists. Instead, Cambodian wafts, punctured by laughter and sighs, tossed by the woman pouring coffee with her hand balled at her hip, the smiling man behind the counter, the surly teenager bussing half-finished plates of buttery scrambled eggs. Above the cash register proud signs hang declaring the store a “Boston Favorite,” a “Chosen Community Partner,” and the recipient of numerous other local awards.
At our sticky table, I find myself unexpectedly moved. Passing by the donut shop on my daily commute, I assumed that the curly pink neon signage, a relic from the ’50s preserved on a triangular storefront, was surely the property of a white family. Instead what I found was a family of Southeast Asian immigrants, making a classic American food and serving it in their own fashion with aplomb. The donut shop seemed unconcerned with assimilation. Months later, I’d take my sister to the same donut shop and she’d say that she was confused. The decor inside made her feel like she should be eating some sort of noodles but instead she was eating a chocolate glazed cake donut.
As a rule, I am skeptical of the American Dream. I’m suspicious of what it sells and at what cost. What does it mean to believe in “life, liberty, and the pursuit of happiness” when the state reserves the right to take it away at a moment’s notice, to inter you and your family for looking like the enemy? What is freedom if it is a specific, circumscribed kind of freedom? A labored freedom? An unfair freedom? A tilted, volatile, violent freedom?
But at the donut shop, picking apart a vanilla-and-chocolate twist, I see a glimpse of what this country might offer: a promise of evolution, integrity, and acceptance. Perhaps this is what belonging in this country might mean, at its best: that something as classically American as a 1950s corner donut store could be taken over by a family of refugees from Southeast Asia without pomp or angst. That the store and the family that run it can exist without concerning themselves with assimilating to a white American standard, but instead remain rooted in their own traditions and languages. Sitting in the corner table with my mother, I feel as if happiness, freedom, equality, these are hard to come by and elusive. But change, the potential for newness and its embrace, these might yet flourish. These prospects feel solid, somehow, steady and unconditional, vivacious in comparison to the pale two-faced promise of a passport. A hint that perhaps making a home for oneself actually has nothing to do with the cold detachment of a customs official, and more to do with the warmth of feeding your kin on a cold morning.
* * *
Here is how I once passed through customs in Tokyo:
After 14 hours of sitting in an economy class seat, the overhead bin bumping precariously along to turbulence, sleep evasive and slippery, I am greasy and dry-eyed. Everything feels dreamlike. Time moves in stilted seconds, late afternoon sunlight pouring in through pristine panels of glass when my mind is clamoring that it ought to be night. Passengers are herded like badly behaved cattle along moving walkways, the robotic woman’s voice telling us to please watch our step. The path curves, and soon the windows are replaced by gray walls and fluorescent lights. I continue to trudge forward, dragging my stubbornly lagging suitcase. On the walls are signs advertising caution about various strains of influenza.
At customs, placards hang from the ceiling, directing the flight crew to the right, followed by foreigners and tourists, with Japanese nationals and permanent residents filing to the far left. I take my place in the line to the left, feeling at once indignant and like an imposter. An anxious, scrambling feeling chases its tail under my collarbone. As I approach the sunken booth, I try to sound as local as I can, hoping that the country bumpkin slur of my words will score me a point in the invisible tally of Japaneseness I imagine each customs official keeping. I answer questions about where I am staying, why I am here. Images of the kerosene stove in my grandmother’s front room, my grandfather’s furled fists, their unruly garden — these blossom in my mind, a talisman of home to hold tightly under my breath. Believe me, I pray, believe that I belong here. Inside my backpack, I can feel my other passport, my other citizenship, pulsating like a treacherous living thing.
* * *
It is not lost on me that the language of citizenship traffics in metaphors of life and death, but delivers only promises and rumors. We are given weighty, destiny-scaled ultimatums, discussions of blood and soil evoking images of birth and death, sustenance and longevity. Identification implies belonging, our membership to a country playing on notions of larger, state-bound families. The nation is our mother. The nation is our father. In giving us the gift of citizenship, it has labored to give us life and will lay us weeping in the ground.
But in delivery, citizenship becomes elusive and hard to pin down. It is promised to us with outstretched arms, then snatched away with ease. We are assured home and kinship; we arrive to find an empty house. We are drawn to the visage of a guardian — “Give me your tired, your poor, your huddled masses yearning to breathe free” — but we are greeted by a ghost.
* * *
After finishing our breakfast at the donut shop, my mother and I take a cab to Logan Airport so she can catch her flight home to Chicago. When we arrive, I help her check in and walk her to the TSA-cordoned security area. She waves me away at the mouth of the line, the oblong maze of tangled tape empty at this apparently unpopular time to fly. “Go,” she says. I shake my head, watching her hoist her navy canvas bag over one shoulder, taking mincing steps through the open line in front of her. This shooing-and-staying, like the floor-washing, is another one of our family’s traditions. Whenever one of us leaves their home, whether it is in Japan or the U.S., whomever they are leaving staunchly refuses to leave the side of the security line until they can no longer see them. This staying put is an act of loyalty, of love, of claiming each other as our own. We are stating that no border crossing, no officialdom, no distance or space can slice its way through our bonds.
That day I watch my mother’s small body turn even smaller in the distance, and I feel a familiar animal anxiety dig its claws into my chest. Earlier that week, crowds of people poured into U.S. airports, protesting Donald Trump’s travel ban. Scenes of lobbies filled with protesters flooded televisions, mouths moving in angry unison on muted screens. Reports of families separated at customs, of loved ones canceling plans to visit their relatives in the U.S., patients unable to access American hospitals — these were the stories that dominated the news cycle.
Suddenly, as if someone had passed a transparency over my eyes, I see the TSA agent taking a closer look at my mother’s green card. I imagine his voice, meaty and rough when raised. I imagine my mother’s English, flattening as frustration crept into her voice. I imagine what I might do if someone emerged from the wings of the security booth to grab her by the arm, roughly escorting her to a private room. I wonder whether I would shout, run, or stay rooted to the spot. At least she would be OK in Japan, a small voice, at once guilty and relieved, says inside me.
My mother passes through the security checkpoint without incident. She waves from behind the metal detector, her hand cleaving a wide, swinging arc in the air.
* * *
Citizenship comes into sharp relief at the most important junctures of life. Two years after my mother’s visit to Boston, my now-husband and I go to the Cook County Clerk’s office, in Chicago, to obtain our marriage license. We are presented with a list of appropriate documents to prove our citizenship — driver’s licenses, passports, birth certificates. Above us, a looming sign proclaims: COOK COUNTY CLERK | BIRTH MARRIAGE DEATH. Birth, marriage, death: To be acknowledged, all these require proof of belonging to a nation. Plunking down my own driver’s license, I wonder what one does without the proper identification. A man ahead of us in line is turned away for not having the correct paperwork to claim his infant daughter’s birth certificate. Without the necessary government-issued credentials, no matter how strange it seemed, he could not receive proof that his daughter now existed outside the womb. Without citizenship, could you be born? Without it, could you die?
My wondering is of course born of a certain kind of privilege. Undocumented and stateless people know exactly what it is like to live without citizenship. People dear to me have struggled for acknowledgement in the eyes of a mercurial state, one that grants and revokes rights with the turn of an administration. In many ways I am lucky to be presented with the conundrum of citizenship only after 22 years of holding two passports. I have had not one but two homes.
* * *
On my most recent trip home to Japan, this time to celebrate my new marriage with my family, I exited the plane groggy and barely awake. I followed the familiar corridor, the paneled light flickering, the woman’s voice telling us to mind the gap. Passengers plodded on, all of us filing forward to customs, noting the warnings for newer, more varied strains of flu. This time, I did not take the far left lane. Instead, I entered the country for the first time on a U.S. passport, my lapsed Japanese one tucked in my backpack, safely away from questions of allegiance, loyalty, and citizenship. A small part of me was relieved to filter through the droning line of tourists, no need to prove my worthiness of entry to a stony-faced official. A larger part of me wallowed in a shallow sadness, as if a pale premonition of grief, suspecting that this might be the first step toward exile.
Why do you speak Japanese so well? the man at customs barked, suspicious. Because my mother is Japanese, I answered, the image of her running a rag over my Boston floors, the homes she has created the world over for us, blurring my vision. Is this your only passport? he jabbed a finger at my solitary blue book. Yes, I smiled, three red booklets pulsing against my back.
* * *
Nina Li Coomes is a Japanese and American writer from Nagoya and Chicago. Her work can be found in The Atlantic, EATER, Catapult and elsewhere.
The Stelton colony in central New Jersey was founded in 1915. Humble cottages (some little more than shacks) and a smattering of public buildings ranged over a 140-acre tract of scrubland a few miles north of New Brunswick. Unlike America’s better-known experimental settlements of the nineteenth century, Stelton was not a refuge for a devout religious sect but a hive of political radicals, where federal agents came snooping during the Red Scare of 1919-1920. But it was also a suburb, a community of people who moved out of the city for the sake of their children’s education and to enjoy a little land and peace. They were not even the first people to come to the area with the same idea: There was already a German socialist enclave nearby, called Fellowship Farm.
The founders of Stelton were anarchists. In the twenty-first century, the word “anarchism” evokes images of masked antifa facing off against neo-Nazis. What it meant in the early twentieth century was different, and not easily defined. The anarchist movement emerged in the mid-nineteenth century alongside Marxism, and the two were allied for a time before a decisive split in 1872. Anarchist leader Mikhail Bakunin rejected the authority of any state — even a worker-led state, as Marx envisioned — and therefore urged abstention from political engagement. Engels railed against this as a “swindle.”
But anarchism was less a coherent, unified ideology than a spectrum of overlapping beliefs, especially in the United States. Although some anarchists used violence to achieve their ends, like Leon Czolgosz, who assassinated President William McKinley in 1901, others opposed it. Many of the colonists at Stelton were influenced by the anarcho-pacifism of Leo Tolstoy and by the land-tax theory of Henry George. The most venerated hero was probably the Russian scientist-philosopher Peter Kropotkin, who argued that voluntary cooperation (“mutual aid”) was a fundamental drive of animals and humans, and opposed centralized government and state laws in favor of small, self-governing, voluntary associations such as communes and co-ops.
Since the move to Douglas, Arizona, Jennifer had spent less and less time at home. She was distant and irritable. Her anger encompassed her mother, her mother’s abusive boyfriend Saul, American schools, and the whole United States. At the nadir, she started lashing out at her sisters Aida and Cynthia. And then, in 1998 or 1999, she left for good.
The morning Jennifer ran away, Aida was the only other person home. She watched her sister dump schoolbooks from her backpack and replace them with clothes. She knew what was happening without having to ask and figured it was for the best. On the way out, Jennifer said that a friend would drive her across the border. After that, she’d see what happened.
I’d responded to the Author’s anonymous posting on Craigslist, and when I showed up to the interview, I still didn’t know who I’d be speaking with. I was 23, in grad school in New York, piecing together my rent with odd jobs. The month before, I’d replied to an equally opaque Craigslist ad and found myself wobbling over cobblestones in stilettos, club promoting for a man known to the Meatpacking District only as “Doc.” Doc had informed me that I was an “8” among regular girls, but in club world I was only a “4,” given my 5-foot-3-inch stature. He wondered: Did I have many girlfriends over 5-foot-11 I could bring around? They didn’t need to be actual models, just tall enough to be mistaken for models by drunk men from across dark, strobe-lit rooms. I needed a new job.
The Author shuffled into our interview at his Upper East Side apartment, his velvet slippers whispering against the Oriental rugs. He was pushing 80, a small man with bushy white eyebrows and a bulbous nose that pressed flat against his face. He had a full, pouty lower lip and a thin upper lip that curled under when he smiled.
The Author had been a staff writer at an iconic American magazine for three decades and had written a remarkable number of books, mostly memoirs. He’d been blind since early childhood, and while his is surely a story of overcoming great odds, the Author was notorious for his poor treatment of assistants. He actually alluded to this in our interview, telling me there were some unsavory rumors out there and not to believe a word of them. I was dubious but desperate for money. And there was a small part of me that hoped he’d softened with age. Or maybe that he’d sense some unfulfilled potential in me. That he’d treat me with the care one gives to a rare find — plucked from the detritus at a yard sale, snubbed by foolish bygone handlers.
The Author, his wife, and their two adult daughters went to their house on an island off the New England coast every August, and I was expected to go along. The only way onto the island was a 20-minute ferry ride from the nearest seaside town. One road ran through most of the 14-mile island, a hamlet of spruce tree forests and rolling pastures. The island was a private sanctuary for the Northeast’s inconspicuous elite, and on the drive from the ferry station, mansions flickered through the trees. The Author’s house was at the end of a short, wooded drive. He’d built it in the ’80s, with the help of a Modernist architect who’d designed a few New York skyscrapers. By the island’s standards, the house wasn’t sprawling or flashy, but it was distinctively lovely, perched on an embankment above the frigid harbor. Down the hill toward the beach was a pool and a pool house, tucked into an alcove of trees. Past the pool, a pathway cut through high grass and down to the rocky beachfront. I stayed in a spare basement bedroom, with a window that looked out onto the harbor. Their cook, a Brazilian woman in her 80s, slept in a room adjacent to mine.
It didn’t take long to realize that my presence was more a thing to be tolerated than embraced by the family. I wasn’t asked many questions about my life aside from those necessitated by politeness. And to be fair, I can’t imagine what it would be like for your most intimate family memories to include a revolving cast of paid help, always on their way somewhere else. Anyway, it seemed like I was mainly there to enable the Author’s wife and daughters not to be there, so he and I were often alone. My job title was “editorial assistant,” though the only editorial skill required was basic literacy. I read the New York Times aloud to the Author every morning, then we perused headlines from The Guardian. Then we responded to his emails, of which there were generally few of note. Then there was lunch, his nap, a walk, and an afternoon activity. Aside from the nap, we did everything together.
“The American work ethic, the motivation that drives Americans to work longer hours each week and more weeks each year than any of our economic peers, is a long-standing contributor to America’s success.” Thus reads the first sentence of a massive report the Trump administration released in July 2018. Americans’ drive to work ever harder, longer, and faster is at the heart of the American Dream: the idea, which has become more mythology than reality in a country with yawning income inequality and stagnating upward economic mobility, that if an American works hard enough she can attain her every desire. And we really try: We put in between 30 and 90 minutes more each day than the typical European. We work 400 hours more annually than the high-output Germans and clock more office time than even the work-obsessed Japanese.
The story of individual hard work is embedded into the very founding of our country, from the supposedly self-made, entrepreneurial Founding Fathers to the pioneers who plotted the United States’ western expansion; little do we acknowledge that the riches of this country were built on the backs of African slaves, many owned by the Founding Fathers themselves, whose descendants live under oppressive policies that continue to leave them with lower incomes and overall wealth and in greater poverty. We — the “we” who write the history books — would rather tell ourselves that the people who shaped our country did it through their own hard work and not by standing on the shoulders, or stepping on the necks, of others. It’s an easier story to live with. It’s one where the people with power and money have it because they deserve it, not because they took it, and where we each have an equal shot at doing the same.
Because for all our national pride in our puritanical work ethic, the ethic doesn’t apply evenly. At the highest income levels, wealthy Americans are making money passively, through investments and inheritances, and doing little of what most would consider “work.” Basic subsistence may soon be predicated on whether and how much a poor person works, while the rich count on tax credits and carve-outs designed to protect stockpiles of wealth created by money begetting itself. It’s the poor who are expected to work the hardest to prove that they are worthy of Americanness, or a helping hand, or humanity. At the same time, we idolize and imitate the rich. If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.
* * *
Trump has a long history of antipathy toward the poor, a group he conflates with “welfare,” a word he understands only as a pejorative. When he and his father were sued by the Department of Justice in 1973 for discriminating against black tenants in their real estate business, he shot back that he was being forced to rent to “welfare recipients.” Nearly 40 years later, he called President Obama “our Welfare & Food Stamp President,” saying he “doesn’t believe in work.” He wrote in his 2011 book Time To Get Tough, “There’s nothing ‘compassionate’ about allowing welfare dependency to be passed from generation to generation.”
Perhaps. But Trump certainly knows about relying on things passed from generation to generation. His self-styled origin story is that he got his start with a “small” $1 million loan from his real estate tycoon father, Fred C. Trump, which he used to grow his own empire. “I built what I built myself,” he has claimed. “I did it by working long hours, and working hard and working smart.”
It’s an interesting interpretation of “myself”: A New York Times investigation in October reported that, instead, Trump has received at least $413 million from his father’s businesses over the course of his life. “By age 3, Mr. Trump was earning $200,000 a year in today’s dollars from his father’s empire. He was a millionaire by age 8. By the time he was 17, his father had given him part ownership of a 52-unit apartment building,” reporters David Barstow, Susanne Craig, and Russ Buettner wrote. “Soon after Mr. Trump graduated from college, he was receiving the equivalent of $1 million a year from his father. The money increased with the years, to more than $5 million annually in his 40s and 50s.” The Times found 295 different streams of revenue Fred created to enrich his son — loans that weren’t repaid, three trust funds, shares in partnerships, lump-sum gifts — much of it further inflated by reducing how much went to the government. Donald and his siblings helped their parents dodge taxes with sham corporations, improper deductions, and undervalued assets, helping evade levies on gifts and inheritances.
Even the money that was made squarely owed a debt to the government. Fred Trump nimbly rode the rising wave of federal spending on housing that began with the New Deal and continued with the G.I. Bill. “Fred Trump would become a millionaire many times over by making himself one of the nation’s largest recipients of cheap government-backed building loans,” the Times reported. Donald carried on this tradition of milking government subsidies to accumulate fortunes. He obtained at least $885 million in perfectly legal grants, subsidies, and tax breaks from New York to build his real estate business.
Someone could have taken this largesse and worked hard to grow it into something more, but Donald Trump was not that someone. Much of his fortune comes not from the down and dirty work of running businesses, but from slapping his name on everything from golf courses to steaks. Many of these deals entail merely licensing his name while a developer actually runs things. And as president, he still doesn’t seem inclined to clock much time doing actual work.
That hasn’t stopped him from putting work at the center of his administration’s poverty-related policies. In its lengthy tome, the White House Council of Economic Advisers argued for adding work requirements to a new universe of public benefits. These requirements, which up until the Trump administration only existed for direct cash assistance and food stamps, require a recipient not just to put in a certain number of hours at a job or some other qualifying activity, but to amass paperwork to prove those hours each month. The CEA report is focused, supposedly, on “the importance and dignity of work.” But the benefits of engaging in labor are only deemed important for a particular population: “welfare recipients who society expects to work.” Over and over, it takes for granted that our country only expects the poorest to work in order to prove themselves worthy of government funds, specifically targeting those who get food stamps to feed their families, housing assistance to keep roofs over their heads, and Medicaid to stay healthy.
* * *
The report doesn’t just represent an ethos in the administration; it was also a justification for concrete actions it had already taken and more it would soon roll out. Last April, Trump signed an executive order directing federal agencies to review public assistance programs to see whether they could impose work requirements unilaterally, to “ensure that they are consistent with principles that are central to the American spirit — work, free enterprise, and safeguarding human and economic resources,” as the document states, while also “reserving public assistance programs for those who are truly in need.”
The administration has also pushed forward on its own. In 2017, it announced that states could apply for waivers that would allow them to implement work requirements in Medicaid for the first time, and so far more than a dozen states have taken it up on the offer, with Arkansas’s rule in effect since June 2018. (It has now been halted by a federal judge.) In that state, Medicaid recipients had to spend 80 hours a month at work, school, or volunteering, and report those activities to the government in order to keep getting health insurance. And in April 2018, Housing and Urban Development Secretary Ben Carson unveiled a proposal to let housing authorities implement work requirements for public housing residents and rental assistance recipients. Trump pushed Congress to include more stringent work requirements in the food stamp program as it debated the most recent farm bill, arguing it would “get America back to work.” When that effort failed, the Agriculture Department turned around and proposed a rule to impose the requirements by itself.
These aren’t fiscal necessities; they’re crackdowns on the poor, justified by the idea that the poor should prove themselves worthy of the benefits that help them survive. Such requirements are not just cruel but out of step with real life. Most people who turn to public programs already work, and those who don’t often have good reason. More than 60 percent of people on Medicaid are working. They remain on Medicaid because their pay isn’t enough to keep them out of poverty, and many of the low-wage jobs they work don’t offer health insurance they can afford. Of those not working, most either have a physical impairment or conflicting responsibilities like school or caregiving.
Enrollment in food stamps tells the same story. Among the “work-capable” adults on food stamps, about two thirds work at some point during the year, while 84 percent live in a household where someone works. But low-wage work is often chaotic and unpredictable. Recipients are more likely to turn to food stamps during a spell of unemployment or too few hours, then stop when they resume steadier employment. Many of those who are supposedly capable of work but don’t have a job have a health barrier or live with someone who has one; they’re in school, they’re caring for family, or they just can’t find work in their community.
Work requirements, then, fail to account for the reality of poor people’s lives. It’s not that there’s a widespread lack of work ethic among people who earn the least, but that there’s a lack of steady pay and consistent opportunities that allow someone to sustain herself and her family without assistance. We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.
If this country were so concerned about helping people who might face barriers to working get jobs, we might not be the second-lowest among OECD member countries by percentage of GDP spent on labor-market programs like job-search assistance or retraining. The poor in particular face barriers like a lack of affordable childcare and reliable transportation, and could use education or training to reach for better-paid, more meaningful work. But we do little to extend these supports. Instead, we chastise them for not pulling on their frayed bootstraps hard enough.
We also seem content with the notion that a person who doesn’t work — either out of inability or refusal — doesn’t deserve the building blocks of staying alive. The programs Trump is targeting, after all, are about basic needs: housing to stay safe from the elements, food to keep from going hungry, healthcare to receive treatment and avoid dying of neglect. Even if it were true that there was a horde of poor people refusing to work, do we want to condemn them to starvation and likely death? In one of the world’s richest countries, do we really balk at spending money on keeping our people — even lazy ones — alive?
Plenty of other countries don’t make survival contingent on work. Single mothers experience higher rates of destitution than coupled parents or people without children all over the world. But the higher poverty rate in the U.S. as compared to other developed countries isn’t because we have more single mothers; it’s because we do so little to help them. Compare us to Denmark, which gives parents unconditional cash benefits for each of their children regardless of whether or how much they work, on top of generously subsidizing childcare, offering universal health coverage, and guaranteeing paid leave. It’s no coincidence that Denmark also has a lower poverty rate, both generally and for single mothers specifically. A recent examination of poverty across countries found that children are at higher risk in the U.S. because we have a sparse social safety net that’s so closely tied to demanding that people work. It makes us an international outlier, the world’s miser that only opens a clenched fist to the poor if they’re willing to demonstrate their worthiness first.
Here, too, America’s history of slavery and ongoing racism rears its head. According to a trio of renowned economists, we don’t have a European-style social safety net because “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” White people turn against funding public benefit programs when they feel their racial status threatened, particularly benefits they (falsely) believe mainly accrue to black people. The black poor are seen as the most undeserving of help and most in need of proving their worthiness to get it. States with larger percentages of black residents, for example, focus less on TANF’s goal of providing cash to the needy and have stingier benefits with higher hurdles to enrollment.
* * *
The CEA’s report on work requirements claimed that being an adult who doesn’t work is particularly prevalent among “those living in low-income households.” But that’s debatable. The more income someone has, the less likely he is to be getting it from wages. In 2012, those earning less than $25,000 a year made nearly three quarters of that money from a job. Those making more than $10 million, on the other hand, made about half of their money from capital gains — in other words, returns on investments. The bottom half of the country has, on average, just $826 in income from capital investments each; the average for those in the top 1 percent is more than $16 million.
The richest are the least likely to have their money come from hard labor — yet there’s no moral panic over whether they’re coddled or lacking in self-reliance. Instead, government benefits help the rich protect and grow idle wealth. Capital gains and dividends are taxed at a lower rate than regular salaried income. Inheritances were taxed at an average rate of 4 percent in 2009, compared to the average rate of 18 percent for money earned by working and saving. When investments are bequeathed, the recipient owes no taxes on any asset appreciation.
In fact, government tax benefits that increase people’s take-home money at the expense of what the government collects for its own coffers overwhelmingly benefit the rich over the poor (or even the middle class). More than 60 percent of the roughly $900 billion in annual tax expenditures goes to the richest 20 percent of American families. That figure dwarfs what the government expends on many public benefit programs. The government spends more than three times as much on tax subsidies for homeowners, mostly captured by the well-to-do, as it does on rental assistance for the poor. The three benefit programs the Trump administration is concerned with — Medicaid, food stamps, and housing assistance — come to about $705 billion in combined spending.
While the administration has been concerned with what it can do to compel the poor to work, it’s handed out more largesse to the idle rich. Its signature tax-cut package, the Tax Cuts and Jobs Act, offered an extra cut for so-called “pass-through” businesses, like law or real estate firms. But the fine print included a wrinkle: If someone is considered actively involved in his pass-through business, only 30 percent of his earnings could qualify for the new discount. If someone is passively involved, however — a shareholder who has little to do with the day-to-day work of the company — then he gets 100 percent of the new benefit.
Then there’s the law’s significant lowering of the estate tax. The tax is levied on only the biggest, most valuable inheritances passed down from wealthy parent to newly wealthy child. Before the Republicans’ tax bill, only the richest 0.2 percent of estates had to pay the tax when fortunes changed hands. Now it’s just the richest 0.1 percent, or a mere 1,800 very wealthy families worth more than $22 million. The rest get to pass money to their heirs tax-free. Those who do pay it will be paying less when tax time comes due — $4.4 million less, to be exact.
Despite the Republican rhetoric that lowering the estate tax is about saving family farms, it’s really about allowing an aristocracy to calcify — one in which rich parents ensure their children are rich before they lift a single finger in work. As those heirs receive their fortunes, they also receive the blessing that comes with riches: the halo of success and, therefore, deservedness without having to work to prove it. Yet there’s evidence that increasing taxes on inheritances has the potentially salutary effect of getting heirs to work more. The more their inheritances are taxed, the more they end up paying in labor taxes — evidence that they’re working harder for their livings, not just coasting on generational wealth. Perhaps our tax code could encourage rich heirs to experience the dignity of work.
* * *
Trump’s CEA report is accurate about at least one thing: Our country has a history of only offering public benefits to the poor either deemed worthy through their work or exempt through old age or disability. An outlier was the Aid to Families with Dependent Children program, which became Temporary Assistance for Needy Families after Bill Clinton signed welfare reform into law in the ’90s. But the 1996 transformation of the program took what was a promise of cash for poor mothers and changed it into an obstacle course of proving a mother’s worth before she can get anywhere close to a check. It paved the way for the current administration’s obsession with work requirements.
Largesse for the rich, on the other hand, has rarely included such tests. No one has been made to pee in a cup for tax breaks on their mortgages, which cost as much as the food stamp program but overwhelmingly benefit families that earn more than $100,000. No one has had to prove a certain number of work hours to get a lower tax rate on investment income or an inheritance. They get that discount on their money without having to do any work at all.
We haven’t always been so extreme in our dichotomous treatment of the rich and poor; throughout the 1940s, ’50s, and ’60s, we coupled high marginal taxes on the wealthy with a minimum wage that ensured that people who put in full-time work could rise out of poverty. The estate tax has been as high as 77 percent. As Dutch historian Rutger Bregman recently told an audience of the ultrawealthy at Davos, we’re living proof that high taxes can spread shared prosperity. “The United States, that’s where it has actually worked, in the 1950s, during Republican President Eisenhower,” he pointed out. “This is not rocket science.” It was during the same era that we also created significant anti-poverty programs such as Social Security, Medicare, and Medicaid. In fact, this country pioneered the idea of progressive taxation and has always had some form of tax on inheritance to avoid creating an aristocracy. But we’ve papered over that history as tax rates have cratered and poverty has climbed.
Instead, as Reaganomics and neoliberal ideas took hold of our politics, we turned back to the Horatio Alger myth that success is attained on an individual basis by hard work alone, and that riches are the proof of a dogged drive. Lower tax rates naturally follow under the theory that the rich should keep more of their deserved bounty. And if you’re poor, coming to the government seeking a helping hand up, you failed.
The country is due for a reckoning with our obsession with work. There are certainly financial and emotional benefits that come from having a job. But why are we only concerned with whether the poor reap those benefits? Is working ourselves to the bone the best signifier of our worth — and are there basic elements of life that we should guarantee regardless of work? It doesn’t mean dropping all emphasis on work ethic. But it does require a deeper examination of who we expect to work — and why.
* * *
Bryce Covert is an independent journalist writing about the economy and a contributing op-ed writer at The New York Times.
Soraya Roberts | Longreads | April 2019 | 9 minutes (2,392 words)
We all know the stats: that by 2030 the richest 1 percent could be hoarding two-thirds of the world’s wealth. Tax the rich! Redistribute to the poor! It’s the kind of thing you hear lately set to some lame music in a weirdly cut NowThis News video of Alexandria Ocasio-Cortez or Rutger Bregman. (It’s always some scrappy progressive, not some bloated billionaire because, I don’t know, *yawns, eats some cake.*) Perhaps the rich will be moved by the fact that income inequality is not only bad for the collective mental health, but their own? No? That the 10 percent’s multiplying accessories — private jets and yachts and enormous holiday homes — hog nearly half the world’s emissions, killing the earth we all share? No? Nothing? What’s that you say, infrastructure investment started plummeting just as inequality began rising? But all the philanthropy! Which, sure, America’s largest donors may give a little more than before, but they also make way more than they used to. And as Jacobin magazine recently noted, “those nations — mostly in Scandinavia — that have the highest levels of equality and social well-being have the tiniest philanthropic sectors.” When you have equality, you don’t need long Greek words.
To recognize this, as a rich person, you need to have a sort of reverse double consciousness. “Double consciousness” originates with W. E. B. Du Bois, one of the founders of the NAACP, who coined it in 1897 as one way to describe the experience of being an African American in a white supremacist world. In The Atlantic Monthly he defined it as, “…this sense of always looking at one’s self through the eyes of others….” The concept is based on being oppressed. What I’m talking about is an inverted version based on being the oppressor. It is the recognition that not only do you have outsized means, but that they come at the expense of others. It requires not only self-awareness, but other-awareness, and it’s a prerequisite for change.
Roy Disney’s granddaughter, Abigail, for instance, has given $70 million away over the past four decades, which is more than she ever inherited. “The problem is that there’s a systematic favoring of people who have accumulated an enormous amount of wealth,” she tweeted after a viral appearance on CNBC last month in which she said CEOs were overpaid. “The U.S. must make structural changes by taxing the wealthy.” To say that, she had to have had some kind of awakening — but what was it? In her case it was a sudden burst of extraordinary wealth and its human toll — not on others, but on the wealthy themselves. In 1984, when the heiress was in college, Michael Eisner became the chairman and CEO of Disney and launched its stocks into the stratosphere. Abigail’s father embraced the excess income — the too-big private jet, the too-much drinking — and no one questioned him, not even about his alcoholism. “That’s when I feel that my dad really lost his way in life. And that’s why I feel hyperconscious about what wealth does to people,” she recently told The Cut. “I lived in one family as a child, and then I didn’t even recognize the family as I got older.”
It’s hard to know what would be a good place from which to imagine a future of bad smells and no privacy, deceit and propaganda, poverty and torture. Does a writer need to live in misery and ugliness to conjure up a dystopia?
Apparently not.
We’d been walking more than an hour. The road was two tracks of pebbled dirt separated by a strip of grass. The land was treeless as prairie, with wildflowers and the seedless tops of last year’s grass smudging the new growth.
We rounded a curve and looked down a hillside to the sea. A half mile in the distance, far back from the water, was a white house with three dormer windows. Behind it, a stone wall cut a diagonal to the water like a seam stitching mismatched pieces of green velvet. Far to the right, a boat moved along the shore, its sail as bright as the house.
This was where George Orwell wrote Nineteen Eighty-Four. The house, called Barnhill, sits near the northern end of Jura, an island off Scotland’s west coast in the Inner Hebrides. It was June 2, sunny, short-sleeve warm, with the midges barely out, and couldn’t have been more beautiful.
Orwell lived here for parts of the last three years of his life. He left periodically (mostly in the winter) to do journalism in London and, for seven months in 1947 and 1948, to undergo treatment for pulmonary tuberculosis. Although he rented Barnhill and didn’t own it, he put in fruit trees and a garden, built a chicken house, bought a truck and a boat, and invested numberless hours of labor in what he believed would be his permanent home. When he left it for the last time, in January 1949, he never again lived outside a sanatorium or hospital.
I came to Jura after a two-week backpacking trip across Scotland. My purpose was to drink single-malt on Islay, the island to the south, and enjoy two nights of indulgence at Ardlussa House, where Orwell’s landlord had lived. I was not on a literary pilgrimage. Barnhill is not open to the public, and no one among the island’s 235 residents remembers Orwell.