
The Man Who’s Going to Save Your Neighborhood Grocery Store

Illustration by Vinnie Neuberg

Joe Fassler | The Counter & Longreads | April 2019 | 8,802 words (33 minutes)

This story is published in partnership with The Counter, with reporting supported by the 11th Hour Food and Farming Fellowship at the University of California, Berkeley.  


In 2014, Rich Niemann, president and CEO of the Midwestern grocery company Niemann Foods, made the most important phone call of his career. He dialed the Los Angeles office of Shook Kelley, an architectural design firm, and admitted he saw no future in the traditional grocery business. He was ready to put aside a century of family knowledge, throw away all his assumptions, completely rethink his brand and strategy — whatever it would take to carry Niemann Foods deep into the 21st century.

“I need a last great hope strategy,” he told Kevin Kelley, the firm’s cofounder and principal. “I need a white knight.”

Part square-jawed cattle rancher, part folksy CEO, Niemann is the last person you’d expect to ask for a fresh start. He’s spent his whole life in the business, transforming the grocery chain his grandfather founded in 1917 into a regional powerhouse with more than 100 supermarkets and convenience stores across four states. In 2014, he was elected chair of the National Grocers Association. It’s probably fair to say no one alive knows how to run a grocery store better than Rich Niemann. Yet Niemann was no longer sure the future had a place for stores like his.

He was right to be worried. The traditional American supermarket is dying. It’s not just Amazon’s purchase of Whole Foods, an acquisition that trade publication Supermarket News says marked “a new era” for the grocery business — or the fact that Amazon hopes to launch a second new grocery chain in 2019, according to a recent report from The Wall Street Journal, with a potential plan to scale quickly by buying up floundering supermarkets. Even in plush times, grocery is a classic “red ocean” industry, highly undifferentiated and intensely competitive. (The term summons the image of a sea stained with the gore of countless skirmishes.) Now, the industry’s stodgy old playbook of “buy one, get one” sales and coupons in the weekly circular is hurtling toward obsolescence. And with new ways to sell food ascendant, legacy grocers like Rich Niemann are failing to bring back the customers they once took for granted. You no longer need grocery stores to buy groceries.

Niemann hired Kelley in the context of this imminent doom. The assignment: to conceive, design, and build the grocery store of the future. Niemann was ready to entertain any idea and invest heavily. And for Kelley, a man who’s worked for decades honing his vision for what the grocery store should do and be, it was the opportunity of a lifetime: carte blanche to build the working model he’s long envisioned, one he believes can save the neighborhood supermarket from obscurity.

Kevin Kelley, illustration by Vinnie Neuberg

Rich Niemann, illustration by Vinnie Neuberg

The store that resulted, Harvest Market, opened in 2016. It’s south of downtown Champaign, Illinois, out by the car dealerships and strip malls; 58,000 square feet of floor space mostly housed inside a huge, high-ceilinged glass barn. Its bulk calls to mind both the arch of a hayloft and the heavenward jut of a church. But you could also say it’s shaped like an ark, because it’s meant to survive an apocalypse.

Harvest Market is the anti-Amazon. It’s designed to excel at what e-commerce can’t do: convene people over the mouth-watering appeal of prize ingredients and freshly prepared food. The proportion of groceries sold online is expected to swell over the next five or six years, but Harvest is a bet that behavioral psychology, spatial design, and narrative panache can get people excited about supermarkets again. Kelley isn’t asking grocers to be more like Jeff Bezos or Sam Walton. He’s not asking them to be ruthless, race-to-the-bottom merchants. In fact, he thinks that grocery stores can be something far greater than we ever imagined: a place where farmers and their urban customers can meet, a crucial link between the city and the country.

But first, if they’re going to survive, Kelley says, grocers need to start thinking like Alfred Hitchcock.

* * *

Kevin Kelley is an athletic-looking man in his mid-50s, with a piercing hazel gaze that radiates thoughtful intensity. In the morning, he often bikes two miles to Shook Kelley’s office in Hollywood — a rehabbed former film production studio on an unremarkable stretch of Melrose Avenue, nestled between Bogie’s Liquors and a driving school. Four nights a week, he visits a boxing gym to practice Muay Thai, a form of martial arts sometimes called “the art of eight limbs” for the way it combines fist, elbow, knee, and shin attacks. “Martial arts,” Kelley tells me, “are a framework for handling the unexpected.” That’s not so different from his main mission in life: He helps grocery stores develop frameworks for the unexpected, too.

You’ve never heard of him, but then it’s his job to be invisible. Kelley calls himself a supermarket ghostwriter: His contributions are felt more than seen, and the brands that hire him get all the credit. Countless Americans have interacted with his work in intimate ways, but will never know his name. Such is the thankless lot of the supermarket architect.

A film buff equally fascinated by advertising and the psychology of religion, Kelley has radical theories about how grocery stores should be built, theories that involve terms like “emotional opportunity,” “brain activity,” “climax,” and “mise-en-scène.” But before he can talk to grocers about those concepts, he has to convince them of something far more elemental: that their businesses face near-certain annihilation and must change fundamentally to avoid going extinct.

“It is the most daunting feeling when you go to a grocery store chain, and you meet with these starched-white-shirt executives,” Kelley tells me. “When we get a new job, we sit around this table; we do it twenty, thirty times a year. Old men, generally. Don’t love food, progressive food. Just love their old food like Archie Bunkers, essentially. You meet these people and then you tour their stores. Then I’ve got to go convince Archie Bunker that there’s something called emotions, that there are these ideas about branding and feeling. It is a crazy assignment. I can’t get them to forget that they’re no longer in a situation where they’ve got plenty of customers. That it’s do-or-die time now.”

Forget branding. Forget sales. Kelley’s main challenge is redirecting the attention of older male executives, scared of the future and yet stuck in their ways, to the things that really matter.

“I make my living convincing male skeptics of the power of emotions,” he says.

Human beings, it turns out, aren’t very good at avoiding large-scale disaster. As you read this, the climate is changing, thanks to the destructively planet-altering activities of our species. The past four years have been the hottest on record. If the trend continues — and virtually all experts agree it will — we’re likely to experience mass disruptions on a scale never before seen in human history. Drought will be epidemic. The ocean will acidify. Islands will be swallowed by the sea. People could be displaced by the millions, creating a new generation of climate refugees. And all because we didn’t move quickly enough when we still had time.

You know this already. But I bet you’re not doing much about it — not enough, at least, to help avert catastrophe. I’ll bet your approach looks a lot like mine: worry too much, accomplish too little. The sheer size of the problem is paralyzing. Vast, systemic challenges tend to short-circuit our primate brains. So we go on, as the grim future bears down.

Grocers, in their own workaday way, fall prey to the same inertia. They got used to an environment of relative stability. They don’t know how to prepare for an uncertain future. And they can’t force themselves to behave as if the good times are really going to go away — even if, deep down, they know it’s true.

I make my living convincing male skeptics of the power of emotions.

In the 1980s, you could still visit almost any community in the U.S. and find a thriving supermarket. Typically, it would be a dynasty family grocery store, one that had been in business for a few generations. Larger markets usually had two or three players, small chains that sorted themselves out along socioeconomic lines: fancy, middlebrow, thrifty. Competition was slack and demand — this is the beautiful thing about selling food — never waned. For decades, times were good in the grocery business. Roads and schools were named after local supermarket moguls, who often chaired their local chambers of commerce. “When you have that much demand, and not much competition, nothing gets tested. Kind of like a country with a military that really doesn’t know whether their bullets work,” Kelley says. “They’d never really been in a dogfight.”

It’s hard to believe now, but there was not a single Walmart on the West Coast until 1990. That decade saw the birth of the “hypermarket” and the beginning of the end for traditional grocery stores — Walmarts, Costcos, and Kmarts became the first aggressive competition supermarkets ever really faced, luring customers in with the promise of one-stop shopping on everything from Discmen to watermelon.

The other bright red flag: Americans started cooking at home less and eating out more. In 2010, Americans dined out more than in for the first time on record, the culmination of a slow shift away from home cooking that had been going on since at least the 1960s. That trend is likely to continue. According to a 2017 report from the USDA’s Economic Research Service, millennials shop at food stores less than any other age group, spend less time preparing food, and are more likely to eat carry-out, delivery, or fast food even when they do eat at home. But even within the shrinking market for groceries, competition has stiffened. Retailers not known for selling food increasingly specialize in it, a phenomenon called “channel blurring”; today, pharmacies like CVS sell pantry staples and packaged foods, while dollar stores like Dollar General are a primary source of groceries for a growing number of Americans. Then there’s e-commerce. Though only about 3 percent of groceries are currently bought online, that figure could rocket to 20 percent by 2025. From subscription meal-kit services like Blue Apron to online markets like FreshDirect and Amazon Fresh, shopping for food has become an increasingly digital endeavor — one that sidesteps traditional grocery stores entirely.

A cursory glance might suggest grocery stores are in no immediate danger. According to the data analytics company Inmar, traditional supermarkets still have a 44.6 percent market share among brick-and-mortar food retailers. And though a spate of bankruptcies has recently hit the news, there are actually more grocery stores today than there were in 2005. Compared to many industries — internet service, for example — the grocery industry is still a diverse, highly varied ecosystem. Forty-three percent of grocery companies have fewer than four stores, according to a recent USDA report. These independent stores sold 11 percent of the nation’s groceries in 2015, a larger collective market share than successful chains like Albertson’s (4.5 percent), Publix (2.25 percent), and Whole Foods (1.2 percent).

But looking at this snapshot without context is misleading — a little like saying that the earth can’t be warming because it’s snowing outside. Not long ago, grocery stores sold the vast majority of the food that was prepared and eaten at home — about 90 percent in 1988, according to Inmar. Today, their market share has fallen by more than half, even as groceries represent a diminished proportion of overall food sold. Their slice of the pie is steadily shrinking, as is the pie itself.

By 2025, the thinking goes, most Americans will rarely enter a grocery store. That’s according to a report called “Surviving the Brave New World of Food Retailing,” published by the Coca-Cola Retailing Research Council — a think tank sponsored by the soft drink giant to help retailers prepare for major changes. The report describes a retail marketplace in the throes of massive change, where supermarkets as we know them are functionally obsolete. Disposables and nonperishables, from paper towels to laundry detergent and peanut butter, will replenish themselves automatically, thanks to smart-home sensors that reorder when supplies are low. Online recipes from publishers like Epicurious will sync directly to digital shopping carts operated by e-retailers like Amazon. Impulse buys and last-minute errands will be fulfilled via Instacart and whisked over in self-driving Ubers. In other words, food — for the most part — will be controlled by a small handful of powerful tech companies.

The Coca-Cola report, written in consultation with a handful of influential grocery executives, including Rich Niemann, acknowledges that the challenges are dire. To remain relevant, it concludes, supermarkets will need to become more like tech platforms: develop a “robust set of e-commerce capabilities,” take “a mobile-first approach,” and leverage “enhanced digital assets.” They’ll need infrastructure for “click and collect” purchasing, allowing customers to order online and pick up in a jiffy. They’ll want to establish a social media presence, as well as a “chatbot strategy.” In short, they’ll need to become Amazon, and they’ll need to do it all while competing with Walmart — and its e-commerce platform, Jet.com — on convenience and price.

That’s why Amazon’s acquisition of Whole Foods Market was terrifying to so many grocers, sending the stocks of national chains like Kroger tumbling: It represents a future they can’t really compete in. Since August 2017, Amazon has masterfully integrated e-commerce and physical shopping, creating a muscular hybrid that represents an existential threat to traditional grocery stores. The acquisition was partially a real estate play: Whole Foods stores with Prime lockers now act as a convenient pickup depot for Amazon goods. But Amazon’s also doing its best to make it too expensive and inconvenient for its Prime members, who pay $119 a year for free two-day shipping and a host of other perks, to shop anywhere else. Prime members receive additional 10 percent discounts on select goods at Whole Foods, and Amazon is rolling out home grocery delivery in select areas. With the Whole Foods acquisition, then, Amazon cornered two markets: the thrift-driven world of e-commerce and the pleasure-seeking universe of high-end grocery. Order dish soap and paper towels in bulk on Amazon, and pick them up at Whole Foods with your grass-fed steak.

Traditional grocers are now expected to offer the same combination of convenience, flexibility, selection, and value. They’re understandably terrified by this scenario, which would require fundamental, complex, and very expensive changes. And Kelley is terrified of it, too, though for a different reason: He simply thinks it won’t work. In his view, supermarkets will never beat Walmart and Amazon at what they do best. If they try to succeed by that strategy alone, they’ll fail. That prospect keeps Kelley up at night because it could mean a highly consolidated marketplace overseen by just a handful of players, one in stark contrast to the regional, highly varied food retail landscape America enjoyed throughout the 20th century.

“I’m afraid of what could happen if Walmart and Amazon and Lidl are running our food system, the players trying to get everything down to the lowest price possible,” he tells me. “What gives me hope is the upstarts who will do the opposite. Who aren’t going to sell convenience or efficiency, but fidelity.”

The approach Kelley’s suggesting still means completely overhauling everything, with no guarantee of success. It’s a strategy that’s decidedly low-tech, though it’s no less radical. It’s more about people than new platforms. It means making grocery shopping more like going to the movies.

* * *

Nobody grows up daydreaming about designing grocery stores, including Kelley. As a student at the University of North Carolina at Charlotte, he was just like every other architect-in-training: He wanted to be a figure like Frank Gehry, building celebrated skyscrapers and cultural centers. But he came to feel dissatisfied with the culture of his profession. In his view, architects coldly fixate on the aesthetics of buildings and aren’t concerned enough with the people inside.

“Architecture worships objects, and Capital-A architects are object makers,” Kelley tells me. “They aren’t trying to fix social issues. People and their experience and their perceptions and behaviors don’t matter to them. They don’t even really want people in their photographs—or if they have to, they’ll blur them out.” What interested Kelley most was how people would use his buildings, not how the structures would fit into the skyline. He wanted to shape spaces in ways that could actually affect our emotions and personalities, bringing out the better angels of our nature. To his surprise, no one had really quantified a set of rules for how environment could influence behavior. Wasn’t it strange that advertising agencies spent so much time thinking about the links between storytelling, emotions, and decision-making — while commercial spaces, the places where we actually go to buy, often had no design principle beyond brute utility?

“My ultimate goal was to create a truly multidisciplinary firm that was comprised of designers, social scientists and marketing types,” he says. “It was so unorthodox and so bizarrely new in terms of approach that everyone thought I was crazy.”

In 1992, when he was 28, Kelley cofounded Shook Kelley with the Charlotte, North Carolina–based architect and urban planner Terry Shook. Their idea was to offer a suite of services that bridged social science, branding, and design, a new field they called “perception management.” They were convinced space could be used to manage emotion, just the way cinema leads us through a guided sequence of feelings, and wanted to turn that abstract idea into actionable principles. While Shook focused on bigger, community-oriented spaces like downtown centers and malls, Kelley focused on the smaller, everyday commercial spaces overlooked by fancy architecture firms: dry cleaners, convenience stores, eateries, bars. One avant-garde restaurant Kelley designed in Charlotte, called Props, was an homage to the sitcom craze of the 1990s. It was built to look like a series of living rooms, based on the apartment scenes in shows like Seinfeld and Friends, and featured couches and easy chairs instead of dining tables to encourage guests to mingle during dinner.

The shift to grocery stores didn’t happen until a few years later, almost by accident. In the mid-’90s, Americans still spent about 55 percent of their food dollars on meals eaten at home — but that share was declining quickly enough to concern top corporate brass at Harris Teeter, a Charlotte, North Carolina–based grocery chain with stores throughout the Southeastern United States. (Today, Harris Teeter is owned by Kroger, the country’s second-largest seller of groceries behind Walmart.) Harris Teeter execs reached out to Shook Kelley. “We hear you’re good with design, and you’re good with food,” Kelley remembers Harris Teeter reps saying. “Maybe you could help us.”

At first, it was Terry Shook’s account. He rebuilt each section of the store into a distinct “scene” that reinforced the themes and aesthetics of the type of food it sold. The deli counter became a mocked-up urban delicatessen, complete with awning and neon sign. The produce section resembled a roadside farmstand. The dairy cases were corrugated steel silos, emblazoned with the logo of a local milk supplier. And he introduced full-service cafés, a novelty for grocery stores at the time, with chrome siding like a vintage diner. It was pioneering work, winning that year’s Outstanding Achievement Award from the International Interior Design Association — according to Kelley, it was the first time the prestigious award had ever been given to a grocery store.

Shook backed off of grocery stores after launching the new Harris Teeter, but the experience sparked Kelley’s lifelong fascination with grocery stores, which he realized were ideal proving grounds for his ideas about design and behavior. Supermarkets contain thousands of products, and consumers make dozens of decisions inside them — decisions about health, safety, family, and tradition that get to the core of who they are. He largely took over the Harris Teeter account and redesigned nearly 100 of the chain’s stores, work that would go on to influence the way the industry saw itself and ultimately change the way stores are built and navigated.

Since then, Kelley has worked to show grocery stores that they don’t have to worship at the altar of supply-side economics. He urges grocers to appeal instead to our humanity. Kelley asks them to think more imaginatively about their stores, using physical space to evoke nostalgia, delight our senses, and appeal to the parts of us motivated by something bigger and more generous than plain old thrift. Shopping, for him, is all about navigating our personal hopes and fears, and grocery stores will only succeed when they play to those emotions.

When it works, the results are dramatic. Between 2003 and 2007, Whole Foods hired Shook Kelley for brand strategy and store design, working with the firm throughout a crucial period of the chain’s development. The fear was that as Whole Foods grew, its image would become too diffuse, harder to differentiate from other health food stores; at the same time, the company wanted to attract more mainstream shoppers. Kelley’s team was tasked with finding new ways to telegraph the brand’s singular value. Their solution was a hierarchical system of signage that would streamline the store’s crowded field of competing health and wellness claims.

Kelley’s view is that most grocery stores are “addicted” to signage, cramming their spaces with so many pricing details, promotions, navigational signs, ads, and brand assets that it “functionally shuts down [the customer’s] ability to digest the information in front of them.”

Kelley’s team stipulated that Whole Foods could only have seven layers of information, which ranged from evocative signage 60 feet away to descriptive displays six feet from customers to promotional info just six inches from their hands. Everything else was “noise,” and jettisoned from the stores entirely. If you’ve ever shopped at Whole Foods, you probably recognize the way that the store’s particular brand of feel-good, hippie sanctimony seems to permeate your consciousness at every turn. Kelley helped invent that. The system he created for pilot stores in Princeton, New Jersey, and Louisville, Kentucky, was scaled throughout the chain and is still in use today, he says. (Whole Foods did not respond to requests for comment for this story.)

With a carefully delineated set of core values guiding its purchasing and brand, Whole Foods was ripe for the kind of visual overhaul Kelley specializes in. But most regional grocery chains have a different set of problems: They don’t really have values to telegraph in the first place. Shook Kelley’s approach is about getting buttoned-down grocers to reflect on their beliefs, tapping into deeper, more primal reasons for wanting to sell food.

* * *

Today, Kelley and his team have developed a playbook for clients, a finely tuned process to get shoppers to think in terms that go beyond bargain-hunting. It embraces what he calls “the theater of retail” and draws inspiration from an unlikely place: the emotionally laden visual language of cinema. His goal is to convince grocers to stop thinking like Willy Loman — like depressed, dejected salesmen forever peddling broken-down goods, fixated on the past and losing touch with the present. In order to survive, Kelley says, grocers can’t be satisfied with providing a place to complete a chore. They’ll need to direct an experience.


Today’s successful retail brands establish what Kelley calls a “brand realm,” or what screenwriters would call a story’s “setting.” We don’t usually think consciously about them, but realms subtly shape our attitude toward shopping the same way the foggy, noirishly lit streets in a Batman movie tell us something about Gotham City. Cracker Barrel is set in a nostalgic rural house. Urban Outfitters is set on a graffitied urban street. Tommy Bahama takes place on a resort island. It’s a well-known industry secret that Costco stores are hugely expensive to construct — they’re designed to resemble fantasy versions of real-life warehouses, and the appearance of thrift doesn’t come cheap. Some realms are even more specific and fanciful: Anthropologie is an enchanted attic, complete with enticing cupboards and drawers. Trader Joe’s is a crew of carefree, hippie traders shipping bulk goods across the sea. A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.

Kelley takes this a few steps further. The Shook Kelley team, which includes a cultural anthropologist with a Ph.D., begins by conducting interviews with executives, staff, and locals, looking for the storytelling hooks they call “emotional opportunities.” These can stem from core brand values, but often revolve around the most intense, place-specific feelings locals have about food. Then Kelley finds ways to place emotional opportunities inside a larger realm with an overarching narrative, helping retailers tell those stories — not with shelves of product, but through a series of affecting “scenes.”

In Alberta, Canada, Shook Kelley redesigned a small, regional grocery chain now called Freson Bros. Fresh Market. In interviews, the team discovered that meat-smoking is a beloved pastime there, so Shook Kelley built huge, in-store smokers at each new location — a scene called “Banj’s Smokehouse” — that crank out pound after pound of the province’s signature beef, as well as elk, deer, and other kinds of meat (customers can even BYO meat to be smoked in-house). Kelley also designed stylized root cellars in each produce section, a cooler, darker corner of each store that nods to the technique Albertans use to keep vegetables fresh. These elements aren’t just novel ways to taste, touch, and buy. They reference cultural set points, triggering memories and personal associations. Kelley uses these open, aisle-less spaces, which he calls “perceptual rooms,” to draw customers through an implied sequence of actions, tempting them towards a specific purchase.

Something magical happens when you engage customers this way. Behavior changes in visible, quantifiable ways. People move differently. They browse differently. And they buy differently. Rather than progressing in a linear fashion, the way a harried customer might shoot down an aisle — Kelley hates aisles, which he says encourage rushed, menial shopping — customers zig-zag, meander, revisit. These behaviors are a sign a customer is “experimenting,” engaging with curiosity and pleasure rather than just trying to complete a task. “If I was doing a case study presentation to you, I would show you exact conditions where we don’t change the product, the price, the service. We just change the environment and we’ll change the behavior,” Kelley tells me. “That always shocks retailers. They’re like ‘Holy cow.’ They don’t realize how much environment really affects behavior.”

A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.

In the mid-2000s, Nabisco approached Kelley’s firm, complaining that sales were down 16 percent in the cookie-and-cracker aisle. In response, Shook Kelley designed “Mom’s Kitchen,” which was piloted at Buehler’s, a 15-store chain in northern Ohio. Kelley took Nabisco’s products out of the center aisles entirely and installed them in a self-contained zone: a perceptual room built out to look like a nostalgic vision of suburban childhood, all wooden countertops, tile, and hanging copper pans. Shelves of Nabisco products from Ritz Crackers to Oreos lined the walls. Miniature packs of Animal Crackers waited in a large bowl; drawers opened to reveal boxes of Saltines. The finishing touch had nothing to do with Nabisco and everything to do with childhood associations: Kelley had the retailers install fridge cases filled with milk, backlit and glowing. Who wants to eat Oreos without a refreshing glass of milk to wash them down?

The store operators weren’t sold. They found it confusing and inconvenient to stock milk in two places at once. But from a sales perspective, the experiment was a smash. Sales of Nabisco products increased by as much as 32 percent, and the entire cookie-and-cracker segment experienced a halo effect, seeing double-digit jumps. Then, the unthinkable: The stores started selling out of milk. They simply couldn’t keep it on the shelves.

You’d think that the grocery stores would be thrilled, that it would have them scrambling to knock over their aisles of goods, building suites of perceptual rooms. Instead, they retreated. Nabisco’s parent company at the time, Kraft, was excited by the results and kicked the idea over to a higher-up corporate division, where it stalled. And Buehler’s, for its part, never did anything to capitalize on its success. When Nabisco took the “Mom’s Kitchen” displays down, Kelley says, the stores didn’t replace them.

Mom’s Kitchen, fully stocked. (Photo by Tim Buchman)

“We were always asking a different question: What is the problem you’re trying to solve through food?” Kelley says. “It’s not just a refueling exercise — instead, what is the social, emotional issue that food is solving for us? We started trying to work that into grocery. But we probably did it a little too early, because they weren’t afraid enough.”

Since then, Kelley has continued to build his case to unreceptive audiences of male executives with mixed success. He tells them that when customers experiment — when the process of sampling, engaging, interacting, and evaluating an array of options becomes a source of pleasure — they tend to take more time shopping. And that the more time customers spend in-store, the more they buy. In the industry, this all-important metric is called “dwell time.” Most retail experts agree that increasing dwell without increasing frustration (say, with long checkout times) will be key to the survival of brick-and-mortar retail. Estimates vary on how much dwell time increases sales; according to Davinder Jheeta, creative brand director of the British supermarket Simply Fresh, customers spent 1.3 percent more for every 1 percent increase in dwell time in 2015.

Another way to increase dwell time? Offer prepared foods. Delis, cafes, and in-store restaurants increase dwell time and facilitate pleasure while operating with much higher profit margins and recapturing some of the dining-out dollar that grocers are now losing. “I tell my clients, ‘In five years, you’re going to be in the restaurant business,’” Kelley says, “‘or you’re going to be out of business.’”

Kelley’s job, then, is to use design in ways that get customers to linger, touch, taste, scrutinize, explore. The stakes are high, but the ambitions are startlingly low. Kelley often asks clients what he calls a provocative question: Rather than trying to bring in new customers, would it solve their problems if 20 percent of customers increased their basket size by just two dollars? The answer, he says, is typically an enthusiastic yes.

Just two more dollars per trip for every fifth customer — that’s what victory looks like. And failure? That looks like a food marketplace dominated by Walmart and Amazon, a world where the neighborhood supermarket is a thing of the past.

* * *

When Shook Kelley started working on Niemann’s account, things began the way they always did: looking for emotional opportunities. But the team was stumped. Niemann’s stores were clean and expertly run. There was nothing wrong with them. Niemann’s problem was that he had no obvious problem. There was no there there.

Many of the regionals Kelley works with have no obvious emotional hook; all they know is that they’ve sold groceries for a long time and would like to keep on selling them. When he asks clients what they believe in, they show him grainy black-and-white photos of the stores their parents and grandparents ran, but they can articulate little beyond the universal goal of self-perpetuation. So part of Shook Kelley’s specialty is locating the distinguishing spark in brands that do nothing especially well, which isn’t always easy. At Buehler’s Fresh Foods, the chain where “Mom’s Kitchen” was piloted, the store’s Shook Kelley–supplied emotional theme is “Harnessing the Power of Nice.”

Still, Niemann Foods was an especially challenging case. “We were like, ‘Is there any core asset here?’” Kelley told me. “And we were like, ‘No. You really don’t have anything.’”

What Kelley noticed most was how depressed Niemann seemed, how gloomy about the fate of grocery stores in general. Nothing excited him — with one exception. Niemann runs a cattle ranch, a family operation in northeast Missouri. “Whenever he talked about cattle and feed and antibiotics and meat qualities, his physical body would change. We’re like, ‘My god. This guy loves ranching.’ He only had three hundred cattle or something, but he had a thousand pounds of interest in it.”

Niemann’s farm now has about 600 cattle, though it’s still more hobby farm than full-time gig — but it ended up being a revelation. During an early phase of the process, someone brought up “So God Made a Farmer” — a speech radio host Paul Harvey gave at the 1978 Future Farmers of America Convention that had been used in an ad for Ram trucks in the previous year’s Super Bowl. It’s a short poem that imagines the eighth day of the biblical creation, when God looks down from paradise and realizes his new world needs a caretaker. What kind of credentials is God looking for? Someone “willing to get up before dawn, milk cows, work all day in the fields, milk cows again, eat supper and then go to town and stay past midnight at a meeting of the school board.” God needs “somebody willing to sit up all night with a newborn colt. And watch it die. Then dry his eyes and say, ‘Maybe next year.’” God needs “somebody strong enough to clear trees and heave bales, yet gentle enough to yean lambs and wean pigs and tend the pink-combed pullets, who will stop his mower for an hour to splint the broken leg of a meadow lark.” In other words, God needs a farmer.

Part denim psalm, part Whitmanesque catalogue, it’s a quintessential piece of Americana — hokey and humbling like a Norman Rockwell painting, and a bit behind the times (of course, the archetypal farmer is male). And when Kelley’s team played the crackling audio over the speakers in a conference room in Quincy, Illinois, something completely unexpected happened. Something that convinced Kelley that his client’s stores had an emotional core after all, one strong enough to provide the thematic backbone for a new approach to the grocery store.

Rich Niemann, the jaded supermarket elder statesman, broke down and wept.

* * *

I have never been a fan of shopping. Spending money stresses me out; I worry too much to enjoy it. So I wanted to see if a Kelley store could really be what he said it was — a meaningful experience — or if it would just feel fake and hokey. You know, like the movies. When I asked if there was one store I could visit to see his full design principles in action, he told me to go to Harvest Market in Champaign, Illinois: the most interesting store in America.

Champaign is two hours south of O’Hare by car. Crossing the state’s vast landscape of unrelenting farmland, you appreciate the sheer scale of Illinois, how far its lower half is from Chicago. Champaign is a college town, which comes with the usual trappings — progressive politics, cafes and bars, young people lugging backpacks with their earbuds in — but you forget that fast outside the city limits. In 2016, some townships in Champaign County voted for Donald Trump over Hillary Clinton by 50 points.

I was greeted in the parking lot by Gerry Kettler, Niemann Foods’ director of consumer affairs. Vintage John Deere tractors formed a caravan outside the store. The shopping cart vestibules were adorned with images of huge combines roving across fields of commodity crops. Outside the wide-mouthed entryway, local produce waited in picket-fence crates — in-season tomatoes from Johnstonville, sweet onions from Warrensburg.

And then we stepped inside.

Everywhere, sunlight poured in through the tall glass facade, illuminating a sequence of discrete, airy, and largely aisle-less zones. Kettler bounded around the store, pointing out displays with surprised joy on his face, as if he couldn’t believe his luck. The flowers by the door came from local growers like Delight Flower Farm and Illinois Willows. “Can’t keep this shit in stock,” he said. He made me hold an enormous jackfruit to admire its heft. The produce was beautiful, he was right, with more local options than I’d ever seen in a grocery store. The Warrensburg sweet corn was eye-poppingly cheap: two bucks a dozen. There were purple broccolini and clamshells filled with squash blossoms, a delicacy so temperamental it’s rarely sold outside of farmers’ markets. Early on, staff had to explain to some teenage cashiers what they were — the cashiers had never seen squash blossoms before.

I started to sense the “realm” Harvest inhabits: a distinctly red-state brand of America, local food for fans of faith and the free market. It’s hunting gear. It’s Chevys. It’s people for whom commercial-scale pig barns bring back memories of home. Everywhere, Shook Kelley signage — a hierarchy of cues like what Kelley dreamed up for Whole Foods — drives the message home. A large, evocative sign on the far wall reads Pure Farm Flavor, buttressed by the silhouettes of livestock, so large it almost feels subliminal. Folksy slogans hang on the walls, sayings like FULL OF THE MILK OF HUMAN KINDNESS and THE CREAM ALWAYS RISES TO THE TOP.

Then there are the informational placards that point out suppliers and methods.

There are at least a half dozen varieties of small-batch honey; you can find pastured eggs for $3.69. The liquor section includes local selections, like whiskey distilled in DeKalb, and a display with cutting boards made from local wood by Niemann Foods’ HR manager. “Turns out we had some talent in our backyard,” Kettler said. Niemann’s willingness to look right under his nose, sidestepping middlemen distributors to offer reasonably priced local goods, is a hallmark of Harvest Market.

That shortened chain of custody is only possible because of Niemann and the lifetime of supply-side know-how he brings to the table. But finding ways to offer better, more affordable food has been a long-term goal of Kelley’s — one that strained his relationship with Whole Foods CEO John Mackey. As obsessed as Kelley is with appearances, he insists to me that his work must be grounded in something “real”: that grocery stores only succeed when they really try to make the world a better place through food. In his view, Whole Foods wasn’t doing enough to address its notoriously high prices — opening itself up to be undercut by cheaper competition, and missing a kind of ethical opportunity to make better food available to more people.

“When,” Kelley remembers asking, “did you start to mistake opulence for success?”

In Kelley’s telling, demand slackened so much during the Great Recession that it nearly led to Whole Foods’ downfall — a financial setback the company never fully recovered from and, one could argue, one that ultimately led to its acquisition. Harvest Market, for its part, has none of Whole Foods’ clean-label sanctimony. It takes an all-of-the-above approach: There’s local produce, but there are also Oreos and Doritos and Coca-Cola; at Thanksgiving, you can buy a pastured turkey from Triple S Farms or a 20-pound Butterball. But its strong emphasis on making local food accessible and affordable makes it an interesting counterpart to Kelley’s former client.

The most Willy Wonka–esque touch is the hulking piece of dairy processing equipment in a glass room by the cheese case. It’s a commercial-scale butter churner — the first one ever, Kettler told me, to grace the inside of a grocery store.

“So this was a Shook Kelley idea,” he said. “We said yes, without knowing how much it would cost. And the costs just kept accelerating. But we’re thrilled. People love it.”

Harvest Market isn’t just a grocery store — it’s also a federally inspected dairy plant. The store buys sweet cream from a local dairy, which it churns into house-made butter, available for purchase by the brick and used throughout Harvest’s bakery and restaurant. The butter sells out as fast as they can make it. Unlike the grocers who objected to “Mom’s Kitchen,” the staff don’t seem to mind.

As I walked through the store, I couldn’t help wondering how impressed I really was. I found Harvest to be a beautiful example of a grocery store, no doubt, and a very unusual one. But part of me wanted to encounter something more outrageous, more radical, more theatrical and bizarre. I wanted animatronic puppets. I wanted fog machines.

I should have known better — Kelley had warned me that you can’t take the theater of retail too far without breaking the dream. He’d told me that he admires stores where “you’re just not even aware of the wonder of the scene, you’re just totally engrossed in it” — stores a universe away from the overwrought, hokey feel of Disneyland. But I had Amazon’s new stores in the back of my mind as a counterpoint, with all their cashierless bells and whistles, their click-and-collect pickups, their chance to test-drive Alexa and play a song or switch on a fan. I guess, deep down, I was wondering if something this subtle really could work.

“Here, this is Rich Niemann,” Kettler said, and I found myself face-to-face with Niemann himself. We shook hands and he asked if I’d ever been to Illinois before. Many times, I told him. My wife is from Chicago, so we’ve visited the city often.

He grinned at me.

“That’s not Illinois,” he said.

We walked to Harvest’s restaurant — a 40-person seating area plus an adjacent bar with a row of stools — which offers standards like burgers, salads, and flatbreads. There’s an additional 80-person seating area on the second-floor mezzanine, a simulated living room complete with couches and board games. Beyond that, they pointed out the brand-new wine bar — open, like the rest of the space, until midnight. There’s a cooking classroom by the corporate offices. Through the window, I saw a classroom full of children doing something to vegetables. Adult cooking classes run two or three nights every week, plus special events for schools and other groups.

For a summer weekday at noon in a grocery store, I was amazed by how many people were eating and working on laptops. One guy had his machine hooked up to a full-sized monitor he’d lugged up the stairs — he’d made a customized wooden piece that hooks into Harvest’s wrought-iron support beams to create a platform for his plus-size screen. He comes every day, like it’s his office. He’s a dwell-time dream.

We sat down, and Kettler insisted I eat the corn first, slathering it with the house-made butter and eating it while it was hot. He reminded me that it’s grown by the Maddoxes, a family in Warrensburg, about 50 miles west of Champaign.

The corn was good, but I wanted to ask Niemann if the grocery industry was really that bad, and he told me it was. I assumed he’d want to talk about Amazon, its acquisition of Whole Foods, and the way e-commerce has changed the game. He acknowledged all that, but to my surprise said the biggest factor is something else entirely — a massive shift happening in the world of consumer packaged goods, or CPGs.

For years, grocery stores never had to advertise, because the largest companies in the world — Procter & Gamble, Coca-Cola, Nestlé — did their advertising for them, just the way Nabisco helped finance “Mom’s Kitchen” to benefit the stores. People came to supermarkets to buy the foods they saw on TV. But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.

When their sales flag, grocery sales do too — and the once-bulletproof alliance between food brands and supermarkets is splitting. Over the past two years, the Grocery Manufacturers Association, an influential trade group representing the biggest food companies in the world, has been losing members. It began with Campbell Soup. Dean Foods, Mars, Tyson Foods, Unilever, the Hershey Company, the Kraft Heinz Company, and others followed. That profound betrayal was a rude awakening: CPG companies don’t need grocery stores. They have Amazon. They can sell directly through their websites. They can launch their own pop-ups.

It was only then that I realized how dire the predicament of grocery stores really is, and why Niemann was so frustrated when he first called Kevin Kelley. It’s one thing when you can’t sell as cheaply and conveniently as your competitors. It’s another when no one wants what you’re selling.

Harvest doesn’t feel obviously futuristic in the way an Amazon store might. If I went there as a regular shopper and not as a journalist sniffing around for a story, I’m sure I’d find it to be a lovely and transporting way to buy food. But what’s going on behind the scenes is, frankly, unheard of.

Grocery stores have two ironclad rules. First, grocers set the prices, and farmers do what they can within those mandates. And second, everyone works with distributors who oversee the aggregation and transport of all goods. Harvest has traditional relationships with companies like Coca-Cola, but it breaks those rules with local farmers and foodmakers. Suppliers — from the locally milled wheat to the local produce to the Kilgus Farms sweet cream that goes into the churner — truck their products right to the back. By avoiding middlemen and their surcharges, Harvest is able to pay suppliers more while charging customers less. You can still find $4.29 pints of Halo Top ice cream in the freezer, but the produce section features stunning bargains. When the Maddox family pulls up with its latest shipment of corn, people sometimes start buying it off the back of the truck in the parking lot. At the same time, suppliers get to set their own prices: Niemann’s suppliers tell him what they need to charge; Niemann adds a standard margin and lets customers decide if they’re willing to pay. That’s a massive change, virtually unheard of in supermarkets.

If there’s a reason Harvest matters, it’s only partly because of the aesthetics. It’s mainly because the model of what a grocery store is has been tossed out and rebuilt. And why not? The world as Rich Niemann knows it is ending.

* * *

In 2017, just months after Harvest Market’s opening, Niemann won the Thomas K. Zaucha Entrepreneurial Excellence Award — the National Grocers Association’s top honor, given for “persistence, vision, and creative entrepreneurship.” That spring, Harvest was spotlighted in a “Store of the Month” cover feature in the influential trade magazine Progressive Grocer. Characteristically, the contributions of Kelley and his firm were not mentioned in the piece.

Niemann tells me his company is planning to open a second Harvest Market in Springfield, Illinois, about 90 minutes west of Champaign, in 2020. Without sharing specifics about profitability or sales numbers, he says the store has been everything he’d hoped it would be on the metrics that matter most — year-over-year sales growth and customer engagement. His only complaint has to do with parking. For years, Niemann has relied on the same golden ratio to determine the size of parking lot a store needs — a certain number of spots for every thousand dollars of expected sales. Harvest’s lot uses the same logic, and it’s nowhere near enough space.

“In any grocery store, the customer’s first objective is pantry fill — to take care of my needs as best I can on my budget,” Niemann says. “But we created a different atmosphere. These customers want to talk. They want to know. They want to experience. They want to taste. They’re there because it’s an adventure.”

They stay so much longer than expected that the parking lot sometimes struggles to fit all their cars at once. Unlike the Amazon stores that may soon be cropping up in a neighborhood near you — reportedly, the company is considering plans to open 3,000 of them by 2021 — it’s not about getting in and out quickly without interacting with another human being. At Harvest, you stay awhile. And that’s the point.

So far, Harvest’s success hasn’t made it any easier for Kelley, who still struggles to persuade clients to make fundamental changes. They’re still as scared as they’ve always been, clinging to the same old ideas. He tells them that, above all else, they need to develop a food philosophy — a reason why they do this in the first place, something that goes beyond mere nostalgia or the need to make money. They need to build something that means something, a store people return to not just to complete a task but because it somehow sustains them. For some, that’s too tall an order. “They go, ‘I’m not going to do that.’ I’m like, ‘Then what are you going to do?’ And they literally tell me: ‘I’m going to retire.’” It’s easier to cash out. Pass the buck, and consign the fate of the world to younger people with bolder dreams.

Does it even matter? The world existed before supermarkets, and it won’t end if they vanish. And in the ongoing story of American food, the 20th-century grocery store is no great hero. A&P — the once titanic chain, now itself defunct — was a great mechanizer, undercutting the countless smaller, local businesses that used to populate the landscape. More generally, the supermarket made it easier for Americans to distance ourselves from what we eat, shrouding food production behind a veil and letting us convince ourselves that price and convenience matter above all else. We let ourselves be satisfied with the appearance of abundance — even if great stacks of unblemished fruit contribute to waste and spoilage, even if the array of brightly colored packages is owned by the same handful of multinational corporations.

But whatever springs up to replace grocery stores will have consequences, too, and the truth is that brick-and-mortar is not going away any time soon — far from it. Instead, the most powerful retailers in the world have realized that physical spaces have advantages they want to capitalize on. It’s not just that stores in residential neighborhoods work well as distribution depots, ones that help facilitate the home delivery of packages. And it’s not just that we can’t always be home to pick up the shipments we ordered when they arrive, so stores remain useful. The world’s biggest brands are now beginning to realize what Kelley has long argued: Physical stores are a way to capture attention, to subject customers to an experience, to influence the way they feel and think. What could be more useful? And what are Amazon’s proposed cashierless stores, but an illustration of Kelley’s argument? They take a brand thesis, a set of core values — that shopping should be quick and easy and highly mechanized — and seduce us with it, letting us feel the sweep and power of that vision as we pass with our goods through the doors without paying, flushed with the thrill a thief feels.

This is where new troubles start. Only a few companies in the world will be able to compete at Amazon’s scale — the scale where building 3,000 futuristic convenience stores in three years may be a realistic proposition. Unlike in the golden age of grocery, when different family-owned chains catered to different demographics, we’ll have only a handful of players. We’ll have companies that own the whole value chain, low to high. Amazon owns the e-commerce site where you can find almost anything in the world at the cheapest price. And for when you want to feel the heft of an heirloom tomato in your hand or sample some manchego before buying, there is Whole Foods. Online retail for thrift, in-person shopping for pleasure. Except one massive company now owns them both.

If this new landscape comes to dominate, we may find there are things we miss about the past. For all its problems, the grocery industry is at least decentralized, owned by no single dominant company and carved up into more players than you could ever count. It’s run by people who often live alongside the communities they serve and share their concerns. We might miss that competition, that community. These stores are small. They are nimble. They are independently, sometimes even cooperatively, owned. They employ people. And if they are scrappy, and ingenious, and willing to change, there’s no telling what they might do. It is not impossible that they could use their assets — financial resources, industry connections, prime real estate — to find new ways to supply what we all want most: to be happier, to be healthier, to feel more connected. To be better people. To do the right thing.

I want to believe that, anyway. That stores — at least in theory — could be about something bigger, and better than mere commerce. The way Harvest seems to want to be, with some success. But I wonder if that’s just a fantasy, too: the dream that we can buy and sell our way to a better world, that it will take no more than that.

Which one is right?

I guess it depends on how you feel about the movies.

Maybe a film is just a diversion, a way to feel briefly better about our lives, the limitations and disappointments that define us, the things we cannot change. Most of us leave the theater, after all, and just go on being ourselves.

Still, maybe something else is possible. Maybe in the moment when the music swells, and our hearts beat faster, and we feel overcome by the beauty of an image — in the instant that we feel newly brave and noble, and ready to be different, braver versions of ourselves — that we are who we really are.

* * *

Joe Fassler, The Counter’s deputy editor, has covered the intersection of food, policy, technology, and culture for the magazine since 2015. His food reporting has twice been a finalist for the James Beard Foundation Award in Journalism. He’s also the editor of Light the Dark: Writers on Creativity, Inspiration, and the Creative Process (Penguin, 2017), a book based on “By Heart,” his ongoing series of literary conversations for The Atlantic.

Editor: Michelle Weber
Fact checker: Matt Giles
Copy editor: Jacob Z. Gross

No Heart, No Moon

AP Photo

Matt Jones | The Southern Review | Summer 2018 | 22 minutes (4,337 words)

 

The space race killed the sparrow.

Of course, there were other factors.

There was the decision in ’46 by the Brevard Mosquito Control District to slather the Merritt Island salt marshes in DDT dropped aerially from a No. 2 diesel–fuel carrier.

Then, because the mosquitoes grew resistant to DDT, there was the application of BHC and Dieldrin and Malathion.

Read more…

None of the President’s Men

Warner Bros.

Soraya Roberts | Longreads | April 2019 | 10 minutes (2,422 words)

INT. COFFEE SHOP – DAY

SORAYA sits down at her laptop with a cookie or some cake or that weirdly oversize banana bread. As she starts working on a column like this one, the woman next to her, working on a spreadsheet, glances at Soraya’s desktop and turns to her.

WOMAN: What do you do?

SORAYA: I’m a columnist.

WOMAN: Holy shit, that’s cool.

I starred in this scene two weeks ago, and again just this past week at a party. The women don’t have to tell me why they think it’s cool, I know why: Carrie Bradshaw. An apartment in New York, a photo on the side of a bus, Louboutins, tutus, and a column at the top of each week. Which is why I qualify it every time: “I don’t make as much as Carrie Bradshaw.” Yes, the job is cool, and it is holy-shit-worthy because so few journalists are able to actually work as journalists. But I’m freelance: I can cover my rent but can’t buy a house, I don’t get benefits, and I might be out of a job next week. Not to mention that I might not be so lucky next time. The women usually turn back to their admin after that — admin looks a lot cooler than journalism these days. But only if you’re not going by Sex and the City or basically every other journalism movie or series that has come after, all of which romanticize an industry which has a knack for playing into that.

“This is the end of an era, everything’s changing,” Gina Rodriguez tells her friends in the trailer for Someone Great, a new Netflix rom-com in which she, a music journalist, gets a job. At a magazine. In San Francisco. This is not a sci-fi movie in which the character has time traveled back to, I don’t know, 1975. It is only one recent example of the obfuscation of what journalism actually means now. There’s also the Hulu series Shrill, which presents itself as if it were current-day but is based on the life of Lindy West, who had a staff job at the Seattle alt-weekly The Stranger when you could still have a staff job and make a name for yourself with first-person essays, i.e., 2009. Special (another Netflix show) also harkens back to that time, and though it’s more overt about how exploitative online media can be — the hero is an intern with cerebral palsy who writes about his disability (which he claims is from a car accident) for clicks — the star is still hired straight out of an internship. (What’s an internship?)

Hollywood romanticizes everything, you say? Perhaps, but this is a case where the media itself seems to be actively engaging in a certain kind of deception about how bad its own situation actually is. In February, The Washington Post — no doubt still benefiting from the press off the still-gold-standard journalism movie, 1976’s All the President’s Men — ran a Super Bowl ad, narrated by Tom Hanks, that applauded the late journalists Marie Colvin and Jamal Khashoggi, who, in the ad’s words, brought the story “no matter the cost.” The spot highlighted what we already know: that we need journalism to have a functioning democracy, and that many journalists risk their lives to guarantee it. What it kept in darkness (ha), however, was that to do their job properly, those journalists need protection and they need resources — provided by their editors and their publishers. Hanks, of course, starred in The Post, Steven Spielberg’s 2017 film based on the journalists who reported on the Pentagon Papers in 1971. The ad was using the past to promote the future rather than dealing with a present in which more than 2,400 people lost media jobs in the first three months of the year and journalists are trying to unionize en masse. But that’s not particularly telegenic, is it?

* * *

The romanticized idea of the journalist — dogged, trenchcoated — really took off at the movies. In 1928, ex-reporters Ben Hecht and Charles MacArthur wrote the play The Front Page, adapted in 1931 into a screwball that became the journalism movie prototype, with fast dialogue and faster morals. My favorite part is that not only is the star reporter trying to quit the paper (in this economy?), but his editor will do anything — including harboring an accused murderer — to keep him on staff. Matt Ehrlich, coauthor of Heroes and Scoundrels: The Image of the Journalist in Popular Culture, once told me for Maclean’s that The Front Page came out of the “love-hate relationship” the writers had with the industry even back then. “The reporters are absolute sleazebags, they do horrible things,” he said. “At the same time The Front Page makes journalism seem very exciting, and they do get the big scoop.” Ehrlich also told me that some initially thought All the President’s Men, which eventually became the prototype of the journalism movie, was reminiscent of the earlier era of the genre. In case you are not a journalist and so haven’t seen it: Robert Redford and Dustin Hoffman starred as Bob Woodward and Carl Bernstein, The Washington Post reporters whose stories on the Watergate burglary and subsequent cover-up helped lead to President Nixon’s resignation. While the film also played fast and loose with the truth, it had a veneer of rumpled, repetitious reality — not to mention a strong moral core that made taking down the president with a typewriter seem, if implausible, at least not impossible.

In February, Education Week reported that a survey of 500 high school journalism teachers across 45 states found that, in the past two years, 44 percent of them had seen a rise in journalism enrollment, along with a 30 percent increase in interest in studying journalism in higher education. “This is this generation’s Watergate,” the executive director of the National Scholastic Press Association said. “With President Trump, everyone is really in tune to the importance of a free press.” Sure. But this isn’t 1976. No doubt there are scores of WoodSteins out there, but not only do a number of journalists no longer have the resources or the time to follow stories of any kind, they rarely have the salaried staff positions to finance them, nor the editors and publishers to support them in doing the job they were hired to do. In All the President’s Men, executive editor Ben Bradlee asks WoodStein if they trust their source before muttering, “I can’t do the reporting for my reporters, which means I have to trust them. And I hate trusting anybody.” Then he tells them to “run that baby.” These days there is little trust in anything beyond the bottom line.

The myth is that All the President’s Men led to a surge of interest in journalism as a career. But in reality it was women, increasingly educated post-liberation, whose interest explained the surge. (My editor is asking: “Is it an accident that shitting on journalism as a worthy profession coincided with women moving into journalism?” My reply is: “I think not.”) Still, women remain underrepresented in the field to this day, a fact reflected by the paucity of movies about the work of female journalists. While there were scores of ’70s and ’80s thrillers built around male reporters with too much hair taking down the man, for the women … there was The China Syndrome, with Jane Fonda as a television reporter named Kimberly covering a nuclear power plant conspiracy. And, um, Absence of Malice? Sally Field is a newspaper reporter who sleeps with her subject (I mean, it is Paul Newman). I guess I could include Broadcast News, which stars Holly Hunter as a neurotic-but-formidable producer and personified the pull between delivering the news and delivering ratings (the analog version of clicks). But Network did that first and more memorably, with its suicidal anchorman lamenting the demise of media that matters. “I’m a human being, GODDAMN IT!!!” he shouts into the void. “My life has value!!!” You don’t hear female journalists saying that on-screen, though you do hear them saying “I do” a whole lot.

The quintessential journalism film and the quintessential rom-com are in fact connected. Nora Ephron, who was briefly married to Carl Bernstein, actually cowrote an early script for All the President’s Men. While it was chucked in favor of William Goldman’s, she went on to write When Harry Met Sally, and I’ll forgive you for not remembering that Sally was a journalist. She probably only mentions it twice because this was 1989, an era in which you decided to be a journalist and then you became one — the end. The movie treats reporting like it’s so stable it’s not even worth mentioning, like being a bureaucrat. Sally could afford a nice apartment, she had plenty of time to hang out with Harry, so what was there to gripe about (Good Girls Revolt would suggest Ephron’s trajectory was less smooth, but that’s another story)? Four years later, in Sleepless in Seattle, Meg Ryan is another journalist in another Ephron movie, equally comfortable, so comfortable in fact that her editor pays her to fly across the country to stalk Tom Hanks. This newspaper editor literally assigns a reporter to take a plane to Seattle from Chicago to “look into” a possible lifestyle story about a single white guy. (Am I doing something wrong?!?!)

Journalism and rom-coms were fused from almost the start, around the ’30s and ’40s. The Front Page went from being a journalism movie to being a rom-com when it turned its hero into a heroine for His Girl Friday. The reporter repartee and the secretive nature of the job appeared to lend themselves well to Hays-era screwballs, though they also indelibly imprinted a lack of seriousness onto their on-screen female journalists. After a brief moment in the 1970s when The Mary Tyler Moore Show embodied the viability of a woman journalist who puts work first, the post-Ephron rom-coms of the 2000s were basically glossy romances in “offices” that were really showrooms for a pink-frosted fantasy girl-reporter gig no doubt thought up by male executives who almost certainly saw All the President’s Men and almost certainly decided a woman couldn’t do that and who cares anyway because the real story is how you’re going to get Matthew McConaughey to pop the question. I can’t with the number of women who recently announced that 13 Going on 30 — the movie in which Jennifer Garner plays a literal child successfully running a fashion magazine — made them want to be journalists. But the real death knell of the aughts journo-rom-com, according to rom-com columnist Caroline Siede, came in 2003 with How to Lose a Guy in 10 Days. In that caper, Kate Hudson has a job as a columnist despite thinking it is completely rational to write a piece called “How to Bring Peace to Tajikistan” for her Cosmo-type fashion magazine.

* * *

In 2016, the Oscar for Best Picture went to Spotlight, which follows The Boston Globe’s titular investigative team — three men, one woman — as it uncovers the Catholic Church abuse scandal. The film earned comparisons to All the President’s Men for its focus on journalistic drudgery, but it also illustrated the growing precariousness of the newsroom with the arrival of the web. In one scene, executive editor Marty Baron expresses shock when he is told it takes a couple of months for the team to settle on a story and then a year or more to investigate it. At the same time, Baron and two other editors are heavily involved with and supportive of the three reporters, who went on to win the Pulitzer in 2003 and remained on the team for years after. Released only 12 years after the fact, the film suggested that journalists who win Pulitzers have some kind of security, which, you know, makes sense, and is maybe true at The Boston Globe. But two years after Spotlight came out, David Wood, who had won HuffPost its only Pulitzer, was laid off. As one of BuzzFeed’s reporters told The Columbia Journalism Review after BuzzFeed shed 15 percent of its staff, “It’s this sense that your job security isn’t tied to the quality of your work.”

“We have so much to learn from these early media companies and in many ways it feels like we’re at the start of another formative era of media history where iconic companies will emerge and thrive for many decades,” BuzzFeed founder and CEO Jonah Peretti blew hard in a memo in 2014, referring to traditional outfits like Time and The New York Times. But both those publications have unions, which Peretti has made clear he doesn’t think “is right” for his company. “A lot of the best new-economy companies are environments where there’s an alliance between managers and employees,” he said in 2015. “People have shared goals.” In this case the shared goals seem to be that Peretti profits (his company was valued at more than $1 billion in 2016) while his staff is disposable.

Which brings us back to the Globe in 2019. That is to say the real one, not the romanticized one. This version of the Globe hires a Gonzo-esque leftist political writer named Luke O’Neil as a freelancer and publishes his “controversial” op-ed about the Secretary of Homeland Security’s resignation titled “Keep Kirstjen Nielsen unemployed and eating Grubhub over her kitchen sink.” “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon,” it opened, and it concluded with, “As for the waiters out there, I’m not saying you should tamper with anyone’s food, as that could get you into trouble. You might lose your serving job. But you’d be serving America. And you won’t have any regrets years later.” The article was gone by Friday, pulled upon the request of the paper’s owners (O’Neil sent me the original). According to WGBH, a now-deleted note on the opinion page stated that the article “did not receive sufficient editorial oversight and did not meet Globe standards. The Globe regrets its lack of vigilance on the matter. O’Neil is not on staff.” And, oh, man, that last line. It says everything there is to say about modern journalism that is unspoken not only on-screen but by the culture at large and the media in it. It says you serve us but we provide no security, no benefits, no loyalty. It says, unlike Spotlight or All the President’s Men or even The Front Page, we do not have your back. Because if they did, you better believe it would have a good chance of ending up on-screen.

* * *

Soraya Roberts is a culture columnist at Longreads.

MFA vs. NYC: A Reading List

42nd Street with the Chrysler Building during Manhattanhenge in 2018, captured in Manhattan, NYC. (Getty Images)

Near the end of my MFA, someone asked what my plans were after graduation. Before allowing me to answer, he said, somewhat wistfully, that he thought I should move to New York City and “live a little” before writing anything else. In the moment, I probably nodded politely and smiled, as I’m prone to doing, but his suggestion frustrated me. How, after living for two years on a barely sufficient stipend, did he expect that I’d be able — or want — to fling myself across the country to a city with exorbitant rent prices where I had no job, no insurance, and no community? And what did he mean by living? Had I not been living during the two years of my MFA, during which I moved to an unfamiliar-to-me city, taught classes at the university for the first time, learned to edit a journal, found my way into a community of writers, and struggled in draft after draft to improve my own prose?

Instead of moving to New York City, I did what might be considered the opposite; I started a PhD in creative writing in the middle of Oklahoma, which I’m finishing up now. During my years here, I’ve certainly grown as a writer and a teacher, and had the opportunity to build lasting relationships with people who have supported me in innumerable ways. But I have also remained aware of the problems within academia: the food pantry for graduate students in the room across from my office, for example, or the lack of diversity within my program and many others, or a job market that dwindles every year. Sometimes I think back to that person telling me to move to NYC, and I wonder who I might be now — as a writer, as a person, as a professional — had I “lived life” rather than pursuing another degree. I’ve probably thought about his offhand comment more than I should, but it also seems to encapsulate some of the larger conversations about the function of MFA and PhD creative writing programs and the various pros and cons of making a life as a writer within or outside of academia.

More interesting to me than prescribing one way of life over another, however, is to examine the challenges and sources of nourishment in each, and to wonder about the possibilities that exist beyond a reductive dichotomy. The essays curated in this reading list illuminate problems that exist within MFA and PhD creative writing programs, explore the idea of mentorship both within and outside of the academy, and offer insight on how to live a fruitful writing life without the support and constraints of a formal program.

1. MFA vs. NYC (Chad Harbach, November 26, 2010, Slate)

In this viral essay, Chad Harbach theorizes about how MFA programs are influencing the craft and professional development of fiction writers, as well as the landscape of publishing.

It’s time to do away with this distinction between the MFAs and the non-MFAs, the unfree and the free, the caged and the wild. Once we do, perhaps we can venture a new, less normative distinction, based not on the writer’s educational background but on the system within which she earns (or aspires to earn) her living: MFA or NYC.

Related read: Which Creates Better Writers: An MFA Program or New York City? (Leslie Jamison, February 27, 2014, The New Republic) and “MFA vs NYC”: Both, Probably (Andrew Martin, March 28, 2014, The New Yorker)

2. Going Hungry at The Most Prestigious MFA in America (Katie Prout, Lit Hub)

The idea of writers living without substantial income is one that’s sometimes romanticized, as Katie Prout notes while listening to an audiobook of A Moveable Feast, in which Hemingway says that “he and Pound agreed that the best way to be a writer is to live poorly.” One month away from turning 30, Prout writes about the realities — which include food banks and multiple jobs — of living with very little money while pursuing her MFA at Iowa.

I’m an instructor at the university where I attend the best nonfiction writing program in the country, and I make approximately $18,000 a year before taxes. When I was denied a second teaching assistantship at the university this summer for the upcoming school year even though I already had signed a contract with the offering department, my director explained that it was in the school’s best interests to look after my best interests, and my best interest was to make sure that I had time [to] write my thesis.

3. Every Day is a Writing Day, With or Without an MFA (Emily O’Neill, November 27, 2018, Catapult)

The requirement to relocate and the insufficiency of fully funded spots are just two of many reasons why MFA degrees are not possible for many people, as Emily O’Neill explains in this essay about how she nurtures a writing life outside of the academy.

I don’t have an MFA. It often makes me feel like the man on that mortifying date to admit this to writers I don’t know well. So many people who write are academics or at least aspiring to an MFA or PhD, and mentioning I don’t feel specifically drawn to the demands of graduate school is often seen as a sin against literature.

4. Woman of Color in Wide Open Spaces (Minda Honey, March 2017, Longreads)

After two years, Minda Honey longs to escape from the whiteness of her MFA program, and plans a trip to four national parks, not realizing that “80% of National Parks visitors and employees are white.” Weaving together moments from her travels and memories from her writing program, Honey lays bare the lack of diversity in both spaces.

When I’d first started my MFA program, I thought it would be an escape from the oppressive whiteness of Corporate America. I thought without suits to button my body into, I would be free to exist. But Academia proved to be just as oppressive.

5. How Applying to Grad School Becomes a Display of Trauma for People of Color (Deena ElGenaidi, April 17, 2018, Electric Lit)

When consulting with people about how to apply to PhD programs, Deena ElGenaidi’s advisor tells her to play up her minority status in her personal statement. ElGenaidi explores the problematic and pervasive nature of this advice, while also discussing what it means that minority students and people of color are encouraged to use their trauma in order to be admitted into academic programs.

The experience taught me that society, white America specifically, regularly asks minorities and people of color to tokenize and exploit themselves, talking about their cultural backgrounds in a marketable way in order to gain acceptance into programs and institutions we are otherwise barred from.

6. The Mentor Series: Allie Rowbottom and Maggie Nelson (Allie Rowbottom, ed. Monet Patrice Thomas, March 25, 2019, The Rumpus)

How do writers balance the challenge of seeking publication in a difficult, fast-paced market while nurturing their craft? And what role do mentors play in a writer’s development? In the inaugural installment of “The Mentor Series,” a series of interviews between mentors and students curated by Monet Patrice Thomas, Allie Rowbottom and Maggie Nelson ruminate on these questions and more.

Allie Rowbottom: I remember once, after I finished my MFA thesis, you advised I take my time and sit on the project. You said something about not publishing too young, or rushing out of the gate, and I’ve thought about that a lot now that I have published—one of my biggest challenges (or strengths?) as a writer is that I push myself. Now that my first book is out in the world, I feel an urgency to produce more, at the same time I worry that rushing never makes for solid work.

* * *

Jacqueline Alnes is working on a memoir about running and neurological illness. You can find her on Instagram and Twitter @jacquelinealnes.

United States of Conspiracy: An Interview with Anna Merlan

Mike Rosiana / Getty

Rebecca McCarthy | Longreads | April 2019 | 17 minutes (4,461 words)

 

On March 13, 2019, a 24-year-old construction worker named Anthony Comello drove to Staten Island and backed his pickup into a Cadillac owned by the head of the Gambino crime family, Frank Cali. When Cali came to the door, Comello shot him. Comello was arrested a few days later in Brick, New Jersey, and upon his appearance in court, it became clear that he was a believer in the confusing and ever-shifting conspiracy theory QAnon — whose adherents believe President Trump is locked in a mortal battle with a “deep state,” which they contend is running child sex trafficking rings (among other things). A photo from the arraignment shows that Comello had written the letter “Q” on his hand, along with “MAGA FOREVER” and “United We Stand.”

A mob boss, a Cadillac, a murder, a town called Brick, New Jersey — all of those things make sense when itemized and grouped together. In 2019 it’s not even that surprising that a member of QAnon was involved. But, barring new information, what is surprising is the simplicity of the actual motive — Comello wanted to date Cali’s niece and Cali disapproved.

“Life is so much more random than we would like it to be,” Anna Merlan told me over the phone, when we were talking about Cali’s murder. “Everything is so much weirder and less meaningful than we would like it to be and I constantly see people that I talk to grappling with that idea — that maybe there isn’t a grand narrative under the surface animating everything.” Read more…

Edible Complex

Getty, Alberto E. Tamargo / AP, Photo illustration by Katie Kosma

Jen Doll | Longreads | April 2019 | 18 minutes (4,598 words)

According to those jaded but constant belief systems that keep the worst romantic comedies in business, the third date is the make-or-break one. In these busy times, the idea goes, by date three you’ve spent enough time together to determine if either of you is a serial killer, is hiding something very bad in your closet (metaphorical or otherwise), or has the tendency to type “hehehe” when laughing by text. And if the relationship by date three veers toward make rather than break, well, finally the “rules” have lifted: It is THE MOMENT to get naked (not at the restaurant, please). The thinking is based in some combination of propriety and sexual policing and also sheer time management: You haven’t put so much energy or effort into this budding romance that uncovering an in-the-sheets incompatibility ruins your entire life — but it’s also not so soon it’s considered “rushing in,” which, when applied to women, of course, means “being too slutty.”

No matter that “slutty” is an outmoded, sexist concept and that you should sleep with a person if and when you feel like it (and if and when they consent), I grew up with “the third date’s the sex date!” pressed upon me as, if not law, then at least a kind of informed ideology: Do it then to uncover any latent micropenises or irrecoverable technique problems; do it then to get it over with because would you look at that elephant in the room?; do it then to get the rest of your relationship started; do it then because by the third date, what else is there to do?

So, when it came time for the third date with a man I’d been seeing — a guy who lived in upstate New York, which meant our third date would be more of a weekend visit; did each night count as a date, I wondered, or was it the whole package, a kind of Club Med situation with dinners and entertainment included? — there was a certain amount of buried internal stress and anticipation related to the event. Not that I was going to go get a Brazilian, or anything. I was in my 40s. Those days of paying a stranger to rip large swathes of hair from my nether regions had blessedly gone by the wayside. (Yes, I said “nether regions.”) But in my brain, a place far more difficult for strangers to reach, my thoughts were going a little bit wild. I’d been dumped earlier in the year, I’d gotten back up and shaken myself off, I’d tried again, and I’d actually met someone. But how many rounds of the dating game was I prepared to endure? If things went in the direction of “break” — what next, not only for me and this guy, but maybe for me and anyone? This is what rom-coms never really tackle: What happens when you get so tired of dating, so disappointed by all the prospects, you just give up?

In the absence of answers, I sought to occupy myself. I took a train to Beacon, New York, a town about an hour away from where my date lived — he’d pick me up there the next day, and our third date would begin — and met some friends I was just getting to know. We watched a poet read from her impressive collection in a garden, surrounded by trees and flowers and sunshine. I wasn’t even so sure how I felt about poetry readings, but I liked this version of me, trying new things, with different people. I bought several of the poet’s books, and had her sign one, even though I’d not known much of her work until that moment.
Read more…

For the Thirsty Girl

Getty

Soraya Roberts | Longreads | April 2019 | 9 minutes (2,387 words)

“She’s got the nerve to say / She wants to fuck that boy so badly.” These are the lyrics to the titular track from Third Eye Blind’s 2003 album Out of the Vein (stay with me). They are written by Stephan Jenkins, who has admitted his three-year relationship with Charlize Theron acted as inspiration. Whether or not that particular song is about her, one thing is clear: Charlize Theron knows she wants to fuck a specific boy, even if she is uncertain who that boy is. “I’ve been single for ten years, it’s not a long shot,” she said recently in some interview, dorkily referencing the title of her new film, which is about a presidential hopeful who falls for Seth Rogen (why not?). “Somebody just needs to grow a pair and step up.”

Charlize Theron is thirsty. That surprises people. And by people, I mean me. How is it possible that Charlize Theron has to desire at all, considering she is so desired herself? (Doesn’t one negate the other?) You could sense an army of unworthy men clutching their collective pearls in response to her statement. That this statuesque blonde with the kind of face you only see carved out of marble not only has to, God forbid, ask for it, but that she can speak like a sailor about it, shatters the pristine image of beauty — no wants, no desires — she otherwise projects. Theron’s words jolted us back to her humanity. The balls she asked for were the balls to approach her with desire, knowing that she has the power not to desire in return. Charlize Theron is dictating the expression of her thirst, but also the man who is worthy of it.

If the original iteration of “thirst” was a plunging desperation, this one is an uplifting affirmation. NPR traced its root, “thirst trap,” back to 2011; but Jezebel actually defined the singular “thirst” first in 2014, as lust “for sex, for fame, for approval. It’s unseemly striving for an unrealistic goal, or an unnecessary amount of praise.” This was the definition picked up in 2017 by The New York Times Magazine, imbuing thirst with negativity. But in the intervening years, women got a hold of it. These women, objects for so long within an atmosphere of men’s ambient lust, emerged to twist thirst from a cloying wish into full-bodied desire. Out of the wreckage of male toxicity, they used thirst to mark the men who remained worthy. There’s a reason Theron is still single — few men can step up. What’s more, in a world run by female desire, some are terrified of being left unwanted if they do.

* * *

It’s hard to get a clear picture of female desire across a history mostly seen through the male gaze, afflicted as it was with the rare myopia that focuses only on the virgin and the whore. So you had virtuous, prim, usually classier, orderly women who were worth marrying, and sinful, messy, gutter-dwelling hysterics who were worth a quick screw, and that’s it. If a woman expressed desire and wasn’t faking it for money, she was a deranged man-eater, like a witch or a harpy. Men’s lust was natural; women’s was the most unnatural. Eventually, fandom offered a means of escape. “While it was risky for individual women to lose control or to surrender to passion, there could be safety in numbers,” wrote Carol Dyhouse in Heartthrobs: A History of Women and Desire. So women swooned all over the place for Franz Liszt in the mid-19th century before having a collective orgasm over Vaslav Nijinsky, then Rudolph Valentino — the first man (the first person) for whom the word “sexy” was deemed worthy of use. What these men had in common was fluidity — of gender, of sexuality, of race. “I hate [him],” cartoonist Dick Dorgan wrote of Valentino. “The women are all dizzy over him.” Real men hated this new masculine ideal because real women wanted it and they couldn’t deliver. So they took sexy back. The Hays Code put women who wanted sex in movie jail and in their place installed women with whom men wanted to have sex.

The new “sexy” icon became Marilyn Monroe, described by Molly Haskell (From Reverence to Rape: The Treatment of Women in the Movies) as “the lie that a woman has no sexual needs, that she is there to cater to, or enhance, a man’s needs.” It is a meandering but fairly unbroken line from Monroe to reality star and one-time child bride Courtney Stodden, who has not only physically fashioned herself into her idol, but also appears just as troubled. In a recent interview with BuzzFeed, the now 24-year-old pitied her boyfriend for not cashing in on his expectations. “He thought he was going to get in a relationship with this hot young celebrity who’s all sexual and fun,” she said. “He gets in there and I don’t have sex, I’m a mess, and I’m crazy.” So, not really much change from the original dichotomy, the one which limits big-busted babes like her, like Kim Kardashian-West, to conduits for sex. The latter can launch her career off a sex tape, while Jennifer Lawrence, the slapstick virginal non-bottle blonde, can almost be undone by a couple of photos. And forget being a woman who has sex with more than one man; Kristen Stewart had to apologize publicly for that, forced to do a glorified perp walk in a world where husbands have had mistresses longer than Edward Cullen has been undead.

Almost every article I read about female sexuality cited Freud — specifically his inability to figure out what women want. It says a lot that on this subject we are still deferring to a psychoanalyst who predates women’s liberation. It served men like Freud and those who followed him to theorize that women had a lower sex drive (unproven and kind of the opposite), were more romantic than randy (unproven and kind of the opposite), because it meant women could not use men for sex the way men used women. Yet, as Psychology Today reported back in 2013, “If women believe that they will not be harmed and that the sex will be good, their willingness to engage in casual sex equals that of men.” Relax, bros, rape culture keeps that in check. “It is anti-sex and anti-pleasure,” writes Laurie Penny. “It teaches us to deny our own desire as an adaptive strategy for surviving a sexist world.” And now you can stop relaxing; since women have begun dismantling that world, they have also begun releasing their desire — these days better known as thirst.

Some men think the objectification of women has simply turned into women’s objectification of men, but that’s not what thirst is: Where the male gaze limits women to the flesh, the female gaze fleshes men out. Famous guys provide an aspirational model, with women filling in the holes with their wants, showing real guys how to enhance themselves to satisfy women like Charlize.

We have women of color to thank for pushing men to meet us halfway. Their brand of lady thirst went mainstream in 2017, the year ELLE announced “the Golden Age of Thirst Journalism,” and BuzzFeed got celebrities to read “thirst tweets” — their fans’ horny messages — and launched the “Thirst Aid Kit” podcast. That show centered on the famous crushes of hosts Bim Adewunmi and Nichole Perkins, from established hunks like Chris Evans to pensive actors of color like John Cho. “We are two straight black women talking about lust and desire and sexuality,” Adewunmi told Salon last year, “and all these expressions of humanity [are] not something that has traditionally been given to black women.” In their wake, black Canadian writer Kyrell Grant quietly articulated the concept of “big dick energy” (in reference to recently deceased chef Anthony Bourdain). “It’s a phrase I’d used with friends to refer to guys who aren’t that great but for whatever reason you still find attractive,” she wrote in The Guardian. But while black women are stereotyped for being game, they aren’t expected to set the rules. The Cut sought to profit off the term without crediting Grant, effectively muting her, though it was writer Hunter Harris whose desire was more directly silenced.

Vulture’s resident thirst critic — “i have something adam can drive” — was suspended by Twitter last week amid protests by fellow writers. “JUSTICE FOR HUNTER HARRIS, a thirst maestro and one of the funniest people on this hellsite,” Alanna Bennett tweeted. I DM’d Harris for the details of her suspension and she told me that a photographer had issued a copyright complaint about an image she used last summer in a tweet on the “secret romance” between Rihanna and Leonardo DiCaprio (she can’t remember the exact words and, because Twitter removed it, she can’t check). Around the same time that this happened, Quinn Hough, the editor of a tiny online film and music publication, Vague Visages, went viral (in a bad way) after pulling a strong anti-thirst stance on Twitter. The tweet in question has since been deleted, but Hough told me via email that he’d written “a poorly worded thread after seeing tweets from young critics that I thought were excessive and wouldn’t necessarily be acceptable in a professional environment.”

With women being the ones who thirst tweet most visibly, Hough’s comments were interpreted as an attempt to police women’s desire. “I just get very angry at any kind of sex-shaming because I’ve been told my whole life that if I express sexual desire, I’m a slut or dirty,” Danielle Ryan tweeted in response. “It really comes across differently to women.” While Hough’s site may be small, he still acts as a gatekeeper in the world of criticism, a conduit to larger, more established outlets. His discrimination against what appeared to be young female writers was a microcosm of a wider systemic double standard, particularly when he claimed, “Critics can say anything they want, but expressing sexual desire for subjects will minimize their chances for a staff position somewhere.”

This is where Hunter Harris resurfaces. The coincidence of her suspension with the Vague Visages pile-on acted as a trigger for women accustomed to being muted, turning a copyright notice into a symbol of the suppression of black women’s desire. Meanwhile, other Twitter users expressed their delight at Harris’s expulsion. “It’s sad that @vulture encouraged her psychosis, but will probably be looking to dump her, now that @hunteryharris got her twitter account suspended,” wrote one guy who goes by Street Poetics (“PhD in These Streets”). A man he referenced in that same tweet, Jurg Bajiour, responded, “It’s true. @hunteryharris seemed to want to show me that it was *her job* to endlessly horny-tweet about actors.” (Harris denies this.)

The missives were rich considering male film critics readily maintain staff positions despite waving around their boners in their actual reviews. “I didn’t miss Lynda Carter’s buxom, apple-cheeked pinup,” New York’s David Edelstein wrote in his Wonder Woman review. You may remember him also writing of Harry Potter, “prepubescent Watson is absurdly alluring,” in a review that originally appeared in Slate in 2001 and resurfaced after his Wonder Woman hard-on. Compare this to famously thirsty film critic Pauline Kael, whose books boast titles like I Lost It at the Movies and Kiss Kiss Bang Bang: “There is a thick, raw sensuality that some adolescents have which seems almost preconscious. In Saturday Night Fever, John Travolta has this rawness to such a degree that he seems naturally exaggerated.” There is a lot of sex here, but Kael is not the subject, Travolta not the object, and it layers rather than reduces. In fact, Female Film Critics’ Twitter poll on critical thirst — “What do you think of ‘thirst’ in film criticism?” — which followed the Vague Visages controversy, attracted 468 votes with a runaway 44 percent responding, “A grand tradition (Kael!)” Still, Hunter Harris admits she felt odd being erroneously credited as its icon. “i dont want to be like a martyr for the horny cause lmao,” she told me via DM, “but it is very nice that ppl are defensive of woc being openly desirous !”

* * *

While thirst is most common in the field of Hollywood celebrity — ground zero for idolatry — it has recently moved into politics, a place where masculinity has increasingly become a bone of contention. At one time we thirsted for Justin Trudeau’s “because it’s 2015” yoga moves; more recently that thirst turned toward an emo crossdressing Beto. “Ojeda and Avenatti as candidates are like the guy who thinks good sex is pumping away while you’re making a grocery list in your head wondering when he’ll be done,” political analyst Leah McElrath tweeted in November 2018. “O’Rourke is like the guy who is all sweet and nerdy but holds you down and makes you cum until your calves cramp.” While politicians have an extensive history of abusing their positions for their own sexual gratification, this explicit dispatch from the beltway still left a number of us open-mouthed. Yet this is where we are — in the context of a presidency rife with toxic masculinity oft expressed in terms of sexual harassment, good sex acts as an analogy for progressive politics.

Over the past couple of years, women have also elected Noah Centineo, Benedict Cumberbatch, Jeff Goldblum, and Mahershala Ali as worthy of their thirst. Like the men who have historically inflamed female desire, they represent an aspirational form of masculinity, one which counteracts the retrograde misogyny trumpeted by the president. The thirst women express for these men’s physical form is informed by the men’s insides as much as their outsides. And the strongest men do not shrink at the prospect of not measuring up, but adapt the way women always have. In this new world, on the red carpet for their shared movie, Long Shot, Charlize Theron’s Alexander McQueen gown is matched by Seth Rogen’s Prada suit. “I was highly aware I was going to be standing next to Charlize for a lot of pictures,” Rogen said at the time. “I always have that image in my head of Beyoncé next to Ed Sheeran in a T-shirt, and I don’t want that.” Finally, it’s no longer about what a guy wants.

* * *

Soraya Roberts is a culture columnist at Longreads.

Family Animals

The Philippine Constabulary Band at the 1904 World’s Fair. Grace’s great-grandfather, Pedro Navarro, stands in the front row, second from the right, holding a piccolo. Photo courtesy of the Missouri Historical Society, St. Louis / Restless Books

Grace Talusan | an excerpt from The Body Papers | Restless Books | April 2019 | 16 minutes (4,046 words)

 

“Did I ever tell you about the dog I had in the Philippines?” my father asked me when I was younger.

As a boy, my father lived in Tondo, the most densely populated area of Manila, infamous for its slums and high crime rates. Before it burned down, his family lived in a house above their sari-sari store, where they sold prepared foods, snacks, soda, and other convenience items. You could buy single sticks of cigarettes and gum, a dose of aspirin, or a packet of shampoo good for one wash. When he shared stories about his childhood, my American sensibilities were always shocked.

One day, a street dog followed him home and joined the other dogs already living in his family’s yard. The dogs didn’t have names; they were all called aso, dog. “Our dogs were not for petting,” my father explained. “They were low-tech burglar alarms and garbage disposals.”

But this dog was special. Totoy named his dog “Lucky,” after Lucky Strike cigarettes. This detail still astounds me: At eight years old, my father had a favorite brand of cigarettes.


The American Worth Ethic

Getty / Photo Illustration by Longreads

Bryce Covert | Longreads | April 2019 | 13 minutes (3,374 words)

“The American work ethic, the motivation that drives Americans to work longer hours each week and more weeks each year than any of our economic peers, is a long-standing contributor to America’s success.” Thus reads the first sentence of a massive report the Trump administration released in July 2018. Americans’ drive to work ever harder, longer, and faster is at the heart of the American Dream: the idea, which has become more mythology than reality in a country with yawning income inequality and stagnating upward economic mobility, that if an American works hard enough she can attain her every desire. And we really try: We put in between 30 and 90 minutes more each day than the typical European. We work 400 hours more annually than the high-output Germans and clock more office time than even the work-obsessed Japanese.

The story of individual hard work is embedded into the very founding of our country, from the supposedly self-made, entrepreneurial Founding Fathers to the pioneers who plotted the United States’ western expansion; little do we acknowledge that the riches of this country were built on the backs of African slaves, many owned by the Founding Fathers themselves, whose descendants live under oppressive policies that continue to leave them with lower incomes, less wealth, and greater poverty. We — the “we” who write the history books — would rather tell ourselves that the people who shaped our country did it through their own hard work and not by standing on the shoulders, or stepping on the necks, of others. It’s an easier story to live with. It’s one where the people with power and money have it because they deserve it, not because they took it, and where we each have an equal shot at doing the same.

Because for all our national pride in our puritanical work ethic, the ethic doesn’t apply evenly. At the highest income levels, wealthy Americans are making money passively, through investments and inheritances, and doing little of what most would consider “work.” Basic subsistence may soon be predicated on whether and how much a poor person works, while the rich count on tax credits and carve-outs designed to protect stockpiles of wealth created by money begetting itself. It’s the poor who are expected to work the hardest to prove that they are worthy of Americanness, or a helping hand, or humanity. At the same time, we idolize and imitate the rich. If you’re rich, you must have worked hard. You must be someone to emulate. Maybe you should even be president.

* * *

Trump has a long history of antipathy to the poor, a word which he uses as a synonym for “welfare,” which he understands only as a pejorative. When he and his father were sued by the Department of Justice in 1973 for discriminating against black tenants in their real estate business, he shot back that he was being forced to rent to “welfare recipients.” Nearly 40 years later, he called President Obama “our Welfare & Food Stamp President,” saying he “doesn’t believe in work.” He wrote in his 2011 book Time To Get Tough, “There’s nothing ‘compassionate’ about allowing welfare dependency to be passed from generation to generation.”

Perhaps. But Trump certainly knows about relying on things passed from generation to generation. His self-styled origin story is that he got his start with a “small” $1 million loan from his real estate tycoon father, Fred C. Trump, which he used to grow his own empire. “I built what I built myself,” he has claimed. “I did it by working long hours, and working hard and working smart.”

It’s an interesting interpretation of “myself”: A New York Times investigation in October reported that, instead, Trump has received at least $413 million from his father’s businesses over the course of his life. “By age 3, Mr. Trump was earning $200,000 a year in today’s dollars from his father’s empire. He was a millionaire by age 8. By the time he was 17, his father had given him part ownership of a 52-unit apartment building,” reporters David Barstow, Susanne Craig, and Russ Buettner wrote. “Soon after Mr. Trump graduated from college, he was receiving the equivalent of $1 million a year from his father. The money increased with the years, to more than $5 million annually in his 40s and 50s.” The Times found 295 different streams of revenue Fred created to enrich his son — loans that weren’t repaid, three trust funds, shares in partnerships, lump-sum gifts — much of it further inflated by reducing how much went to the government. Donald and his siblings helped their parents dodge taxes on gifts and inheritances through sham corporations, improper deductions, and undervalued assets.


Even the money that was made squarely owed a debt to the government. Fred Trump nimbly rode the rising wave of federal spending on housing that began with the New Deal and continued with the G.I. Bill. “Fred Trump would become a millionaire many times over by making himself one of the nation’s largest recipients of cheap government-backed building loans,” the Times reported. Donald carried on this tradition of milking government subsidies to accumulate fortunes. He obtained at least $885 million in perfectly legal grants, subsidies, and tax breaks from New York to build his real estate business.

Someone could have taken this largesse and worked hard to grow it into something more, but Donald Trump was not that someone. Much of his fortune comes not from the down and dirty work of running businesses, but from slapping his name on everything from golf courses to steaks. Many of these deals entail merely licensing his name while a developer actually runs things. And as president, he still doesn’t seem inclined to clock much time doing actual work.

That hasn’t stopped him from putting work at the center of his administration’s poverty-related policies. In its lengthy tome, the White House Council of Economic Advisers argued for adding work requirements to a new universe of public benefits. These requirements, which until the Trump administration existed only for direct cash assistance and food stamps, require a recipient not just to put in a certain number of hours at a job or some other qualifying activity, but to amass paperwork to prove those hours each month. The CEA report is focused, supposedly, on “the importance and dignity of work.” But the benefits of engaging in labor are only deemed important for a particular population: “welfare recipients who society expects to work.” Over and over, it takes for granted that our country only expects the poorest to work in order to prove themselves worthy of government funds, specifically targeting those who get food stamps to feed their families, housing assistance to keep roofs over their heads, and Medicaid to stay healthy.

* * *

The report doesn’t just represent an ethos in the administration; it also justifies concrete actions it had already taken and more it would soon roll out. Last April, Trump signed an executive order directing federal agencies to review public assistance programs to see whether they could impose work requirements unilaterally, to “ensure that they are consistent with principles that are central to the American spirit — work, free enterprise, and safeguarding human and economic resources,” as the document states, while also “reserving public assistance programs for those who are truly in need.”

The administration has also pushed forward on its own. In 2017, it announced that states could apply for waivers that would allow them to implement work requirements in Medicaid for the first time, and so far more than a dozen states have taken it up on the offer, with Arkansas’s rule in effect since June 2018. (It has now been halted by a federal judge.) In that state, Medicaid recipients had to spend 80 hours a month at work, school, or volunteering, and report those activities to the government in order to keep getting health insurance. And in April 2018, Housing and Urban Development Secretary Ben Carson unveiled a proposal to let housing authorities implement work requirements for public housing residents and rental assistance recipients. Trump pushed Congress to include more stringent work requirements in the food stamp program as it debated the most recent farm bill, arguing it would “get America back to work.” When that effort failed, the Agriculture Department turned around and proposed a rule to impose the requirements by itself.

These aren’t fiscal necessities — they’re crackdowns on the poor, justified by the idea that people should prove themselves worthy of the benefits that help them survive. The requirements are not just cruel but out of step with real life. Most people who turn to public programs already work, and those who don’t often have good reason. More than 60 percent of people on Medicaid are working. They remain on Medicaid because their pay isn’t enough to keep them out of poverty, and many of the low-wage jobs they work don’t offer health insurance they can afford. Of those not working, most either have a physical impairment or conflicting responsibilities like school or caregiving.

Enrollment in food stamps tells the same story. Among the “work-capable” adults on food stamps, about two thirds work at some point during the year, while 84 percent live in a household where someone works. But low-wage work is often chaotic and unpredictable. Recipients are more likely to turn to food stamps during a spell of unemployment or too few hours, then stop when they resume steadier employment. Many of those who are supposedly capable of work but don’t have a job have a health barrier or live with someone who has one; they’re in school, they’re caring for family, or they just can’t find work in their community.

Work requirements, then, fail to account for the reality of poor people’s lives. It’s not that there’s a widespread lack of work ethic among people who earn the least, but that there’s a lack of steady pay and consistent opportunities that allow someone to sustain herself and her family without assistance. We also know work requirements just don’t work. They’ve existed in the Temporary Assistance for Needy Families cash-assistance program for decades, yet they don’t help people find meaningful, lasting work; instead they serve as a way to shove them out of programs they desperately need. The result is more poverty, not more jobs.

If this country were so concerned about helping people who might face barriers to working get jobs, we might not be the second-lowest among OECD member countries by percentage of GDP spent on labor-market programs like job-search assistance or retraining. The poor in particular face barriers like a lack of affordable childcare and reliable transportation, and could use education or training to reach for better-paid, more meaningful work. But we do little to extend these supports. Instead, we chastise them for not pulling on their frayed bootstraps hard enough.

We also seem content with the notion that a person who doesn’t work — either out of inability or refusal — doesn’t deserve the building blocks of staying alive. The programs Trump is targeting, after all, are about basic needs: housing to stay safe from the elements, food to keep from going hungry, healthcare to receive treatment and avoid dying of neglect. Even if it were true that there was a horde of poor people refusing to work, do we want to condemn them to starvation and likely death? In one of the world’s richest countries, do we really balk at spending money on keeping our people — even lazy ones — alive?


Plenty of other countries don’t balk. Single mothers experience higher rates of destitution than coupled parents or people without children all over the world. But the higher poverty rate in the U.S. as compared to other developed countries isn’t because we have more single mothers; instead, it’s because we do so little to help them. Compare us to Denmark, which gives parents unconditional cash benefits for each of their children regardless of whether or how much they work, on top of generously subsidizing childcare, offering universal health coverage, and guaranteeing paid leave. It’s no coincidence that they also have a lower poverty rate, both generally and for single mothers specifically. A recent examination of poverty across countries found that children are at higher risk in the U.S. because we have a sparse social safety net that’s so closely tied to demanding that people work. It makes us an international outlier, the world’s miser that only opens a clenched fist to the poor if they’re willing to demonstrate their worthiness first.

Here, too, America’s history of slavery and ongoing racism rears its head. According to a trio of renowned economists, we don’t have a European-style social safety net because “racial animosity in the U.S. makes redistribution to the poor, who are disproportionately black, unappealing to many voters.” White people turn against funding public benefit programs when they feel their racial status threatened, particularly benefits they (falsely) believe mainly accrue to black people. The black poor are seen as the most undeserving of help and most in need of proving their worthiness to get it. States with larger percentages of black residents, for example, focus less on TANF’s goal of providing cash to the needy and have stingier benefits with higher hurdles to enrollment.

* * *

The CEA’s report on work requirements claimed that being an adult who doesn’t work is particularly prevalent among “those living in low-income households.” But that’s debatable. The more income someone has, the less likely he is to be getting it from wages. In 2012, those earning less than $25,000 a year made nearly three quarters of that money from a job. Those making more than $10 million, on the other hand, made about half of their money from capital gains — in other words, returns on investments. The bottom half of the country takes in, on average, just $826 each in income from capital investments; the average for those in the top 1 percent is more than $16 million.

The richest are the least likely to have their money come from hard labor — yet there’s no moral panic over whether they’re coddled or lacking in self-reliance. Instead, government benefits help the rich protect and grow idle wealth. Capital gains and dividends are taxed at a lower rate than regular salaried income. Inheritances were taxed at an average rate of 4 percent in 2009, compared to the average rate of 18 percent for money earned by working and saving. When investments are bequeathed, the recipient owes no taxes on any asset appreciation.


In fact, government tax benefits that increase people’s take-home money at the expense of what the government collects for its own coffers overwhelmingly benefit the rich over the poor (or even the middle class). More than 60 percent of the roughly $900 billion in annual tax expenditures goes to the richest 20 percent of American families. That figure dwarfs what the government expends on many public benefit programs. The government spends more than three times as much on tax subsidies for homeowners, mostly captured by the well-to-do, as it does on rental assistance for the poor. The three benefit programs the Trump administration is concerned with — Medicaid, food stamps, and housing assistance — come to about $705 billion in combined spending.

While the administration has been concerned with what it can do to compel the poor to work, it’s handed out more largesse to the idle rich. Its signature tax-cut package, the Tax Cuts and Jobs Act, offered an extra cut for so-called “pass-through” businesses, like law or real estate firms. But the fine print included a wrinkle: If someone is considered actively involved in his pass-through business, only 30 percent of his earnings could qualify for the new discount. If someone is passively involved, however — a shareholder who doesn’t have much to do with the day-to-day work of the company — then he gets 100 percent of the new benefit.

Then there’s the law’s significant lowering of the estate tax. The tax is levied on only the biggest, most valuable inheritances passed down from wealthy parent to newly wealthy child. Before the Republicans’ tax bill, only the richest 0.2 percent of estates had to pay the tax when fortunes changed hands. Now it’s just the richest 0.1 percent, or a mere 1,800 very wealthy families worth more than $22 million. The rest get to pass money to their heirs tax-free. Those who do pay will pay less when the tax comes due — $4.4 million less, to be exact.

Despite the Republican rhetoric that lowering the estate tax is about saving family farms, it’s really about allowing an aristocracy to calcify — one in which rich parents ensure their children are rich before they lift a single finger in work. As those heirs receive their fortunes, they also receive the blessing that comes with riches: the halo of success and, therefore, deservedness without having to work to prove it. Yet there’s evidence that increasing taxes on inheritances has the potentially salutary effect of getting heirs to work more. The more their inheritances are taxed, the more they end up paying in labor taxes — evidence that they’re working harder for their livings, not just coasting on generational wealth. Perhaps our tax code could encourage rich heirs to experience the dignity of work.

* * *

Trump’s CEA report is accurate about at least one thing: Our country has a history of only offering public benefits to the poor either deemed worthy through their work or exempt through old age or disability. An outlier was the Aid to Families with Dependent Children program, which became Temporary Assistance for Needy Families after Bill Clinton signed welfare reform into law in the ’90s. But the 1996 transformation of the program took what was a promise of cash for poor mothers and changed it into an obstacle course of proving a mother’s worth before she can get anywhere close to a check. It paved the way for the current administration’s obsession with work requirements.

Largesse for the rich, on the other hand, has rarely included such tests. No one has been made to pee in a cup for tax breaks on their mortgages, which cost as much as the food stamp program but overwhelmingly benefit families that earn more than $100,000. No one has had to prove a certain number of work hours to get a lower tax rate on investment income or an inheritance. They get that discount on their money without having to do any work at all.

We haven’t always been so extreme in our dichotomous treatment of the rich and poor; throughout the 1940s, ’50s, and ’60s, we coupled high marginal taxes on the wealthy with a minimum wage that ensured that people who put in full-time work could rise out of poverty. The estate tax has been as high as 77 percent. As Dutch historian Rutger Bregman recently told an audience of the ultrawealthy at Davos, we’re living proof that high taxes can spread shared prosperity. “The United States, that’s where it has actually worked, in the 1950s, during Republican President Eisenhower,” he pointed out. “This is not rocket science.” It was during the same era that we also created significant anti-poverty programs such as Social Security, Medicare, and Medicaid. In fact, this country pioneered the idea of progressive taxation and has always had some form of tax on inheritance to avoid creating an aristocracy. But we’ve papered over that history as tax rates have cratered and poverty has climbed.

Instead, as Reaganomics and neoliberal ideas took hold of our politics, we turned back to the Horatio Alger myth that success is attained on an individual basis by hard work alone, and that riches are the proof of a dogged drive. Lower tax rates naturally follow under the theory that the rich should keep more of their deserved bounty. And if you’re poor, coming to the government seeking a helping hand up, you failed.

The country is due for a reckoning with its obsession with work. There are certainly financial and emotional benefits that come from having a job. But why are we only concerned with whether the poor reap those benefits? Is working ourselves to the bone the best signifier of our worth — and are there basic elements of life that we should guarantee regardless of work? That doesn’t mean dropping all emphasis on work ethic. But it does require a deeper examination of who we expect to work — and why.

* * *

Bryce Covert is an independent journalist writing about the economy and a contributing op-ed writer at The New York Times.

Editor: Michelle Weber
Fact checker: Ethan Chiel
Copy editor: Jacob Z. Gross   

A Rich Awakening

iStock / Getty Images Plus

Soraya Roberts | Longreads | April 2019 | 9 minutes (2,392 words)

We all know the stats, that by 2030 the richest 1 percent could be hoarding two-thirds of the world’s wealth. Tax the rich! Redistribute to the poor! It’s the kind of thing you hear lately set to some lame music in a weirdly cut NowThis News video of Alexandria Ocasio-Cortez or Rutger Bregman. (It’s always some scrappy progressive, not some bloated billionaire because, I don’t know, *yawns, eats some cake.*) Perhaps the rich will be moved by the fact that income inequality is bad not only for the collective mental health, but for their own? No? That the 10 percent’s multiplying accessories — private jets and yachts and enormous holiday homes — hog nearly half the world’s emissions, killing the earth we all share? No? Nothing? What’s that you say, infrastructure investment started plummeting just as inequality began rising? But all the philanthropy! Which, sure, America’s largest donors may give a little more than before, but they also make way more than they used to. And as Jacobin magazine recently noted, “those nations — mostly in Scandinavia — that have the highest levels of equality and social well-being have the tiniest philanthropic sectors.” When you have equality, you don’t need long Greek words.

To recognize this, as a rich person, you need to have a sort of reverse double consciousness. “Double consciousness” originates with W. E. B. Du Bois, one of the founders of the NAACP, who coined it in 1897 as one way to describe the experience of being an African American in a white supremacist world. In The Atlantic Monthly he defined it as “…this sense of always looking at one’s self through the eyes of others….” The concept is based on being oppressed. What I’m talking about is an inverted version based on being the oppressor. It is the recognition that not only do you have outsized means, but that they come at the expense of others. It requires not only self-awareness, but other-awareness, and it’s a prerequisite for change.

Roy Disney’s granddaughter, Abigail, for instance, has given $70 million away over the past four decades, which is more than she ever inherited. “The problem is that there’s a systematic favoring of people who have accumulated an enormous amount of wealth,” she tweeted after a viral appearance on CNBC last month in which she said CEOs were overpaid. “The U.S. must make structural changes by taxing the wealthy.” To say that, she had to have had some kind of awakening — but what was it? In her case it was a sudden burst of extraordinary wealth and its human toll — not on others, but on the wealthy themselves. In 1984, when the heiress was in college, Michael Eisner became the chairman and CEO of Disney and launched its stock into the stratosphere. Abigail’s father embraced the excess income — the too-big private jet, the too-much drinking — and no one questioned him, not even about his alcoholism. “That’s when I feel that my dad really lost his way in life. And that’s why I feel hyperconscious about what wealth does to people,” she recently told The Cut. “I lived in one family as a child, and then I didn’t even recognize the family as I got older.”