In 2014, Rich Niemann, president and CEO of the Midwestern grocery company Niemann Foods, made the most important phone call of his career. He dialed the Los Angeles office of Shook Kelley, an architectural design firm, and admitted he saw no future in the traditional grocery business. He was ready to put aside a century of family knowledge, throw away all his assumptions, completely rethink his brand and strategy — whatever it would take to carry Niemann Foods deep into the 21st century.
“I need a last great hope strategy,” he told Kevin Kelley, the firm’s cofounder and principal. “I need a white knight.”
Part square-jawed cattle rancher, part folksy CEO, Niemann is the last person you’d expect to ask for a fresh start. He’s spent his whole life in the business, transforming the grocery chain his grandfather founded in 1917 into a regional powerhouse with more than 100 supermarkets and convenience stores across four states. In 2014, he was elected chair of the National Grocers Association. It’s probably fair to say no one alive knows how to run a grocery store better than Rich Niemann. Yet Niemann was no longer sure the future had a place for stores like his.
He was right to be worried. The traditional American supermarket is dying. It’s not just Amazon’s purchase of Whole Foods, an acquisition that trade publication Supermarket News says marked “a new era” for the grocery business — or the fact that Amazon hopes to launch a second new grocery chain in 2019, according to a recent report from The Wall Street Journal, with a potential plan to scale quickly by buying up floundering supermarkets. Even in plush times, grocery is a classic “red ocean” industry, highly undifferentiated and intensely competitive. (The term summons the image of a sea stained with the gore of countless skirmishes.) Now, the industry’s stodgy old playbook — “buy one, get one” sales, coupons in the weekly circular — is hurtling toward obsolescence. And with new ways to sell food ascendant, legacy grocers like Rich Niemann are failing to bring back the customers they once took for granted. You no longer need grocery stores to buy groceries.
It was in this atmosphere of imminent doom that Niemann hired Kelley. The assignment: to conceive, design, and build the grocery store of the future. Niemann was ready to entertain any idea and invest heavily. And for Kelley, a man who has spent decades honing his vision for what the grocery store should do and be, it was the opportunity of a lifetime — carte blanche to build the working model he’s long envisioned, one he believes can save the neighborhood supermarket from obscurity.
Kevin Kelley, illustration by Vinnie Neuberg
Rich Niemann, illustration by Vinnie Neuberg
The store that resulted, Harvest Market, opened in 2016. It’s south of downtown Champaign, Illinois, out by the car dealerships and strip malls; 58,000 square feet of floor space mostly housed inside a huge, high-ceilinged glass barn. Its bulk calls to mind both the arch of a hayloft and the heavenward jut of a church. But you could also say it’s shaped like an ark, because it’s meant to survive an apocalypse.
Harvest Market is the anti-Amazon. It’s designed to excel at what e-commerce can’t do: convene people over the mouth-watering appeal of prize ingredients and freshly prepared food. The proportion of groceries sold online is expected to swell over the next five or six years, but Harvest is a bet that behavioral psychology, spatial design, and narrative panache can get people excited about supermarkets again. Kelley isn’t asking grocers to be more like Jeff Bezos or Sam Walton. He’s not asking them to be ruthless, race-to-the-bottom merchants. In fact, he thinks that grocery stores can be something far greater than we ever imagined — a place where farmers and their urban customers can meet, a crucial link between the city and the country.
But first, if they’re going to survive, Kelley says, grocers need to start thinking like Alfred Hitchcock.
* * *
Kevin Kelley is an athletic-looking man in his mid-50s, with a piercing hazel gaze that radiates thoughtful intensity. In the morning, he often bikes two miles to Shook Kelley’s office in Hollywood — a rehabbed former film production studio on an unremarkable stretch of Melrose Avenue, nestled between Bogie’s Liquors and a driving school. Four nights a week, he visits a boxing gym to practice Muay Thai, a form of martial arts sometimes called “the art of eight limbs” for the way it combines fist, elbow, knee, and shin attacks. “Martial arts,” Kelley tells me, “are a framework for handling the unexpected.” That’s not so different from his main mission in life: He helps grocery stores develop frameworks for the unexpected, too.
You’ve never heard of him, but then it’s his job to be invisible. Kelley calls himself a supermarket ghostwriter: His contributions are felt more than seen, and the brands that hire him get all the credit. Countless Americans have interacted with his work in intimate ways, but will never know his name. Such is the thankless lot of the supermarket architect.
A film buff equally fascinated by advertising and the psychology of religion, Kelley has radical theories about how grocery stores should be built, theories that involve terms like “emotional opportunity,” “brain activity,” “climax,” and “mise-en-scène.” But before he can talk to grocers about those concepts, he has to convince them of something far more elemental: that their businesses face near-certain annihilation and must change fundamentally to avoid going extinct.
“It is the most daunting feeling when you go to a grocery store chain, and you meet with these starched-white-shirt executives,” Kelley tells me. “When we get a new job, we sit around this table — we do it twenty, thirty times a year. Old men, generally. Don’t love food, progressive food. Just love their old food — like Archie Bunkers, essentially. You meet these people and then you tour their stores. Then I’ve got to go convince Archie Bunker that there’s something called emotions, that there are these ideas about branding and feeling. It is a crazy assignment. I can’t get them to see that they’re no longer in a situation where they’ve got plenty of customers. That it’s do-or-die time now.”
Forget branding. Forget sales. Kelley’s main challenge is redirecting the attention of older male executives, scared of the future and yet stuck in their ways, to the things that really matter.
“I make my living convincing male skeptics of the power of emotions,” he says.
Human beings, it turns out, aren’t very good at avoiding large-scale disaster. As you read this, the climate is changing, thanks to the destructively planet-altering activities of our species. The past four years have been the hottest on record. If the trend continues — and virtually all experts agree it will — we’re likely to experience mass disruptions on a scale never before seen in human history. Drought will be epidemic. The ocean will acidify. Islands will be swallowed by the sea. People could be displaced by the millions, creating a new generation of climate refugees. And all because we didn’t move quickly enough when we still had time.
You know this already. But I bet you’re not doing much about it — not enough, at least, to help avert catastrophe. I’ll bet your approach looks a lot like mine: worry too much, accomplish too little. The sheer size of the problem is paralyzing. Vast, systemic challenges tend to short-circuit our primate brains. So we go on, as the grim future bears down.
Grocers, in their own workaday way, fall prey to the same inertia. They got used to an environment of relative stability. They don’t know how to prepare for an uncertain future. And they can’t force themselves to behave as if the good times are really going to go away — even if, deep down, they know it’s true.
In the 1980s, you could still visit almost any community in the U.S. and find a thriving supermarket. Typically, it would be a dynasty family grocery store, one that had been in business for a few generations. Larger markets usually had two or three players, small chains that sorted themselves out along socioeconomic lines: fancy, middlebrow, thrifty. Competition was slack and demand — this is the beautiful thing about selling food — never waned. For decades, times were good in the grocery business. Roads and schools were named after local supermarket moguls, who often chaired their local chambers of commerce. “When you have that much demand, and not much competition, nothing gets tested. Kind of like a country with a military that really doesn’t know whether their bullets work,” Kelley says. “They’d never really been in a dogfight.”
It’s hard to believe now, but there was not a single Walmart on the West Coast until 1990. That decade saw the birth of the “hypermarket” and the beginning of the end for traditional grocery stores — Walmarts, Costcos, and Kmarts became the first aggressive competition supermarkets ever really faced, luring customers in with the promise of one-stop shopping on everything from Discmen to watermelon.
The other bright red flag: Americans started cooking at home less and eating out more. In 2010, Americans dined out more than in for the first time on record, the culmination of a slow shift away from home cooking that had been going on since at least the 1960s. That trend is likely to continue. According to a 2017 report from the USDA’s Economic Research Service, millennials shop at food stores less than any other age group, spend less time preparing food, and are more likely to eat carry-out, delivery, or fast food even when they do eat at home. But even within the shrinking market for groceries, competition has stiffened. Retailers not known for selling food increasingly specialize in it, a phenomenon called “channel blurring”; today, pharmacies like CVS sell pantry staples and packaged foods, while dollar stores like Dollar General are a primary source of groceries for a growing number of Americans. Then there’s e-commerce. Though only about 3 percent of groceries are currently bought online, that figure could rocket to 20 percent by 2025. From subscription meal-kit services like Blue Apron to online markets like FreshDirect and Amazon Fresh, shopping for food has become an increasingly digital endeavor — one that sidesteps traditional grocery stores entirely.
A cursory glance might suggest grocery stores are in no immediate danger. According to the data analytics company Inmar, traditional supermarkets still have a 44.6 percent market share among brick-and-mortar food retailers. And though a spate of bankruptcies has recently hit the news, there are actually more grocery stores today than there were in 2005. Compared to many industries — internet service, for example — the grocery industry is still a diverse, highly varied ecosystem. Forty-three percent of grocery companies have fewer than four stores, according to a recent USDA report. These independent stores sold 11 percent of the nation’s groceries in 2015, a larger collective market share than successful chains like Albertson’s (4.5 percent), Publix (2.25 percent), and Whole Foods (1.2 percent).
But looking at this snapshot without context is misleading — a little like saying that the earth can’t be warming because it’s snowing outside. Not long ago, grocery stores sold the vast majority of the food that was prepared and eaten at home — about 90 percent in 1988, according to Inmar. Today, their market share has fallen by more than half, even as groceries represent a diminished proportion of overall food sold. Their slice of the pie is steadily shrinking, as is the pie itself.
By 2025, the thinking goes, most Americans will rarely enter a grocery store. That’s according to a report called “Surviving the Brave New World of Food Retailing,” published by the Coca-Cola Retailing Research Council — a think tank sponsored by the soft drink giant to help retailers prepare for major changes. The report describes a retail marketplace in the throes of massive change, where supermarkets as we know them are functionally obsolete. Disposables and nonperishables, from paper towels to laundry detergent and peanut butter, will replenish themselves automatically, thanks to smart-home sensors that reorder when supplies are low. Online recipes from publishers like Epicurious will sync directly to digital shopping carts operated by e-retailers like Amazon. Impulse buys and last-minute errands will be fulfilled via Instacart and whisked over in self-driving Ubers. In other words, food — for the most part — will be controlled by a small handful of powerful tech companies.
The Coca-Cola report, written in consultation with a handful of influential grocery executives, including Rich Niemann, acknowledges that the challenges are dire. To remain relevant, it concludes, supermarkets will need to become more like tech platforms: develop a “robust set of e-commerce capabilities,” take “a mobile-first approach,” and leverage “enhanced digital assets.” They’ll need infrastructure for “click and collect” purchasing, allowing customers to order online and pick up in a jiffy. They’ll want to establish a social media presence, as well as a “chatbot strategy.” In short, they’ll need to become Amazon, and they’ll need to do it all while competing with Walmart — and its e-commerce platform, Jet.com — on convenience and price.
That’s why Amazon’s acquisition of Whole Foods Market was terrifying to so many grocers, sending the stocks of national chains like Kroger tumbling: It represents a future they can’t really compete in. Since August 2017, Amazon has masterfully integrated e-commerce and physical shopping, creating a muscular hybrid that represents an existential threat to traditional grocery stores. The acquisition was partially a real estate play: Whole Foods stores with Prime lockers now act as a convenient pickup depot for Amazon goods. But Amazon’s also doing its best to make it too expensive and inconvenient for its Prime members, who pay $119 a year for free two-day shipping and a host of other perks, to shop anywhere else. Prime members receive an additional 10 percent discount on select goods at Whole Foods, and Amazon is rolling out home grocery delivery in select areas. With the Whole Foods acquisition, then, Amazon cornered two markets: the thrift-driven world of e-commerce and the pleasure-seeking universe of high-end grocery. Order dish soap and paper towels in bulk on Amazon, and pick them up at Whole Foods with your grass-fed steak.
An Amazon worker wheels back a cart after loading a bag of groceries into a customer’s car at an AmazonFresh Pickup location in Seattle. (AP Photo/Elaine Thompson, File)
Ingredients from a three-meal Blue Apron box. (AP Photo/Bree Fowler)
An employee of grocery delivery service Amazon Fresh scans ordered products before putting them into a transport bag. (Monika Skolimowska/picture-alliance/dpa/AP Images)
Traditional grocers are now expected to offer the same combination of convenience, flexibility, selection, and value. They’re understandably terrified by this scenario, which would require fundamental, complex, and very expensive changes. And Kelley is terrified of it, too, though for a different reason: He simply thinks it won’t work. In his view, supermarkets will never beat Walmart and Amazon at what they do best. If they try to succeed by that strategy alone, they’ll fail. That prospect keeps Kelley up at night — because it could mean a highly consolidated marketplace overseen by just a handful of players, one in stark contrast to the regional, highly varied food retail landscape America enjoyed throughout the 20th century.
“I’m afraid of what could happen if Walmart and Amazon and Lidl are running our food system, the players trying to get everything down to the lowest price possible,” he tells me. “What gives me hope is the upstarts who will do the opposite. Who aren’t going to sell convenience or efficiency, but fidelity.”
The approach Kelley’s suggesting still means completely overhauling everything, with no guarantee of success. It’s a strategy that’s decidedly low-tech, though it’s no less radical. It’s more about people than new platforms. It means making grocery shopping more like going to the movies.
* * *
Nobody grows up daydreaming about designing grocery stores; Kelley was no exception. As a student at the University of North Carolina at Charlotte, he was just like every other architect-in-training: He wanted to be a figure like Frank Gehry, building celebrated skyscrapers and cultural centers. But he came to feel dissatisfied with the culture of his profession. In his view, architects coldly fixate on the aesthetics of buildings and aren’t concerned enough with the people inside.
“Architecture worships objects, and Capital-A architects are object makers,” Kelley tells me. “They aren’t trying to fix social issues. People and their experience and their perceptions and behaviors don’t matter to them. They don’t even really want people in their photographs — or if they have to, they’ll blur them out.” What interested Kelley most was how people would use his buildings, not how the structures would fit into the skyline. He wanted to shape spaces in ways that could actually affect our emotions and personalities, bringing out the better angels of our nature. To his surprise, no one had really quantified a set of rules for how environment could influence behavior. Wasn’t it strange that advertising agencies spent so much time thinking about the links between storytelling, emotions, and decision-making — while commercial spaces, the places where we actually go to buy, often had no design principle beyond brute utility?
“My ultimate goal was to create a truly multidisciplinary firm that was comprised of designers, social scientists and marketing types,” he says. “It was so unorthodox and so bizarrely new in terms of approach that everyone thought I was crazy.”
In 1992, when he was 28, Kelley cofounded Shook Kelley with the Charlotte, North Carolina–based architect and urban planner Terry Shook. Their idea was to offer a suite of services that bridged social science, branding, and design, a new field they called “perception management.” They were convinced space could be used to manage emotion, just the way cinema leads us through a guided sequence of feelings, and wanted to turn that abstract idea into actionable principles. While Shook focused on bigger, community-oriented spaces like downtown centers and malls, Kelley focused on the smaller, everyday commercial spaces overlooked by fancy architecture firms: dry cleaners, convenience stores, eateries, bars. One avant-garde restaurant Kelley designed in Charlotte, called Props, was an homage to the sitcom craze of the 1990s. It was built to look like a series of living rooms, based on the apartment scenes in shows like Seinfeld and Friends, and featured couches and easy chairs instead of dining tables to encourage guests to mingle during dinner.
The shift to grocery stores didn’t happen until a few years later, almost by accident. In the mid-’90s, Americans still spent about 55 percent of their food dollars on meals eaten at home — but that share was declining quickly enough to concern top corporate brass at Harris Teeter, a grocery chain based in the Charlotte, North Carolina, area, with stores throughout the Southeastern United States. (Today, Harris Teeter is owned by Kroger, the country’s second-largest seller of groceries behind Walmart.) Harris Teeter execs reached out to Shook Kelley. “We hear you’re good with design, and you’re good with food,” Kelley remembers Harris Teeter reps saying. “Maybe you could help us.”
At first, it was Terry Shook’s account. He rebuilt each section of the store into a distinct “scene” that reinforced the themes and aesthetics of the type of food it sold. The deli counter became a mocked-up urban delicatessen, complete with awning and neon sign. The produce section resembled a roadside farmstand. The dairy cases were corrugated steel silos, emblazoned with the logo of a local milk supplier. And he introduced full-service cafés, a novelty for grocery stores at the time, with chrome siding like a vintage diner. It was pioneering work, winning that year’s Outstanding Achievement Award from the International Interior Design Association — according to Kelley, it was the first time the prestigious award had ever been given to a grocery store.
Shook backed off of grocery stores after launching the new Harris Teeter, but the experience sparked Kelley’s lifelong fascination with grocery stores, which he realized were ideal proving grounds for his ideas about design and behavior. Supermarkets contain thousands of products, and consumers make dozens of decisions inside them — decisions about health, safety, family, and tradition that get to the core of who they are. He largely took over the Harris Teeter account and redesigned nearly 100 of the chain’s stores, work that would go on to influence the way the industry saw itself and ultimately change the way stores are built and navigated.
Since then, Kelley has worked to show grocery stores that they don’t have to worship at the altar of supply-side economics. He urges grocers to appeal instead to our humanity. Kelley asks them to think more imaginatively about their stores, using physical space to evoke nostalgia, delight our senses, and appeal to the parts of us motivated by something bigger and more generous than plain old thrift. Shopping, for him, is all about navigating our personal hopes and fears, and grocery stores will only succeed when they play to those emotions.
When it works, the results are dramatic. Between 2003 and 2007, Whole Foods hired Shook Kelley for brand strategy and store design, working with the firm throughout a crucial period of the chain’s development. The fear was that as Whole Foods grew, its image would become too diffuse, harder to differentiate from other health food stores; at the same time, the company wanted to attract more mainstream shoppers. Kelley’s team was tasked with finding new ways to telegraph the brand’s singular value. Their solution was a hierarchical system of signage that would streamline the store’s crowded field of competing health and wellness claims.
Kelley’s view is that most grocery stores are “addicted” to signage, cramming their spaces with so many pricing details, promotions, navigational signs, ads, and brand assets that it “functionally shuts down [the customer’s] ability to digest the information in front of them.”
Kelley’s team stipulated that Whole Foods could only have seven layers of information, which ranged from evocative signage 60 feet away to descriptive displays six feet from customers to promotional info just six inches from their hands. Everything else was “noise,” and jettisoned from the stores entirely. If you’ve ever shopped at Whole Foods, you probably recognize the way that the store’s particular brand of feel-good, hippie sanctimony seems to permeate your consciousness at every turn. Kelley helped invent that. The system he created for pilot stores in Princeton, New Jersey, and Louisville, Kentucky, was scaled throughout the chain and is still in use today, he says. (Whole Foods did not respond to requests for comment for this story.)
With a carefully delineated set of core values guiding its purchasing and brand, Whole Foods was ripe for the kind of visual overhaul Kelley specializes in. But most regional grocery chains have a different set of problems: They don’t really have values to telegraph in the first place. Shook Kelley’s approach is about getting buttoned-down grocers to reflect on their beliefs, tapping into deeper, more primal reasons for wanting to sell food.
* * *
Today, Kelley and his team have developed a playbook for clients, a finely tuned process to get shoppers to think in terms that go beyond bargain-hunting. It embraces what he calls “the theater of retail” and draws inspiration from an unlikely place: the emotionally laden visual language of cinema. His goal is to convince grocers to stop thinking like Willy Loman — like depressed, dejected salesmen forever peddling broken-down goods, fixated on the past and losing touch with the present. In order to survive, Kelley says, grocers can’t be satisfied with providing a place to complete a chore. They’ll need to direct an experience.
Today’s successful retail brands establish what Kelley calls a “brand realm,” or what screenwriters would call a story’s “setting.” We don’t usually think consciously about them, but realms subtly shape our attitude toward shopping the same way the foggy, noirishly lit streets in a Batman movie tell us something about Gotham City. Cracker Barrel is set in a nostalgic rural house. Urban Outfitters is set on a graffitied urban street. Tommy Bahama takes place on a resort island. It’s a well-known industry secret that Costco stores are hugely expensive to construct — they’re designed to resemble fantasy versions of real-life warehouses, and the appearance of thrift doesn’t come cheap. Some realms are even more specific and fanciful: Anthropologie is an enchanted attic, complete with enticing cupboards and drawers. Trader Joe’s is a crew of carefree, hippie traders shipping bulk goods across the sea. A strong sense of place helps immerse us in a store, getting us emotionally invested and (perhaps) ready to suspend the critical faculties that prevent a shopping spree.
Kelley takes this a few steps further. The Shook Kelley team, which includes a cultural anthropologist with a Ph.D., begins by conducting interviews with executives, staff, and locals, looking for the storytelling hooks they call “emotional opportunities.” These can stem from core brand values, but often revolve around the most intense, place-specific feelings locals have about food. Then Kelley finds ways to place emotional opportunities inside a larger realm with an overarching narrative, helping retailers tell those stories — not with shelves of product, but through a series of affecting “scenes.”
In Alberta, Canada, Shook Kelley redesigned a small, regional grocery chain now called Freson Bros. Fresh Market. In interviews, the team discovered that meat-smoking is a beloved pastime there, so Shook Kelley built huge, in-store smokers at each new location — a scene called “Banj’s Smokehouse” — that crank out pound after pound of the province’s signature beef, as well as elk, deer, and other kinds of meat (customers can even BYO meat to be smoked in-house). Kelley also designed stylized root cellars in each produce section, a cooler, darker corner of each store that nods to the technique Albertans use to keep vegetables fresh. These elements aren’t just novel ways to taste, touch, and buy. They reference cultural set points, triggering memories and personal associations. Kelley uses these open, aisle-less spaces, which he calls “perceptual rooms,” to draw customers through an implied sequence of actions, tempting them towards a specific purchase.
Something magical happens when you engage customers this way. Behavior changes in visible, quantifiable ways. People move differently. They browse differently. And they buy differently. Rather than progressing in a linear fashion, the way a harried customer might shoot down an aisle — Kelley hates aisles, which he says encourage rushed, menial shopping — customers zig-zag, meander, revisit. These behaviors are a sign a customer is “experimenting,” engaging with curiosity and pleasure rather than just trying to complete a task. “If I was doing a case study presentation to you, I would show you exact conditions where we don’t change the product, the price, the service. We just change the environment and we’ll change the behavior,” Kelley tells me. “That always shocks retailers. They’re like ‘Holy cow.’ They don’t realize how much environment really affects behavior.”
In the mid-2000s, Nabisco approached Kelley’s firm, complaining that sales were down 16 percent in the cookie-and-cracker aisle. In response, Shook Kelley designed “Mom’s Kitchen,” which was piloted at Buehler’s, a 15-store chain in northern Ohio. Kelley took Nabisco’s products out of the center aisles entirely and installed them in a self-contained zone: a perceptual room built out to look like a nostalgic vision of suburban childhood, all wooden countertops, tile, and hanging copper pans. Shelves of Nabisco products from Ritz Crackers to Oreos lined the walls. Miniature packs of Animal Crackers waited in a large bowl; drawers opened to reveal boxes of Saltines. The finishing touch had nothing to do with Nabisco and everything to do with childhood associations: Kelley had the retailers install fridge cases filled with milk, backlit and glowing. Who wants to eat Oreos without a refreshing glass of milk to wash them down?
The store operators weren’t sold. They found it confusing and inconvenient to stock milk in two places at once. But from a sales perspective, the experiment was a smash. Sales of Nabisco products increased by as much as 32 percent, and the entire cookie-and-cracker segment experienced a halo effect, seeing double-digit jumps. Then, the unthinkable: The stores started selling out of milk. They simply couldn’t keep it on the shelves.
You’d think that the grocery stores would be thrilled, that it would have them scrambling to knock over their aisles of goods, building suites of perceptual rooms. Instead, they retreated. Nabisco’s parent company at the time, Kraft, was excited by the results and kicked the idea over to a higher-up corporate division, where it stalled. And Buehler’s, for its part, never did anything to capitalize on its success. When Nabisco took the “Mom’s Kitchen” displays down, Kelley says, the stores didn’t replace them.
Mom’s Kitchen, fully stocked. (Photo by Tim Buchman)
“We were always asking a different question: What is the problem you’re trying to solve through food?” Kelley says. “It’s not just a refueling exercise — instead, what is the social, emotional issue that food is solving for us? We started trying to work that into grocery. But we probably did it a little too early, because they weren’t afraid enough.”
Since then, Kelley has continued to build his case to unreceptive audiences of male executives, with mixed success. He tells them that when customers experiment — when the process of sampling, engaging, interacting, and evaluating an array of options becomes a source of pleasure — they tend to take more time shopping. And that the more time customers spend in-store, the more they buy. In the industry, this all-important metric is called “dwell time.” Most retail experts agree that increasing dwell without increasing frustration (say, with long checkout times) will be key to the survival of brick-and-mortar retail. Estimates vary on how much dwell time increases sales; according to Davinder Jheeta, creative brand director of the British supermarket Simply Fresh, customers spent 1.3 percent more for every 1 percent increase in dwell time in 2015.
Another way to increase dwell time? Offer prepared foods. Delis, cafes, and in-store restaurants increase dwell time and facilitate pleasure while operating with much higher profit margins and recapturing some of the dining-out dollar that grocers are now losing. “I tell my clients, ‘In five years, you’re going to be in the restaurant business,’” Kelley says, “‘or you’re going to be out of business.’”
Kelley’s job, then, is to use design in ways that get customers to linger, touch, taste, scrutinize, explore. The stakes are high, but the ambitions are startlingly low. Kelley often asks clients what he calls a provocative question: Rather than trying to bring in new customers, would it solve their problems if 20 percent of customers increased their basket size by just two dollars? The answer, he says, is typically an enthusiastic yes.
Just two more dollars per trip for every fifth customer — that’s what victory looks like. And failure? That looks like a food marketplace dominated by Walmart and Amazon, a world where the neighborhood supermarket is a thing of the past.
* * *
When Shook Kelley started working on Niemann’s account, things began the way they always did: looking for emotional opportunities. But the team was stumped. Niemann’s stores were clean and expertly run. There was nothing wrong with them. Niemann’s problem was that he had no obvious problem. There was no there there.
Many of the regionals Kelley works with have no obvious emotional hook; all they know is that they’ve sold groceries for a long time and would like to keep on selling them. When he asks clients what they believe in, they show him grainy black-and-white photos of the stores their parents and grandparents ran, but they can articulate little beyond the universal goal of self-perpetuation. So part of Shook Kelley’s specialty is locating the distinguishing spark in brands that do nothing especially well, which isn’t always easy. At Buehler’s Fresh Foods, the chain where “Mom’s Kitchen” was piloted, the store’s Shook Kelley–supplied emotional theme is “Harnessing the Power of Nice.”
Still, Niemann Foods was an especially challenging case. “We were like, ‘Is there any core asset here?’” Kelley told me. “And we were like, ‘No. You really don’t have anything.’”
What Kelley noticed most was how depressed Niemann seemed, how gloomy about the fate of grocery stores in general. Nothing excited him — with one exception. Niemann runs a cattle ranch, a family operation in northeast Missouri. “Whenever he talked about cattle and feed and antibiotics and meat qualities, his physical body would change. We’re like, ‘My god. This guy loves ranching.’ He only had three hundred cattle or something, but he had a thousand pounds of interest in it.”
Niemann’s farm now has about 600 cattle, though it’s still more hobby farm than full-time gig. But that passion ended up being a revelation. During an early phase of the process, someone brought up “So God Made a Farmer” — a speech radio host Paul Harvey gave at the 1978 Future Farmers of America Convention that had been used in an ad for Ram trucks in the previous year’s Super Bowl. It’s a short poem that imagines the eighth day of the biblical creation, where God looks down from paradise and realizes his new world needs a caretaker. What kind of credentials is God looking for? Someone “willing to get up before dawn, milk cows, work all day in the fields, milk cows again, eat supper and then go to town and stay past midnight at a meeting of the school board.” God needs “somebody willing to sit up all night with a newborn colt. And watch it die. Then dry his eyes and say, ‘Maybe next year.’” God needs “somebody strong enough to clear trees and heave bales, yet gentle enough to yean lambs and wean pigs and tend the pink-combed pullets, who will stop his mower for an hour to splint the broken leg of a meadow lark.” In other words, God needs a farmer.
Part denim psalm, part Whitmanesque catalogue, it’s a quintessential piece of Americana — hokey and humbling like a Norman Rockwell painting, and a bit behind the times (of course, the archetypal farmer is male). And when Kelley’s team played the crackling audio over the speakers in a conference room in Quincy, Illinois, something completely unexpected happened. Something that convinced Kelley that his client’s stores had an emotional core after all, one strong enough to provide the thematic backbone for a new approach to the grocery store.
Rich Niemann, the jaded supermarket elder statesman, broke down and wept.
* * *
I have never been a fan of shopping. Spending money stresses me out. I worry too much to enjoy it. So I wanted to see if a Kelley store could really be what he said it was, a meaningful experience, or if it would just feel fake and hokey. You know, like the movies. When I asked if there was one store I could visit to see his full design principles in action, he told me to go to Harvest, “the most interesting store in America.”
Champaign is two hours south of O’Hare by car. Crossing its vast landscape of unrelenting farmland, you appreciate the sheer scale of Illinois, how far the state’s lower half is from Chicago. It’s a college town, which comes with the usual trappings — progressive politics, cafes and bars, young people lugging backpacks with their earbuds in — but you forget that fast outside the city limits. In 2016, some townships in Champaign County voted for Donald Trump over Hillary Clinton by 50 points.
I was greeted in the parking lot by Gerry Kettler, Niemann Foods’ director of consumer affairs. Vintage John Deere tractors formed a caravan outside the store. The shopping cart vestibules were adorned with images of huge combines roving across fields of commodity crops. Outside the wide-mouthed entryway, local produce waited in picket-fence crates — in-season tomatoes from Johnstonville, sweet onions from Warrensburg.
And then we stepped inside.
Everywhere, sunlight poured in through the tall glass facade, illuminating a sequence of discrete, airy, and largely aisle-less zones. Kettler bounded around the store, pointing out displays with surprised joy on his face, as if he couldn’t believe his luck. The flowers by the door came from local growers like Delight Flower Farm and Illinois Willows. “Can’t keep this shit in stock,” he said. He made me hold an enormous jackfruit to admire its heft. He was right about the produce: it was beautiful, with more local options than I’d ever seen in a grocery store. The Warrensburg sweet corn was eye-poppingly cheap: two bucks a dozen. There were purple broccolini and clamshells filled with squash blossoms, a delicacy so temperamental it’s rarely sold outside of farmers’ markets. Early on, staff had to explain to some teenage cashiers what they were; the cashiers had never seen squash blossoms before.
I started to sense the “realm” Harvest inhabits: a distinctly red-state brand of America, local food for fans of faith and the free market. It’s hunting gear. It’s Chevys. It’s people for whom commercial-scale pig barns bring back memories of home. Everywhere, Shook Kelley signage — a hierarchy of cues like what Kelley dreamed up for Whole Foods — drives the message home. A large, evocative sign on the far wall reads Pure Farm Flavor, buttressed by the silhouettes of livestock, so large it almost feels subliminal. Folksy slogans hang on the walls, sayings like FULL OF THE MILK OF HUMAN KINDNESS and THE CREAM ALWAYS RISES TO THE TOP.
Then there are the informational placards that point out suppliers and methods.
There are at least a half dozen varieties of small-batch honey; you can find pastured eggs for $3.69. The liquor section includes local selections, like whiskey distilled in DeKalb and a display with cutting boards made from local wood by Niemann Foods’ HR manager. “Turns out we had some talent in our backyard,” Kettler said. Niemann’s willingness to look right under his nose, sidestepping middlemen distributors to offer reasonably priced, local goods, is a hallmark of Harvest Market.
Champaign, IL’s Harvest Market is styled like Whole Foods for the Heartland—complete with a John Deere tractor stationed outside. (Photo courtesy of the author.)
Unlike most large-format grocery stores, Harvest Market buys some produce directly from farmers (like these sweet candy onions from Warrensburg, IL, about 50 miles away). (Photo courtesy of the author.)
Interior of Harvest Market from the upper mezzanine, where shoppers gather for lunch and board games during the day and glasses of wine at night. (Photo courtesy of the author.)
By the cheese section, a glassed-in contraption works night and day: a butter churner, which transforms local sweet cream into yellow, briskly selling bricks of fat. (Photo courtesy of the author.)
Harvest Market executives Gerry Kettler, left, and Rich Niemann chat with a salsa vendor visiting to do demos. (Photo courtesy of the author.)
That shortened chain of custody is only possible because of Niemann and the lifetime of supply-side know-how he brings to the table. But finding ways to offer better, more affordable food has been a long-term goal of Kelley’s — one that strained his relationship with Whole Foods CEO John Mackey. As obsessed as Kelley is with appearances, he insists to me that his work must be grounded in something “real”: that grocery stores only succeed when they really try to make the world a better place through food. In his view, Whole Foods wasn’t doing enough to address its notoriously high prices — opening itself up to be undercut by cheaper competition, and missing a kind of ethical opportunity to make better food available to more people.
“When,” Kelley remembers asking, “did you start to mistake opulence for success?”
In Kelley’s telling, demand slackened so much during the Great Recession that it nearly led to Whole Foods’ downfall, a financial setback that the company never fully recovered from — and, one could argue, ultimately led to its acquisition. Harvest Market, for its part, has none of Whole Foods’ clean-label sanctimony. It takes an “all-of-the-above” approach: There’s local produce, but there are also Oreos and Doritos and Coca-Cola; at Thanksgiving, you can buy a pastured turkey from Triple S Farms or a 20-pound Butterball. But that strong emphasis on making local food more accessible and affordable makes it an interesting counterpart to Kelley’s former client.
The most Willy Wonka–esque touch is the hulking piece of dairy processing equipment in a glass room by the cheese case. It’s a commercial-scale butter churner — the first one ever, Kettler told me, to grace the inside of a grocery store.
“So this was a Shook Kelley idea,” he said. “We said yes, without knowing how much it would cost. And the costs just kept accelerating. But we’re thrilled. People love it.” Harvest Market isn’t just a grocery store — it’s also a federally inspected dairy plant. The store buys sweet cream from a local dairy, which it churns into house-made butter, available for purchase by the brick and used throughout Harvest’s bakery and restaurant. The butter sells out as fast as they can make it. Unlike the grocers who objected to “Mom’s Kitchen,” the staff don’t seem to mind.
As I walked through the store, I couldn’t help wondering how impressed I really was. I found Harvest to be a beautiful example of a grocery store, no doubt, and a very unusual one. What was it that made me want to encounter something more outrageous, more radical, more theatrical and bizarre? I wanted animatronic puppets. I wanted fog machines.
I should have known better — Kelley had warned me that you can’t take the theater of retail too far without breaking the dream. He’d told me that he admires stores where “you’re just not even aware of the wonder of the scene, you’re just totally engrossed in it” — stores a universe away from the overwrought, hokey feel of Disneyland. But I had Amazon’s new stores in the back of my mind as a counterpoint, with all their cashierless bells and whistles, their ability to click and collect, their ability to test-drive Alexa and play a song or switch on a fan. I guess, deep down, I was wondering if something this subtle really could work.
“Here, this is Rich Niemann,” Kettler said, and I found myself face-to-face with Niemann himself. We shook hands and he asked if I’d ever been to Illinois before. Many times, I told him. My wife is from Chicago, so we’ve visited the city often.
He grinned at me.
“That’s not Illinois,” he said.
We walked to Harvest’s restaurant, a 40-person seating area plus an adjacent bar with a row of stools, which offers standards like burgers, salads, and flatbreads. There’s an additional 80-person seating area on the second-floor mezzanine, a simulated living room complete with couches and board games. Beyond that, they pointed out the brand-new wine bar — open, like the rest of the space, until midnight. There’s a cooking classroom by the corporate offices. Through the window, I saw a classroom full of children doing something to vegetables. Adult cooking classes run two or three nights every week, plus special events for schools and other groups.
For a summer weekday at noon in a grocery store, I’m amazed at how many people are eating and working on laptops. One guy has his machine hooked up to a full-sized monitor he lugged up the stairs — he’s made a customized wooden piece that hooks into Harvest’s wrought-iron support beams to create a platform for his plus-size screen. He comes every day, like it’s his office. He’s a dwell-time dream.
We sit down, and Kettler insists I eat the corn first, slathering it with the house-made butter and eating it while it’s hot. He reminds me that it’s grown by the Maddoxes, a family in Warrensburg, about 50 miles west of Champaign.
The corn is good, but I want to ask Niemann if the grocery industry is really that bad. He tells me it is. I assume he’ll want to talk about Amazon and its acquisition of Whole Foods and the way e-commerce has changed the game. He acknowledges that, but to my surprise he says the biggest factor is something else entirely — a massive shift happening in the world of consumer packaged goods, or CPGs.
For years, grocery stores never had to advertise, because the largest companies in the world — Procter & Gamble, Coca-Cola, Nestlé — did their advertising for them, just the way Nabisco helped finance “Mom’s Kitchen” to benefit the stores. People came to supermarkets to buy the foods they saw on TV. But Americans are falling out of love with legacy brands. They’re looking for something different: locality, a sense of novelty and adventure. Kellogg’s and General Mills don’t have the pull they once had.
When their sales flag, grocery sales do too — and the once-bulletproof alliance between food brands and supermarkets is splitting. Over the past two years, the Grocery Manufacturers Association, an influential trade group representing the biggest food companies in the world, has been losing members. It began with Campbell’s Soup. Dean Foods, Mars, Tyson Foods, Unilever, the Hershey Company, the Kraft Heinz Company, and others followed. That profound betrayal was a rude awakening: CPG companies don’t need grocery stores. They have Amazon. They can sell directly through their websites. They can launch their own pop-ups.
It’s only then that I realize how dire the predicament of grocery stores really is, and why Niemann was so frustrated when he first called Kevin Kelley. It’s one thing when you can’t sell as cheaply and conveniently as your competitors. But it’s another thing when no one wants what you’re selling.
Harvest doesn’t feel obviously futuristic in the way an Amazon store might. If I went there as a regular shopper and not as a journalist sniffing around for a story, I’m sure I’d find it to be a lovely and transporting way to buy food. But what’s going on behind the scenes is, frankly, unheard of.
Grocery stores have two ironclad rules. First, grocers set the prices, and farmers do what they can within those mandates. And second, everyone works with distributors, who oversee the aggregation and transport of all goods. Harvest has traditional relationships with companies like Coca-Cola, but it breaks both rules with local farmers and foodmakers. Suppliers — from the locally milled wheat to the local produce to the Kilgus Farms sweet cream that goes into the churner — truck their products right to the back, and they set their own prices: Niemann’s suppliers tell him what they need to charge; Niemann adds a standard margin and lets customers decide if they’re willing to pay. By avoiding middlemen and their surcharges, Harvest is able to pay suppliers more and still charge customers less. You can still find $4.29 pints of Halo Top ice cream in the freezer, but the produce section features stunning bargains. When the Maddox family pulls up with its latest shipment of corn, people sometimes start buying it off the back of the truck in the parking lot. That’s a massive change, virtually unheard of in supermarkets.
If there’s a reason Harvest matters, it’s only partly because of the aesthetics. It’s mainly because the model of what a grocery store is has been tossed out and rebuilt. And why not? The world as Rich Niemann knows it is ending.
* * *
In 2017, just months after Harvest Market’s opening, Niemann won the Thomas K. Zaucha Entrepreneurial Excellence Award — the National Grocers Association’s top honor, given for “persistence, vision, and creative entrepreneurship.” That spring, Harvest was spotlighted in a “Store of the Month” cover feature in the influential trade magazine Progressive Grocer. Characteristically, the contributions of Kelley and his firm were not mentioned in the piece.
Niemann tells me his company is currently planning to open a second Harvest Market in Springfield, Illinois, about 90 minutes west of Champaign, in 2020. Without sharing specifics about profitability or sales numbers, he says the store was everything he’d hoped it would be by the metrics that matter most — year-over-year sales growth and customer engagement. His only complaint about the store has to do with parking. For years, Niemann has relied on the same golden ratio to determine the size of parking lot needed for his stores — a certain number of spots for every thousand dollars of expected sales. Harvest’s lot uses the same logic, and it’s nowhere near enough space.
“In any grocery store, the customer’s first objective is pantry fill — to take care of my needs as best I can on my budget,” Niemann says. “But we created a different atmosphere. These customers want to talk. They want to know. They want to experience. They want to taste. They’re there because it’s an adventure.”
They stay so much longer than expected that the parking lot sometimes struggles to fit all their cars at once. Unlike the Amazon stores that may soon be cropping up in a neighborhood near you — reportedly, the company is considering plans to open 3,000 of them by 2021 — it’s not about getting in and out quickly without interacting with another human being. At Harvest, you stay awhile. And that’s the point.
So far, Harvest’s success hasn’t made it any easier for Kelley, who still struggles to persuade clients to make fundamental changes. They’re still as scared as they’ve always been, clinging to the same old ideas. He tells them that, above all else, they need to develop a food philosophy — a reason why they do this in the first place, something that goes beyond mere nostalgia or the need to make money. They need to build something that means something, a store people return to not just to complete a task but because it somehow sustains them. For some, that’s too tall an order. “They go, ‘I’m not going to do that.’ I’m like, ‘Then what are you going to do?’ And they literally tell me: ‘I’m going to retire.’” It’s easier to cash out. Pass the buck, and consign the fate of the world to younger people with bolder dreams.
Does it even matter? The world existed before supermarkets, and it won’t end if they vanish. And in the ongoing story of American food, the 20th-century grocery store is no great hero. A&P — the once titanic chain, now itself defunct — was a great mechanizer, undercutting the countless smaller, local businesses that used to populate the landscape. More generally, the supermarket made it easier for Americans to distance ourselves from what we eat, shrouding food production behind a veil and letting us convince ourselves that price and convenience matter above all else. We let ourselves be satisfied with the appearance of abundance — even if great stacks of unblemished fruit contribute to waste and spoilage, even if the array of brightly colored packages are all owned by the same handful of multinational corporations.
But whatever springs up to replace grocery stores will have consequences, too, and the truth is that brick-and-mortar is not going away any time soon — far from it. Instead, the most powerful retailers in the world have realized that physical spaces have advantages they want to capitalize on. It’s not just that stores in residential neighborhoods work well as distribution depots, ones that help facilitate the home delivery of packages. And it’s not just that we can’t always be home to pick up the shipments we ordered when they arrive, so stores remain useful. The world’s biggest brands are now beginning to realize what Kelley has long argued: Physical stores are a way to capture attention, to subject customers to an experience, to influence the way they feel and think. What could be more useful? And what are Amazon’s proposed cashierless stores, but an illustration of Kelley’s argument? They take a brand thesis, a set of core values — that shopping should be quick and easy and highly mechanized — and seduce us with it, letting us feel the sweep and power of that vision as we pass with our goods through the doors without paying, flushed with the thrill a thief feels.
This is where new troubles start. Only a few companies in the world will be able to compete at Amazon’s scale — the scale where building 3,000 futuristic convenience stores in three years may be a realistic proposition. Unlike in the golden age of grocery, when different family-owned chains catered to different demographics, we’ll have only a handful of players. We’ll have companies that own the whole value chain, low to high. Amazon owns the e-commerce site where you can find almost anything in the world for the cheapest price. And for when you want to feel the heft of an heirloom tomato in your hand or sample some manchego before buying, there is Whole Foods. Online retail for thrift, in-person shopping for pleasure. Except one massive company now owns them both.
If this new landscape comes to dominate, we may find there are things we miss about the past. For all its problems, the grocery industry is at least decentralized, owned by no one dominant company and carved up into more players than you could ever count. It’s run by people who often live alongside the communities they serve and share their concerns. We might miss that competition, that community. These stores are small. They are nimble. They are independently, sometimes even cooperatively, owned. They employ people. And if they are scrappy, and ingenious, and willing to change, there’s no telling what they might do. It is not impossible that they could use their assets — financial resources, industry connections, prime real estate — to find new ways to supply what we all want most: to be happier, to be healthier, to feel more connected. To be better people. To do the right thing.
I want to believe that, anyway. That stores — at least in theory — could be about something bigger, and better than mere commerce. The way Harvest seems to want to be, with some success. But I wonder if that’s just a fantasy, too: the dream that we can buy and sell our way to a better world, that it will take no more than that.
Which one is right?
I guess it depends on how you feel about the movies.
Maybe a film is just a diversion, a way to feel briefly better about our lives, the limitations and disappointments that define us, the things we cannot change. Most of us leave the theater, after all, and just go on being ourselves.
Still, maybe something else is possible. Maybe in the moment when the music swells, and our hearts beat faster, and we feel overcome by the beauty of an image — in the instant that we feel newly brave and noble, and ready to be different, braver versions of ourselves — that we are who we really are.
* * *
Joe Fassler, The Counter’s deputy editor, has covered the intersection of food, policy, technology, and culture for the magazine since 2015. His food reporting has twice been a finalist for the James Beard Foundation Award in Journalism. He’s also editor of Light the Dark: Writers on Creativity, Inspiration, and the Creative Process (Penguin, 2017), a book based on “By Heart,” his ongoing series of literary conversations for The Atlantic.
Editor: Michelle Weber | Fact checker: Matt Giles | Copy editor: Jacob Z. Gross
Soraya Roberts | Longreads | April 2019 | 10 minutes (2,422 words)
INT. COFFEE SHOP – DAY
SORAYA sits down at her laptop with a cookie or some cake or that weirdly oversize banana bread. As she starts working on a column like this one, the woman next to her, working on a spreadsheet, glances at Soraya’s desktop and turns to her.
WOMAN: What do you do?
SORAYA: I’m a columnist.
WOMAN: Holy shit, that’s cool.
I starred in this scene two weeks ago, and again just this past week at a party. The women don’t have to tell me why they think it’s cool; I know why: Carrie Bradshaw. An apartment in New York, a photo on the side of a bus, Louboutins, tutus, and a column at the top of each week. Which is why I qualify it every time: “I don’t make as much as Carrie Bradshaw.” Yes, the job is cool, and it is holy-shit-worthy because so few journalists are able to actually work as journalists. But I’m freelance: I can cover my rent but can’t buy a house, I don’t get benefits, and I might be out of a job next week. Not to mention that I might not be so lucky next time. The women usually turn back to their admin after that — admin looks a lot cooler than journalism these days. But only if you’re not going by Sex and the City or basically every other journalism movie or series that has come after, all of which romanticize an industry that has a knack for playing into the romance.
“This is the end of an era, everything’s changing,” Gina Rodriguez tells her friends in the trailer for Someone Great, a new Netflix rom-com in which she, a music journalist, gets a job. At a magazine. In San Francisco. This is not a sci-fi movie in which the character has time traveled back to, I don’t know, 1975. It is only one recent example of the obfuscation of what journalism actually means now. There’s also the Hulu series Shrill, which presents itself as if it were current-day but is based on the life of Lindy West, who had a staff job at the Seattle alt-weekly The Stranger when you could still have a staff job and make a name for yourself with first-person essays, i.e., 2009. Special (another Netflix show) also harkens back to that time, and though it’s more overt about how exploitative online media can be — the hero is an intern with cerebral palsy who writes about his disability (which he claims is from a car accident) for clicks — the star is still hired straight out of an internship. (What’s an internship?)
Hollywood romanticizes everything, you say? Perhaps, but this is a case where the media itself seems to be actively engaging in a certain kind of deception about how bad its own situation actually is. In February, The Washington Post, which is no doubt still benefiting from the press generated by the still-gold-standard journalism movie — 1976’s All the President’s Men — ran a Super Bowl ad narrated by Tom Hanks, which applauds late journalists Marie Colvin and Jamal Khashoggi, who, in its words, brought the story, “no matter the cost.” The spot highlighted what we already know, which is that we need journalism to have a functioning democracy and that many journalists risk their lives to guarantee it. What it kept in darkness (ha), however, was that to do their job properly, those journalists need protection and they need resources — provided by their editors and by their publishers. Hanks, of course, starred in The Post, Steven Spielberg’s 2017 film based on the journalists who reported on the Pentagon Papers in 1971. The ad was using the past to promote the future, rather than dealing with a present in which more than 2,400 people lost media jobs in the first three months of the year and journalists are trying to unionize en masse. But that’s not particularly telegenic, is it?
* * *
The romanticized idea of the journalist — dogged, trenchcoated — really took off at the movies. In 1928, ex-reporters Ben Hecht and Charles MacArthur wrote The Front Page, a play adapted in 1931 into a screwball film that became the journalism movie prototype, with fast dialogue and faster morals. My favorite part is that not only is the star reporter trying to quit the paper (in this economy?), but his editor will do anything — including harboring an accused murderer — to keep him on staff. Matt Ehrlich, coauthor of Heroes and Scoundrels: The Image of the Journalist in Popular Culture, once told me for Maclean’s that The Front Page came out of the “love-hate relationship” the writers had with the industry even back then. “The reporters are absolute sleazebags, they do horrible things,” he said. “At the same time The Front Page makes journalism seem very exciting, and they do get the big scoop.” Ehrlich also told me that some initially thought All the President’s Men, which eventually displaced it as the genre’s prototype, was reminiscent of that earlier era. In case you are not a journalist and so haven’t seen it, Robert Redford and Dustin Hoffman starred as Bob Woodward and Carl Bernstein, The Washington Post reporters whose stories on the Watergate burglary and subsequent cover-up helped lead to President Nixon’s resignation. While the film also played fast and loose with the truth, it had a veneer of rumpled, repetitious reality — not to mention a strong moral core that made taking down the president with a typewriter seem, if implausible, at least not impossible.
In February, Education Week reported on a survey of 500 high school journalism teachers across 45 states: in the past two years, 44 percent of them had seen a rise in journalism enrollment, and there had been a 30 percent increase in interest in studying journalism in higher education. “This is this generation’s Watergate,” the executive director of the National Scholastic Press Association said. “With President Trump, everyone is really in tune to the importance of a free press.” Sure. But this isn’t 1976. No doubt there are scores of WoodSteins out there, but not only do many journalists no longer have the resources or the time to follow stories of any kind, they rarely have the salaried staff positions to finance them, nor the editors and publishers to support them doing the job they were hired to do. In All the President’s Men, executive editor Ben Bradlee asks WoodStein if they trust their source, before muttering, “I can’t do the reporting for my reporters, which means I have to trust them. And I hate trusting anybody.” Then he tells them to “Run that baby.” These days there is little trust in anything beyond the bottom line.
The myth is that All the President’s Men led to a surge of interest in journalism as a career. But in reality it was women, increasingly educated post-liberation, whose interest explained the surge. (My editor is asking: “Is it an accident that shitting on journalism as a worthy profession coincided with women moving into journalism?” My reply is: “I think not.”) Still, women remain underrepresented in the field to this day, a fact reflected by the paucity of movies about the work of female journalists. While there were scores of ’70s and ’80s thrillers built around male reporters with too much hair taking down the man, for the women … there was The China Syndrome, with Jane Fonda as a television reporter named Kimberly covering a nuclear power plant conspiracy. And, um, Absence of Malice? Sally Field is a newspaper reporter who sleeps with her subject (I mean, it is Paul Newman). I guess I could include Broadcast News, which stars Holly Hunter as a neurotic-but-formidable producer and personified the pull between delivering the news and delivering ratings (the analog version of clicks). But Network did that first and more memorably, with its suicidal anchorman lamenting the demise of media that matters. “I’m a human being, GODDAMN IT!!!” he shouts into the void. “My life has value!!!” You don’t hear female journalists saying that on-screen, though you do hear them saying “I do” a whole lot.
The quintessential journalism film and the quintessential rom-com are in fact connected. Nora Ephron, who was briefly married to Carl Bernstein, actually cowrote an early script for All the President’s Men. While it was chucked in favor of William Goldman’s, she went on to write When Harry Met Sally, and I’ll forgive you for not remembering that Sally was a journalist. She probably only mentions it twice because this was 1989, an era in which you decided to be a journalist and then you became one — the end. The movie treats reporting like it’s so stable it’s not even worth mentioning, like being a bureaucrat. Sally could afford a nice apartment, and she had plenty of time to hang out with Harry, so what was there to gripe about? (Good Girls Revolt would suggest Ephron’s trajectory was less smooth, but that’s another story.) Four years later, in Sleepless in Seattle, Meg Ryan is another journalist in another Ephron movie, equally comfortable, so comfortable in fact that her editor pays her to fly across the country to stalk Tom Hanks. This newspaper editor literally assigns a reporter to take a plane from Baltimore to Seattle to “look into” a possible lifestyle story about a single white guy. (Am I doing something wrong?!?!)
Journalism and rom-coms were fused from almost the start, around the ’30s and ’40s. The Front Page went from being a journalism movie to being a rom-com when it turned its hero into a heroine for His Girl Friday. The reporter repartee and the secretive nature of the job appeared to lend themselves well to Hays-era screwballs, though they also indelibly imprinted a lack of seriousness onto their on-screen female journalists. After a brief moment in the 1970s when The Mary Tyler Moore Show embodied the viability of a woman journalist who puts work first, the post-Ephron rom-coms of the 2000s were basically glossy romances in “offices” that were really showrooms for a pink-frosted fantasy girl-reporter gig no doubt thought up by male executives who almost certainly saw All the President’s Men and almost certainly decided a woman couldn’t do that and who cares anyway because the real story is how you’re going to get Matthew McConaughey to pop the question. I can’t with the number of women who recently announced that 13 Going on 30 — the movie in which Jennifer Garner plays a literal child successfully running a fashion magazine — made them want to be journalists. But the real death knell of the aughts journo-rom-com, according to rom-com columnist Caroline Siede, came in 2003 with How to Lose a Guy in 10 Days. In that caper, Kate Hudson has a job as a columnist despite thinking it is completely rational to write a piece called “How to Bring Peace to Tajikistan” for her Cosmo-type fashion magazine.
* * *
In 2016, the Oscar for Best Picture went to Spotlight, which follows The Boston Globe’s titular investigative team — three men, one woman — as it uncovers the Catholic Church abuse scandal. The film earned comparisons to All the President’s Men for its focus on journalistic drudgery, but it also illustrated the growing precariousness of the newsroom with the arrival of the web. In one scene, executive editor Marty Baron expresses shock when he is told it takes a couple of months for the team to settle on a story and then a year or more to investigate it. At the same time, Baron and two other editors are heavily involved and supportive of the three reporters, who went on to win the Pulitzer in 2003 and remained on the team for years after. Released only 12 years after the fact, the film suggested that journalists who win Pulitzers have some kind of security, which, you know, makes sense, and is maybe true at The Boston Globe. But two years after Spotlight came out, David Wood, who had won HuffPost its only Pulitzer, was laid off. As one of BuzzFeed’s reporters told The Columbia Journalism Review after BuzzFeed shed 15 percent of its staff, “It’s this sense that your job security isn’t tied to the quality of your work.”
“We have so much to learn from these early media companies and in many ways it feels like we’re at the start of another formative era of media history where iconic companies will emerge and thrive for many decades,” BuzzFeed founder and CEO Jonah Peretti blew hard in a memo in 2014, referring to traditional outfits like Time and The New York Times. But both those publications have unions, which Peretti has been clear he doesn’t think “is right” for his company. “A lot of the best new-economy companies are environments where there’s an alliance between managers and employees,” he said in 2015. “People have shared goals.” In this case the shared goals seem to be that Peretti profits (his company was valued at more than $1 billion in 2016) while his staff is disposable.
Which brings us back to the Globe in 2019. That is to say the real one, not the romanticized one. This version of the Globe hires a Gonzo-esque leftist political writer named Luke O’Neil as a freelancer and publishes his “controversial” op-ed about the Secretary of Homeland Security’s resignation titled “Keep Kirstjen Nielsen unemployed and eating Grubhub over her kitchen sink.” “One of the biggest regrets of my life is not pissing in Bill Kristol’s salmon,” it opened, and it concluded with, “As for the waiters out there, I’m not saying you should tamper with anyone’s food, as that could get you into trouble. You might lose your serving job. But you’d be serving America. And you won’t have any regrets years later.” The article was gone by Friday, pulled at the request of the paper’s owners (O’Neil sent me the original). According to WGBH, a now-deleted note on the opinion page stated that the article “did not receive sufficient editorial oversight and did not meet Globe standards. The Globe regrets its lack of vigilance on the matter. O’Neil is not on staff.” And, oh, man, that last line. It says everything about modern journalism that goes unspoken, not only on-screen but in the culture at large and in the media itself. It says: you serve us, but we provide no security, no benefits, no loyalty. It says, unlike Spotlight or All the President’s Men or even The Front Page, we do not have your back. Because if they did, you better believe it would have a good chance of ending up on-screen.
The 11,000 people who attend the Association of Writers & Writing Programs’ annual Conference & Bookfair (AWP) come for professional advancement and to build community. They come to attend panels, to stay motivated after graduate school, to promote their magazines, book presses, and graduate programs, and to choose magazines to write for, books to read, and graduate programs to attend. For many attendees, AWP is a chance to talk shop deep into the night. I came this year for many of these reasons, and also to improve my editing abilities.
Even though I work as an editor, I have a lot to learn, and the editors on the panel “Editor-Author Relationships: How Should They Be?” offered tons of practical wisdom. Jennifer Acker from The Common magazine moderated a group that included John Freeman of Grove/Atlantic, Freeman’s, and Granta, One Story editor Patrick Ryan, and Catapult managing editor Matthew Ortile. Freeman is a quote machine; his mind moved so quickly I could barely write down what he said. Read more…
Getty, Alberto E. Tamargo / AP, Photo illustration by Katie Kosma
Jen Doll | Longreads | April 2019 | 18 minutes (4,598 words)
According to those jaded but constant belief systems that keep the worst romantic comedies in business, the third date is the make-or-break one. In these busy times, the idea goes, by date three you’ve spent enough time together to determine if either of you is a serial killer, or hiding something very bad in your closet (metaphorical or otherwise), or has the tendency to type “hehehe” when laughing by text. And if the relationship by date three veers toward make rather than break, well, finally the “rules” have lifted: It is THE MOMENT to get naked (not at the restaurant, please). The thinking is based in some combination of propriety and sexual policing and also sheer time management: You haven’t put so much energy or effort into this budding romance that uncovering an in-the-sheets incompatibility ruins your entire life — but it’s also not so soon it’s considered “rushing in,” which, when applied to women, of course, means “being too slutty.”
No matter that “slutty” is an outmoded, sexist concept and that you should sleep with a person if and when you feel like it (and if and when they consent), I grew up with “the third date’s the sex date!” pressed upon me as, if not law, then at least a kind of informed ideology: Do it then to uncover any latent micropenises or irrecoverable technique problems; do it then to get it over with because would you look at that elephant in the room?; do it then to get the rest of your relationship started; do it then because by the third date, what else is there to do?
So, when it came time for the third date with a man I’d been seeing — a guy who lived in upstate New York, which meant our third date would be more of a weekend visit; did each night count as a date, I wondered, or was it the whole package, a kind of Club Med situation with dinners and entertainment included? — there was a certain amount of buried internal stress and anticipation related to the event. Not that I was going to go get a Brazilian, or anything. I was in my 40s. Those days of paying a stranger to rip large swathes of hair from my nether regions had blessedly gone by the by. (Yes, I said “nether regions.”) But in my brain, a place far more difficult for strangers to reach, my thoughts were going a little bit wild. I’d been dumped earlier in the year, I’d gotten back up and shaken myself off, I’d tried again, and I’d actually met someone. But how many rounds of the dating game was I prepared to endure? If things went in the direction of “break” — what next, not only for me and this guy, but maybe for me and anyone? This is what rom-coms never really tackle: What happens when you get so tired of dating, so disappointed by all the prospects, you just give up?
In the absence of answers, I sought to occupy myself. I took a train to Beacon, New York, a town about an hour away from where my date lived — he’d pick me up there the next day, and our third date would begin — and met some friends I was just getting to know. We watched a poet read from her impressive collection in a garden, surrounded by trees and flowers and sunshine. I wasn’t even so sure how I felt about poetry readings, but I liked this version of me, trying new things, with different people. I bought several of the poet’s books, and had her sign one, even though I’d not known much of her work until that moment. Read more…
John Stillwell / PA Wire / Press Association via AP Images
Alana Mohamed | Longreads | April 2019 | 10 minutes (2,756 words)
Tracy Chevalier’s 1999 novel, Girl With a Pearl Earring, was a surprise best-seller. “Who was going to read a book about a Dutch painter?” Chevalier remembers wondering. But her fictional, highly compelling heroine, Griet, made for a popular window into Vermeer’s world. As the maid sent to work for Vermeer’s family in 17th-century Delft, Griet elucidates many of the divisions of the time — between rich and poor, man and woman, Catholic and Protestant. Chevalier said she was compelled to write the novel after wondering “what Vermeer did to her [the model] to make her look like that … I saw it as a portrait of a relationship rather than a portrait of a girl.” Readers praised Chevalier’s research, which she carried out in Amsterdam and The Hague while pregnant. “Chevalier’s writing skill and her knowledge of seventeenth-century Delft are such that she creates a world reminiscent of a Vermeer interior,” a brief New Yorker review reads. The New York Times and Christian Science Monitor were both similarly impressed with Chevalier’s world-building.
Some readers were, however, resistant to the idea that Griet, who in the novel possesses a keen artistic eye, would become an integral part of Vermeer’s work. In its review, Publishers Weekly claimed these details demand an extra “stretch of the reader’s imagination” and “threaten to rob the novel of its credibility.” In 2017, Wolf Hall author Hilary Mantel ruffled feathers in the historical literature community when she criticized the proclivity of modern writers to empower their historical subjects in such a way. She asked, “If we write about the victims of history, are we reinforcing their status by detailing it? Or shall we rework history so victims are the winners?” The question is reductive and misleading, but it does point to the impossibility of writing about women forgotten by history as just themselves. Like Griet, they become conduits by which we dissect their cultures.
Today, uncovering women’s lives has become a mainstream project. The Paris Review has started a “Feminize Your Canon” series dedicated to underappreciated women writers. The New York Times’ “Overlooked” series is a retroactive edit of its obituary section, long dominated by white men. Both projects seek to increase the visibility of women who have long been rendered invisible by historical ambivalence. However, these are women who accomplished the extraordinary, women who may have been waylaid from greatness. As the Telegraph also notes of Chevalier, “Research failed to make good the gaps Chevalier’s imagination was already painting in like a picture restorer.” Read more…
I tap lightly on the computer on my lap, trying to go unnoticed. I’m on the couch in the living room, and my only child Luis Manuel, who is 17, is playing the piano in the dining room. I can see him from where I’m seated, his head down, engrossed in a solo, playing licks I’ve heard him play before and some that sound new. I try not to stare, to stay focused on my work, because I know he’ll see me from the corner of his eye, and I’ll have broken the spell.
I hate when he asks me to leave — “Can’t you go upstairs?”
He used to cry whenever I was out of sight, wouldn’t let anyone but his dad or me hold him, and cried incessantly when babysat. He did this until he was 4. When I’d take him to the park, he’d play for only a minute or two at a time before looking up to make sure I was still there. His difficult case of stranger anxiety made it so he wouldn’t walk on his own until he was 16 months, even though I knew he could. He held onto my index finger and walked confidently, but he wouldn’t let go. If I tried to get him to release my finger and walk unattached, he’d sit straight down on the floor. When I couldn’t stoop over to let him hold my finger any longer, he’d happily go back to being carried in a sling on my hip, one dimpled baby-hand resting on my chest.
Many suggested I was coddling him, that I was not letting my small-for-his-age, shy child, only a year and a half old, be independent.
I watch him play piano when I’m cooking, too. In the kitchen on the other side of the dining room, his back to me, it’s easier for him not to notice me there listening for a song I haven’t heard him play before, straining my eyes to make out the title at the top of the sheet music. Sometimes, I’ll pour a glass of wine and lean on the counter, and just listen while the food simmers on the stove. He is astoundingly good. It feels more like hanging in a jazz club than cooking dinner.
When he’s out at one of his many rehearsals or gigs, on nights when I’m preparing a meal and waiting for him to get home, I stand in the doorway between the kitchen and the dining room and look at the piano, dark red-brown in a high gloss with gold hinges, no piano light, no head full of black hair hanging over the keyboard, no music. I try not to think about the long stretches of time the piano will sit unplayed. As with death, I force the thought out of my head and put on a record instead, because sooner than his dad and I can handle, the time with our son, as we have known it, is coming to an end. If all goes as planned, in a handful of months, he’ll be gone, playing piano at some college for teachers who will help him improve his technique and teach him to compose, but nobody will ever appreciate the way he plays like we do, at all hours of the day and night. Read more…
"Orb of Ambivalence," Jenny Odell, digital print, 2017. "This print collects people from 1980s-era computer ads and catalog images. In the original image from which each person was taken, he or she was touching a computer, keyboard, or mouse."
“I almost got locked in here once,” Jenny Odell tells me as we step into a mausoleum. We’re at the Chapel of the Chimes, which sits at the base of Oakland’s sprawling Mountain View Cemetery. The chapel first opened in 1909 and was redesigned in 1928 by Julia Morgan (the architect of Hearst Castle) with Gothic flourishes that mirror the Alhambra in Spain — rooms are filled with glass bookshelves, marbled hallways spill out into courtyards, skylights abound, and once you’re inside it’s difficult to find your way out even if you, like Odell, come here on an almost weekly basis. The books that line the walls are not actually books; they are urns. It’s essentially a library of the dead — the acoustics are perfect, and there’s no sound inside save for our footsteps. The chapel used to keep cages of canaries scattered around, but people wouldn’t stop setting them free. Read more…
Interested in more by Jill Talbot and Marcia Aldrich? Read their collaborative essay, Trouble.
She was old when she had me, or so I thought. She had given birth to two daughters in her twenties during her first marriage. Then her husband died unexpectedly and the period of being a single mother began. Her hair began to turn gray and a red rash ran down the middle of her face, a rash of grief. Eventually she met my father, married, and the rash disappeared. Some years later I arrived when she was 40. Twelve years separated me from my sisters.
Now, when women wait longer to have children, aided by infertility treatments and surrogacy options, my mother wouldn’t seem old at all. She wouldn’t be an outlier. But when I was growing up, my mother looked so much older than all the other mothers. Sometimes I thought she was rushing toward aging, embracing it rather than pushing it away, as if it were the destination she was looking for. She wore her gray hair in a teased bouffant that was hard and outdated, concocted weekly at a hair salon with Julie. I wondered if she deliberately chose the style to ward off touching — touching by my father, touching by me. I don’t remember her ever touching me affectionately, as strange as that may sound. Or touching my father. She looked off-putting, someone who held herself as stiffly as the hard-shelled purse she carried on her arm. If a bee buzzed about her head, it might get caught in the hair-sprayed formation she called her hair. Other mothers were softer looking, and more welcoming. She never wore jeans and sneakers, never allowed her hair to blow onto her face — she never looked disheveled. She looked polished, as if she were heading off to a professional meeting she would be overseeing, and yet she held no job.
In 1993 some journalists began to be dimly aware of something clunkily referred to as “the information superhighway,” but few had ever had reason to see it in action. At the start of 1995 only 491 newspapers were online worldwide; by June 1997 that number had grown to some 3,600.
In the basement of the Guardian was a small team created by editor in chief Peter Preston — the Product Development Unit, or PDU. The inhabitants were young and enthusiastic. None of them were conventional journalists: I think the label might be “creatives.” Their job was to think of new things that would never occur to the largely middle-aged reporters and editors three floors up.
The team — eventually rebranding itself as the New Media Lab — started casting around for the next big thing. They decided it was the internet. The creatives had a PC actually capable of accessing the world wide web. They moved in hipper circles. And they started importing copies of a new magazine, Wired — the so-called Rolling Stone of technology — which had started publishing in San Francisco in 1993, along with the HotWired website. “Wired described the revolution,” it boasted. “HotWired was the revolution.” It was launched in the same month the Netscape team was beginning to assemble. Only 18 months later Netscape was worth billions of dollars. Things were moving that fast.
In time, the team in PDU made friends with three of the people associated with Wired: the founders, Louis Rossetto and Jane Metcalfe, and the columnist Nicholas Negroponte, who was based at the Massachusetts Institute of Technology and who wrote mindblowing columns predicting such preposterous things as wristwatches that would “migrate from a mere timepiece today to a mobile command-and-control center tomorrow . . . an all-in-one, wrist-mounted TV, computer, and telephone.”
As if.
Both Rossetto and Negroponte were, in their different ways, prophets. Rossetto was a hot booking for TV talk shows, where he would explain to baffled hosts what the information superhighway meant. He’d tell them how smart the internet was, and how ethical. Sure, it was a “dissonance amplifier.” But it was also a “driver of the discussion” towards the real. You couldn’t mask the truth in this new world, because someone out there would weigh in with equal force. Mass media was one-way communication. The guy with the antenna could broadcast to billions, with no feedback loop. He could dominate. But on the internet every voice was going to be equal to every other voice.
“Everything you know is wrong,” he liked to say. “If you have a preconceived idea of how the world works, you’d better reconsider it.”
Negroponte, 50-something, East Coast gravitas to Rossetto’s Californian drawl, was working on a book, Being Digital, and was equally passionate in his evangelism. His mantra was to explain the difference between atoms — which make up the physical artifacts of the past — and bits, which travel at the speed of light and would be the future. “We are so unprepared for the world of bits . . . We’re going to be forced to think differently about everything.”
I bought the drinks and listened.
Over dinner in a North London restaurant, Negroponte started with convergence — the melting of all boundaries between TV, newspapers, magazines, and the internet into a single media experience — and moved on to the death of copyright, possibly of the nation state itself. There would be virtual reality, speech recognition, personal computers with inbuilt cameras, personalized news. The entire economic model of information was about to fall apart. The audience would pull content toward themselves rather than wait for old media to push it at them, as at present. Information and entertainment would be on demand. Overly hierarchical and status-conscious societies would rapidly erode. Time as we knew it would become meaningless — five hours of music would be delivered to you in less than five seconds. Distance would become irrelevant. A UK paper would be as accessible in New York as it was in London.
Writing 15 years later in the Observer, the critic John Naughton compared the invention of the world wide web by Sir Tim Berners-Lee with the seismic disruption caused five centuries earlier by the invention of movable type. Just as Gutenberg had no conception of his invention’s eventual influence on religion, science, systems of ideas, and democracy, so — in 2008 — “it will be decades before we have any real understanding of what Berners-Lee hath wrought.”
The entire economic model of information was about to fall apart.
And so I decided to go to America with the leader of the PDU team, Tony Ageh, and see the internet for myself. A 33-year-old “creative,” Ageh had had exactly one year’s experience in media — as an advertising copy chaser for The Home Organist magazine — before joining the Guardian. I took with me a copy of The Internet for Dummies. Thus armed, we set off to America for a four-day, four-city tour.
In Atlanta, we found the Atlanta Journal-Constitution (AJC), which was considered a thought leader in internet matters, having joined the Prodigy online service, which offered subscribers information over dial-up 1,200 bit/second modems. After four months the service had 14,000 members, paying 10 cents a minute to access online banking, messaging, full webpage hosting, and live share prices.
The AJC business plan envisaged building to 35,000 or 40,000 members by year three. By that time, they calculated, they would be earning $3.3 million in subscription fees and $250,000 a year in advertising. “If it all goes to plan,” David Scott, the publisher of the Electronic Information Service, told us, “it’ll be making good money. If it goes any faster, this is a real business.”
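(A back-of-envelope check, mine rather than the AJC’s: $3.3 million a year from 40,000 members works out to about $82.50 per member per year, which at 10 cents a minute implies roughly 825 minutes of use, or about 15 minutes a week per member. Modest enough, on paper, to look achievable.)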
We also met Michael Gordon, the managing editor. “The appeal to the management is, crudely, that it is so much cheaper than publishing a newspaper,” he said.
We wrote it down.
“We know there are around 100,000 people in Atlanta with PCs. There are, we think, about one million people wealthy enough to own them. Guys see them as a toy; women see them as a tool. The goldmine is going to be the content, which is why newspapers are so strongly placed to take advantage of this revolution. We’re out to maximize our revenue by selling our content any way we can. If we can sell it on CD-ROM or TV as well, so much the better.”
“Papers? People will go on wanting to read them, though it’s obviously much better for us if we can persuade them to print them in their own homes. They might come in customized editions. Edition 14B might be for females living with a certain income.”
It was heady stuff.
From Atlanta we hopped up to New York to see the Times’s online service, @Times. We found an operation consisting of an editor plus three staffers and four freelancers. The team had two PCs, costing around $4,000 each. The operation was confident, but small.
The @Times content was weighted heavily towards arts and leisure. The opening menus offered a panel with about 15 reviews of the latest films, theatre, music, and books – plus book reviews going back two years. The site offered the top 15 stories of the day, plus some sports news and business.
There was a discussion forum about movies, with 47 different subjects being debated by 235 individual subscribers. There was no archive because — in one of the most notorious newspaper licensing cock-ups in history — the NYT in 1983 had given away all rights to its electronic archive (for all material more than 24 hours old) in perpetuity to Mead/Lexis.
That deal alone told you how nobody had any clue what was to come.
We sat down with Henry E. Scott, the group director of @Times. “Sound and moving pictures will be next. You can get them now. I thought about it the other day, when I wondered about seeing 30 seconds of The Age of Innocence. But then I realized it would take 90 minutes to download that and I could have seen more or less the whole movie in that time. That’s going to change.”
But Scott was doubtful about the lasting value of what they were doing — at least, in terms of news. “I can’t see this replacing the newspaper,” he said confidently. “People don’t read computers unless it pays them to, or there is some other pressing reason. I don’t think anyone reads a computer for pleasure. The San Jose Mercury [News] has put the whole newspaper online. We don’t think that’s very sensible. It doesn’t make sense to offer the entire newspaper electronically.”
We wrote it all down.
“I can’t see the point of news on-screen. If I want to know about a breaking story I turn on the TV or the radio. I think we should only do what we can do better than in print. If it’s inferior to the print version there’s no point in doing it.”
Was there a business plan? Not in Scott’s mind. “There’s no way you can make money out of it if you are using someone else’s server. I think the LA Times expects to start making money in about three years’ time. We’re treating it more as an R & D project.”
From New York we flitted over to Chicago to see what the Tribune was up to. In its 36-storey Art Deco building — a spectacular monument to institutional self-esteem — we found a team of four editorial and four marketing people working on a digital service, with the digital unit situated in the middle of the newsroom. The marketeers were beyond excited about the prospect of being able to show houses or cars for sale and arranged a demonstration. We were excited, too, even if the pictures were slow and cumbersome to download.
We met Joe Leonard, associate editor. “We’re not looking at Chicago Online as a money maker. We’ve no plans even to break even at this stage. My view is simply that I’m not yet sure where I’m going, but I’m on the boat, in the water — and I’m ahead of the guy who is still standing on the pier.”
Reach before revenue.
Finally we headed off to Boulder, Colorado, in the foothills of the Rockies, where Knight Ridder had a team working on their vision of the newspaper of tomorrow. The big idea was, essentially, what would become the iPad — only the team in Boulder hadn’t got much further than making an A4 block of wood with a “front page” stuck on it. The 50-something director of the research centre, Roger Fidler, thought the technology capable of realizing his dream of a ‘personal information appliance’ was a couple of years off.
Tony and I had filled several notebooks. We were by now beyond tired and talked little over a final meal in an Italian restaurant beneath the Rocky Mountains.
We had come. We had seen the internet. We were conquered.
* * *
Looking back from the safe distance of nearly 25 years, it’s easy to mock the fumbling, wildly wrong predictions about where this new beast was going to take the news industry. We had met navigators and pioneers. They could dimly glimpse where the future lay. Not one of them had any idea how to make a dime out of it, but at the same time they intuitively sensed that it would be more reckless not to experiment. It seemed reasonable to assume that — if they could be persuaded to take the internet seriously — their companies would dominate in this new world, as they had in the old world.
We were no different. After just four days it seemed blindingly obvious that the future of information would be mainly digital. Plain old words on paper — delivered expensively by essentially Victorian production and distribution methods — couldn’t, in the end, compete. The future would be more interactive, more image-driven, more immediate. That was clear. But how on earth could you graft a digital mindset and processes onto the stately ocean liner of print? How could you convince anyone that this should be a priority when no one had yet worked out how to make any money out of it? The change, and therefore the threat, was likely to happen rapidly and maybe violently. How quickly could we make a start? Or was this something that would be done to us?
In a note for Peter Preston on our return I wrote, “The internet is fascinating, intoxicating . . . it is also crowded out with bores, nutters, fanatics and middle managers from Minnesota who want the world to see their home page and CV. It’s a cacophony, a jungle. There’s too much information out there. We’re all overloaded. You want someone you trust to fillet it, edit it and make sense of it for you. That’s what we do. It’s an opportunity.”
Looking back from the safe distance of nearly 25 years, it’s easy to mock the fumbling, wildly wrong predictions about where this new beast was going to take the news industry.
I spent the next year trying to learn more, and then the calendar clicked on to 1995 — the year the future began, at least according to the cultural historian W. Joseph Campbell, who used the phrase as a book title twenty years later. It was the year Amazon.com, eBay, Craigslist, and Match.com established their presence online. Microsoft launched Windows 95 with weeks of marketing hype, spending $300 million on the campaign, including millions for the rights to the Rolling Stones hit “Start Me Up,” which became the launch anthem.
Cyberspace — as the cyber dystopian Evgeny Morozov recalled, looking back on that period — felt like space itself. “The idea of exploring cyberspace as virgin territory, not yet colonized by governments and corporations, was romantic; that romanticism was even reflected in the names of early browsers (‘Internet Explorer,’ ‘Netscape Navigator’).”
But, as Campbell was to reflect, “no industry in 1995 was as ill-prepared for the digital age, or more inclined to pooh-pooh the disruptive potential of the Internet and World Wide Web, than the news business.” It suffered from what he called “innovation blindness” — “an inability, or a disinclination to anticipate and understand the consequences of new media technology.”
1995 was, then, the year the future began. It happened also to be the year in which I became editor of the Guardian.
* * *
I was 41 and had not, until very recently, really imagined this turn of events. My journalism career took a traditional enough path. A few years reporting; four years writing a daily diary column; a stint as a feature writer — home and abroad. In 1986 I left the Guardian to be the Observer’s television critic. When I rejoined the Guardian I was diverted towards a route of editing — launching the paper’s Saturday magazine followed by a daily tabloid features section and moving to be deputy editor in 1993. Peter Preston — unshowy, grittily obstinate, brilliantly strategic — looked as if he would carry on editing for years to come. It was a complete surprise when he took me to the basement of the resolutely unfashionable Italian restaurant in Clerkenwell he favored, to tell me he had decided to call it a day.
On most papers the proprietor or chief executive would find an editor and take him or her out to lunch to do the deal. On the Guardian — at least according to tradition dating back to the mid-70s — the Scott Trust made the decision after balloting the staff, a process that involved manifestos, pub hustings, and even, by some candidates, a little frowned-on campaigning.
I supposed I should run for the job. My mission statement said I wanted to boost investigative reporting and get serious about digital. It was, I fear, a bit Utopian. I doubt much of it impressed the would-be electorate. British journalists are programmed to skepticism about idealistic statements concerning their trade. Nevertheless, I won the popular vote and was confirmed by the Scott Trust after an interview in which I failed to impress at least one Trustee with my sketchy knowledge of European politics. We all went off for a drink in the pub round the back of the office. A month later I was editing.
“Fleet Street,” as the UK press was collectively called, was having a torrid time, not least because the biggest beast in the jungle, Rupert Murdoch, had launched a prolonged price war that was playing havoc with the economics of publishing. His pockets were so deep he could afford to slash the price of The Times almost indefinitely — especially if it forced others out of business.
Reach before revenue — as it wasn’t known then.
The newest kid on the block, the Independent, was suffering the most. To their eyes, Murdoch was behaving in a predatory way. We calculated the Independent titles were losing around £42 million (nearly £80 million in today’s money). Murdoch’s Times, by contrast, had seen its sales rocket 80 per cent by cutting its cover prices to below what it cost to print and distribute. The circulation gains had come at a cost — about £38 million in lost sales revenue. But Murdoch’s TV business, BSkyB, was making booming profits and the Sun continued to throw off huge amounts of cash. He could be patient.
But how on earth could you graft a digital mindset and processes onto the stately ocean liner of print?
The Telegraph had been hit hard — losing £45 million in circulation revenues through cutting the cover price by 18 pence. The end of the price war left it slowly clawing back lost momentum, but it was still £23 million adrift of where it had been the previous year. Murdoch — as so often — had done something bold and aggressive. Good for him, not so good for the rest of us. Everyone was tightening their belts in different ways. The Independent effectively gave up on Scotland. The Guardian saved a million a year in newsprint costs by shaving half an inch off the width of the paper.
The Guardian, by not getting into the price war, had “saved” around £37 million it would otherwise have lost. But its circulation had been dented by about 10,000 readers a day. Moreover, the average age of the Guardian reader was 43 — something that preoccupied us rather a lot. We were in danger of having a readership too old for the job advertisements we carried.
Though the Guardian itself was profitable, the newspaper division was losing nearly £12 million (north of £21 million today). The losses were mainly due to the sister Sunday title, the Observer, which the Scott Trust had purchased as a defensive move against the Independent in 1993. The Sunday title had a distinguished history, but was hemorrhaging cash: £11 million losses.
Everything we had seen in America had to be put on hold for a while. The commercial side of the business never stopped reminding us that only three percent of households owned a PC and a modem.
* * *
But the digital germ was there. My love of gadgets had not extended to understanding how computers actually worked, so I commissioned a colleague to write a report telling me, in language I could understand, how our computers measured up against what the future would demand. The Atex system we had installed in 1987 gave everyone a dumb terminal on their desk — little more than a basic word processor. It couldn’t connect to the internet, though there was a rudimentary internal messaging system. There was no word count or spellchecker and storage space was limited. It could not be used with floppy disks or CD-ROMs. Within eight years of purchase it was already a dinosaur.
There was one internet connection in the newsroom, though most reporters were unaware of it. It was rumored that downstairs a bloke called Paul in IT had a Mac connected to the internet through a dial-up modem. Otherwise we were sealed off from the outside world.
Some journalist geeks began to invent Heath Robinson solutions to make the inadequate kit in Farringdon Road do the things we wanted, in order to get a technology website online. Tom Standage — he later became deputy editor of the Economist, but was then a freelance tech writer — wrote some scripts to take articles out of Atex and format them into HTML so they could be moved onto the modest Mac web server — our first content management system, if you like. If too many people wanted to read the tech site at once, the system crashed. So Standage and the site’s editor, Azeem Azhar, would take it in turns sitting in the server room in the basement of the building rebooting the machines by hand — unplugging them and physically moving the internet cables from one machine to another.
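Those scripts are lost to history, but the shape of the job is easy to imagine. Here is a minimal sketch in Python of that kind of converter (the language, the file layout, and the markup are all assumptions for illustration, not a record of what Standage actually wrote): take the first line of an exported article as the headline, wrap the remaining paragraphs in tags, and write the page into the web server’s folder.

```python
# Hypothetical sketch of an Atex-to-HTML converter, in the spirit of the
# scripts described above. The file layout, field conventions, and folder
# names are assumptions, not a record of the Guardian's actual pipeline.
from pathlib import Path
import html

def article_to_html(source: Path, out_dir: Path) -> Path:
    """Convert one exported article (plain text) into a minimal HTML page."""
    lines = source.read_text(encoding="latin-1").splitlines()
    headline = lines[0].strip() if lines else source.stem
    # Assume blank lines separate paragraphs in the exported copy.
    paragraphs, current = [], []
    for line in lines[1:]:
        if line.strip():
            current.append(line.strip())
        elif current:
            paragraphs.append(" ".join(current))
            current = []
    if current:
        paragraphs.append(" ".join(current))

    body = "\n".join(f"<p>{html.escape(p)}</p>" for p in paragraphs)
    page = (
        f"<html><head><title>{html.escape(headline)}</title></head><body>\n"
        f"<h1>{html.escape(headline)}</h1>\n{body}\n</body></html>\n"
    )
    out_path = out_dir / (source.stem + ".html")
    out_path.write_text(page, encoding="utf-8")
    return out_path

if __name__ == "__main__":
    out = Path("webroot")
    out.mkdir(exist_ok=True)
    for article in sorted(Path("atex_export").glob("*.txt")):
        print("published", article_to_html(article, out))
```

The point of the sketch is how little machinery there was: a text file in, an HTML file out, dropped by hand onto a Mac that doubled as the web server.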
What would the future look like? We imagined personalized editions, even if we had not the faintest clue how to produce them. We guessed that readers might print off copies of the Guardian in their homes — and even toyed with the idea of buying every reader a printer. There were glimmers of financial hope. Our readers were spending £56 million a year buying the Guardian but we retained none of it: the money went on paper and distribution. In the back of our minds we ran calculations about how the economics of newspapers would change if we could save ourselves the £56 million a year “old world” cost.
By March 1996, ideas we’d hatched in the summer of 1995 to graft the paper onto an entirely different medium were already out of date. That was a harbinger of the future.
On top of editing, the legal entanglements sometimes felt like a full-time job on their own. Trying to engineer a digital future for the Guardian felt like a third job. There were somehow always more urgent issues. By March 1996, ideas we’d hatched in the summer of 1995 to graft the paper onto an entirely different medium were already out of date. That was a harbinger of the future. No plans in the new world lasted very long.
It was now apparent that we couldn’t get away with publishing selective parts of the Guardian online. Other newspapers had shot that fox by pushing out everything. We were learning about the connectedness of the web — and the IT team tentatively suggested that we might use some “offsite links” to other versions of the same story to save ourselves the need to write our own version of everything. This later became the mantra of the City University of New York (CUNY) digital guru Jeff Jarvis — “Do what you do best, and link to the rest.”
We began to grapple with numerous basic questions about the new waters into which we were gingerly dipping our toes.
Important question: Should we charge?
The Times and the Telegraph were both free online. A March 1996 memo from Bill Thompson, a developer who had joined the Guardian from Pipex, ruled it out:
I do not believe the UK internet community would pay to read an online edition of a UK newspaper. They may pay to look at an archive, but I would not support any attempt to make the Guardian a subscription service online . . . It would take us down a dangerous path.
In fact, I believe that the real value from an online edition will come from the increased contact it brings with our readers: online newspapers can track their readership in a way that print products never can, and the online reader can be a valuable commodity in their own right, even if they pay nothing for the privilege.
Thompson was prescient about how the overall digital economy would work — at least for players with infinitely larger scale and vastly more sophisticated technology.
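What “tracking readership” meant in practice then was mostly counting requests in the web server’s access log. A minimal sketch in Python, assuming the standard Apache “common log format” (the file name, and the idea that the Guardian measured readers exactly this way, are illustrative assumptions only):

```python
# Hypothetical sketch: tallying successful page requests per article from
# a web server access log in Apache "common log format".
import re
from collections import Counter

# host ident authuser [date] "GET /path HTTP/1.0" status bytes
LOG_LINE = re.compile(
    r'^(?P<host>\S+) \S+ \S+ \[[^\]]+\] "GET (?P<path>\S+) [^"]*" (?P<status>\d{3})'
)

def tally(log_path: str) -> Counter:
    """Return request counts per HTML page, ignoring errors and images."""
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as f:
        for line in f:
            m = LOG_LINE.match(line)
            if m and m.group("status") == "200" and m.group("path").endswith(".html"):
                counts[m.group("path")] += 1
    return counts

if __name__ == "__main__":
    for path, n in tally("access.log").most_common(10):
        print(f"{n:6d}  {path}")
```

Crude as it is, this is the germ of Thompson’s point: a print run tells you how many copies left the building, while a log file tells you which stories were actually read, and how often.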
What time of day should we publish?
The Electronic Telegraph was published at 8 a.m. each day — mainly because of its print production methods. The Times, more automated, was available as soon as the presses started rolling. The Guardian started making some copy available from first edition through to the early hours. It would, we were advised, be fraught with difficulties to publish stories at the same time they were ready for the press.
Why were we doing it anyway?
Thompson saw the danger of cannibalization: that readers would stop buying the paper if they could read it for free online. Then again, the online edition could be seen as a form of marketing. His memo seemed ambivalent as to whether we should venture into this new world at all:
The Guardian excels in presenting information in an attractive easy to use and easy to navigate form. It is called a “broadsheet newspaper.” If we try to put the newspaper on-line (as the Times has done) then we will just end up using a new medium to do badly what an old medium does well. The key question is whether to make the Guardian a website, with all that entails in terms of production, links, structure, navigational aids etc. In summer 1995 we decided that we would not do this.
But was that still right a year later? By now we had the innovation team — PDU — still in the basement of one building in Farringdon Road, and another team in a Victorian loft building across the way in Ray Street. We were, at the margins, beginning to pick up some interesting fringe figures who knew something about computers, if not journalism. But none of this was yet pulling together into a coherent picture of what a digital Guardian might look like.
An 89-page business plan drawn up in October 1996 made it plain where the priorities lay: print.
We wanted to keep growing the Guardian circulation — aiming for a modest increase to 415,000 by March 2000, which would make us the ninth-biggest paper in the UK — with the Observer aiming for 560,000 with the aid of additional sections. A modest investment of £200,000 a year in digital was dwarfed by an additional £6 million cash injection into the Observer, spread over three years.
As for “on-line services” (we were still hyphenating it) we did want “a leading-edge presence” (whatever that meant), but essentially we thought we had to be there because we had to be there. By being there we would learn and innovate and — surely? — there were bound to be commercial opportunities along the road. It wasn’t clear what.
We decided we might usefully take broadcasting, rather than print, as a model — emulating its “immediacy, movement, searchability and layering.”
If this sounded as if we were a bit at sea, we were. We hadn’t published much digitally to this point. We had taken half a dozen meaty issues — including parliamentary sleaze, and a feature on how we had continued to publish on the night our printing presses had been blown up by the IRA — and turned them into special reports.
It is a tribute to our commercial colleagues that they managed to pull in the thick end of half a million pounds to build these websites. Other companies’ marketing directors were presumably like ours — anxious about the youth market and keen for their brands to feel “cool.” In corporate Britain in 1996, there was nothing much cooler than the internet, even if not many people had it, knew where to find it or understood what to do with it.
* * *
The absence of a controlling owner meant we could run the Guardian in a slightly different way from some papers. Each day began with a morning conference open to anyone on the staff. In the old Farringdon Road office, it was held around two long narrow tables in the editor’s office — perhaps 30 or 40 people sitting or standing. When we moved to our new offices at Kings Place, near Kings Cross in North London, we created a room that was, at least theoretically, less hierarchical: a horseshoe of low yellow sofas with a further row of stools at the back. In this room would assemble a group of journalists, tech developers and some visitors from the commercial departments every morning at about 10 a.m. If it was a quiet news day we might expect 30 or so. On big news days, or with an invited guest, we could host anything up to 100.
A former Daily Mail journalist, attending his first morning conference, muttered to a colleague in the newsroom that it was like Start the Week — a Monday morning BBC radio discussion program. All talk and no instructions. In a way, he was right: It was difficult, in conventional financial or efficiency terms, to justify 50 to 60 employees stopping work to gather together each morning for anything between 25 and 50 minutes. No stories were written during this period, no content generated.
But something else happened at these daily gatherings. Ideas emerged and were kicked around. Commissioning editors would pounce on contributors and ask them to write the thing they’d just voiced. The editorial line of the paper was heavily influenced, and sometimes changed, by the arguments we had. The youngest member of staff would be in the same room as the oldest: They would be part of a common discussion around news. By a form of accretion and osmosis an idea of the Guardian was jointly nourished, shared, handed down, and crafted day by day.
It led to a very strong culture. You might love the Guardian or despise it, but it had a definite sense of what it believed in and what its journalism was. It could sometimes feel like an intimidating meeting, even for, or especially for, the editor. The culture was intended to be one of challenge: If we’d made a wrong decision, or slipped up factually or tonally, someone would speak up and demand an answer. But challenge was different from blame: It was not a meeting for dressing-downs or bollockings. If someone had made an error the previous day, the post-mortem or unpleasant conversation happened outside the room. We’d encourage people to want to contribute to this forum, not make them fear disapproval or denunciation.
There was a downside to this. It could, and sometimes did, lead to a form of groupthink. However herbivorous the culture we tried to nurture, I was conscious of some staff members who felt awkward about expressing views outside what we hoped was a fairly broad consensus. But, more often, there would be a good discussion on two or three of the main issues of the day. We encouraged specialists or outside visitors to come in and discuss breaking stories. Leader writers could gauge the temperature of the paper before penning an editorial. And, from time to time, there would be the opposite of consensus: Individuals, factions, or groups would come and demand we change our line on Russia; bombing in Bosnia; intervention in Syria; Israel; blood sports; or the Labour leadership.
The point was this: The Guardian was not one editor’s plaything or megaphone. It emerged from a common conversation, and it was open to internal challenge when editorial staff felt uneasy about aspects of our journalism or culture.
* * *
Within two years — slightly uncomfortable at the power I had acquired as editor — I gave some away. I wanted to make correction a natural part of the journalistic process, not a bitterly contested post-publication battleground designed to be as difficult as possible.
We created a new role on the Guardian: a readers’ editor. He or she would be the first port of call for anyone wanting to complain about anything we did or wrote. The readers’ editor would have daily space in the paper, off-limits to the editor, to correct or clarify anything, and would also have a weekly column to raise broader issues of concern. It was written into the job description that the editor could not interfere. And the readers’ editor was given security of tenure: He or she could not be removed by the editor, only by the Scott Trust.
On most papers, editors had sat in judgment on themselves. They commissioned pieces, edited and published them, and then were supposed, neutrally, to assess whether their coverage had, in fact, been truthful, fair, and accurate. An editor might ask a colleague, usually a managing editor, to handle a complaint, but the editor was in charge from beginning to end. It was an autocracy. That mattered even more in an age when some journalism was moving away from mere reportage and observation to something closer to advocacy or, in some cases, outright pursuit.
Allowing even a few inches of your own newspaper to be beyond your direct command meant that your own judgments, actions, ethical standards and editorial decisions could be held up to scrutiny beyond your control. That, over time, was bound to change your journalism. Sunlight is the best disinfectant: that was the journalist-as-hero story we told about what we do. So why wouldn’t a bit of sunlight be good for us, too?
The first readers’ editor was Ian Mayes, a former arts and obituaries editor then in his late 50s. We felt the first person in the role needed to have been a journalist, and one who would command instant respect from a newsroom that might otherwise be resistant to having its work publicly critiqued or rebutted. There were tensions and some resentment, but Ian’s experience, fairness and flashes of humor eventually won most people round.
One or two of his early corrections convinced staff and readers alike that he had a light touch about the fallibility of journalists:
In our interview with Sir Jack Hayward, the chairman of Wolverhampton Wanderers, page 20, Sport, yesterday, we mistakenly attributed to him the following comment: “Our team was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Sir Jack had just declined the offer of a hot drink. What he actually said was: “Our tea was the worst in the First Division and I’m sure it’ll be the worst in the Premier League.” Profuse apologies.
In an article about the adverse health effects of certain kinds of clothing, pages 8 and 9, G2, August 5, we omitted a decimal point when quoting a doctor on the optimum temperature of testicles. They should be 2.2 degrees Celsius below core body temperature, not 22 degrees lower.
But in his columns he was capable of asking tough questions about our editorial decisions — often prompted by readers who had been unsettled by something we had done. Why had we used a shocking picture which included a corpse? Were we careful enough in our language around mental health or disability? Why so much bad language in the Guardian? Were we balanced in our views of the Kosovo conflict? Why were Guardian journalists so innumerate? Were we right to link to controversial websites?
In most cases Mayes didn’t come down on one side or another. He would often take readers’ concerns to the journalist involved and question them — sometimes doggedly — about their reasoning. We learned more about our readers through these interactions; and we hoped that Mayes’s writings, candidly explaining the workings of a newsroom, helped readers better understand our thinking and processes.
It was, I felt, good for us to be challenged in this way. Mayes was invaluable in helping devise systems for the “proper” way to correct the record. A world in which — to coin a phrase — you were “never wrong for long” posed the question of whether you went in for what Mayes termed “invisible mending.” Some news organizations would quietly amend whatever it was that they had published in error, no questions asked. Mayes felt differently: The act of publication was something on the record. If you wished to correct the record, the correction should be visible.
We were some years off the advent of social media, in which any error was likely to be pounced on in a thousand hostile tweets. But we had some inkling that the iron grip of centralized control that a newspaper represented was not going to last.
I found liberation in having created this new role. There are few things an editor enjoys less than the furious early morning phone call or email from the irate subject of their journalism. Either the complainant is wrong, in which case time is wasted in heated self-justification, or they’re right, wholly or partially, and immediately you’re into remorseful calculations about saving face. If readers knew we honestly and rapidly, even immediately, owned up to our mistakes, they should, in theory, trust us more. That was the David Broder theory, and I bought it. Readers certainly made full use of the readers’ editor’s existence. Within five years Mayes was dealing with around 10,000 calls, emails, and letters a year, leading to around 1,200 corrections, big and small. It’s not, I think, that we were any more error-prone than other papers. But if you win a reputation for openness, you’d better be ready to take it as seriously as your readers will.
Our journalism became better. If, as a journalist, you know there are a million sleuth-eyed editors out there waiting to leap on your tiniest mistake, it makes you more careful. It changes the tone of your writing. “Our readers often know more than we do”: That became a mantra of the new world, coined by the blogger and academic Dan Gillmor in his 2004 book We the Media, but it was already becoming evident in the late 1990s.
The act of creating a readers’ editor felt like a profound recognition of the changing nature of what we were engaged in. Journalism was not an infallible method guaranteed to result in something we would proclaim as The Truth — but a more flawed, tentative, iterative and interactive way of getting towards something truthful.
Admitting that felt both revolutionary and releasing.
* * *
Alan Rusbridger was editor in chief of Guardian News and Media from 1995 to 2015. He is the author of Play It Again: An Amateur Against the Impossible and is currently chair of the Reuters Institute for the Study of Journalism and principal of Lady Margaret Hall, Oxford University.